🔆 The Invisible Empire: Who Owns AI’s Physical Future?
Understanding AI infrastructure signals, landmark amendments, and ethical data studies.
🗞️ Issue 79 // ⏱️ Read Time: 6 min
In this week's newsletter
What we’re talking about: The physical infrastructure that powers AI and why understanding who controls it matters more than the specs they advertise.
How it’s relevant: When you hear "NVIDIA launches new chip" or "Microsoft builds $80 billion data centers," these aren't just tech announcements. They're strategic moves that will determine an organisation’s AI options, costs, and competitive advantages for years to come.
Why it matters: The companies that win the infrastructure “battle” get to set the rules. They decide which AI capabilities you can access as a company, how much you'll pay, and whether you'll have genuine choices or be locked into their ecosystem. Understanding this landscape helps you make smarter bets about where to invest your AI strategy.
Hello 👋
Today, we’re getting physical. That is to say, we’ll be discussing the physical hardware that powers AI and the business around it. If you read our insights from Paris last week, you'll know that we attended a conference where some of the biggest players in AI infrastructure were on stage, talking about the current state and future of AI.
Most coverage misses the point: this isn't really about the technology. It's about power, control, and who gets to shape the future of business.
We're approaching this from a business strategy perspective, not a technical deep dive. Think "what this means for my AI roadmap" rather than "how the chips work." By the end of this newsletter, you'll know which infrastructure announcements actually affect your AI strategy and which are just noise.
Let's decode what's really happening.
Understanding the AI Infrastructure Stack
Think of AI infrastructure like a city's foundation systems. Just as you don't think about power grids until the lights go out, most businesses don't think about AI infrastructure until they can't get what they need. The full stack consists of four layers: physical, platform, distribution, and access. Let's get into the details.
🧱 The Physical Layer: Who Controls the Hardware
This is where the raw computational power lives – specialized chips and the massive data centers that house them. Control at this layer determines who has the foundational compute resources for modern AI.
Key players include:
NVIDIA is the incumbent and market leader
AMD is the challenger betting on openness and flexible integration
Groq is the speed specialist focusing on ultra-fast AI responses
☁️ The Platform Layer: Who Controls Access
This layer encompasses the cloud platforms and developer frameworks that determine how companies access and build with AI. Control here shapes the developer experience, integration options, and default tooling.
Key players include:
Microsoft leverages its OpenAI partnership
Amazon builds custom chips while maintaining platform dominance
Google uses proprietary processors and research leadership
🚛 The Distribution Layer: Who Controls Delivery
This covers how AI actually reaches end users: the networks and edge infrastructure that impact performance, latency, and reliability. When your AI takes 3 seconds to respond instead of 100 milliseconds, users experience it as broken. This layer determines whether your AI investments translate into competitive advantage. Control here means deciding response times and geographic availability for AI applications. Key players include:
Cloudflare brings AI processing closer to users via a global edge network
Traditional Content Delivery Networks (CDNs) are adapting their established networks for AI workloads
🛂 The Access Layer: Who Sets the Rules
This layer defines pricing models, service agreements, and overall market dynamics. Essentially, who gets what capabilities and at what cost. Control here shapes economic access to AI. Key players include:
CoreWeave is a specialized provider focused exclusively on AI workloads
Traditional cloud giants like Microsoft, Amazon, and Google, each developing their own pricing strategies and vendor lock-in approaches
Understanding the Power Game
When you see infrastructure news, ask yourself: What type of control is this company trying to establish?
Ecosystem Lock-in Strategies: NVIDIA doesn't just sell chips. They've built CUDA, a software ecosystem that makes it expensive to switch to competitors. When your development team learns NVIDIA's tools, your company becomes more likely to buy NVIDIA's future products.
Vertical Integration Plays: Google, Amazon, and Microsoft aren't just buying chips; they're designing their own. This isn't about saving money; it's about reducing dependence on NVIDIA and gaining control over their own destiny. When a cloud provider controls the full stack (vertical integration), it can optimize for its specific services and lock you into its platform.
Platform Standardization Battles: Different companies are pushing different standards for how AI systems communicate and operate. The winner doesn't just get market share; they get to influence how the entire industry develops.
What to Watch For in Infrastructure News
Not all infrastructure announcements are created equal. Here's how to spot what actually matters for your business:
🔴 Red Flag – Pure Performance Claims: Headlines like "New Chip 10x Faster!" usually matter less than they seem. When NVIDIA announced their H100 was "30x faster," most businesses saw no meaningful performance improvement in their actual applications. What matters more: cost-performance ratios and ecosystem compatibility.
🟢 Green Flag – Ecosystem Moves: Pay attention when companies announce partnerships, new developer tools, or standardization efforts. Microsoft's GitHub Copilot integration mattered more than raw Azure compute improvements because it changed how developers actually work.
🚨 Critical Signal – Pricing Model Changes: Infrastructure pricing tells you everything about market dynamics. When providers cut prices aggressively, it signals oversupply or new competition. When they introduce new pricing tiers, they're testing market willingness to pay.
With infrastructure decisions today determining AI capabilities for the next decade, how are you balancing what's best for your immediate business needs with what might position you for long-term success?
What This Means for Your Business
Understanding infrastructure control helps you make better strategic decisions about AI adoption.
Small/Medium Business: Focus on avoiding vendor lock-in and watch for democratization signals that make AI capabilities more accessible.
Enterprise: Track vertical integration moves and geographic expansion that could affect your multi-region operations.
Tech Companies: Monitor ecosystem battles and partnership announcements that could shift competitive dynamics in your industry.
The $500 Billion Reality Check
OpenAI's Stargate project, announced in January 2025 with $500 billion of planned investment over four years, isn't just about building more data centers. It's about establishing strategic control over AI infrastructure in the US at national scale.
This level of investment signals that AI infrastructure is moving beyond traditional enterprise spending into geopolitical competition. Countries and companies are racing to establish infrastructure dominance because they understand a fundamental truth: whoever controls the infrastructure controls the future of AI-powered business.
For business leaders, this creates both opportunity and urgency. The infrastructure decisions being made today will determine which AI capabilities you can access five years from now, and at what cost.
Three questions to ask when you see infrastructure news:
Does this change my costs?
Does this affect my vendor options?
Does this create new competitive dynamics in my industry?
Big tech news of the week…
⚖️ The Danish government has proposed a landmark amendment to its copyright laws that would allow individuals to copyright their own face, voice, and body, making it illegal for others to use these attributes in AI-generated deepfakes without consent.
💰 Google has struck a major deal with AI coding startup Windsurf after OpenAI’s planned $3 billion acquisition of the company fell through. Instead of a full buyout, Google is paying about $2.4 billion to license Windsurf’s technology and is hiring key members of its team, including the CEO.
📈 A recent study titled "Can Performant LLMs Be Ethical? Quantifying the Impact of Web Crawling Opt-Outs" provides new empirical evidence on the relationship between ethical data practices (specifically, honoring web crawling opt-outs) and the performance of large language models (LLMs).
🤖 Goldman Sachs becomes the first major bank to implement a start-to-finish AI-coding tool, with the “hiring” of an AI software engineer, Devin, made by the startup Cognition.
Until next time.
On behalf of Team Lumiera
Lumiera has gathered the brightest people from the technology and policy sectors to give you top-quality advice so you can navigate the new AI Era.
Follow the carefully curated Lumiera podcast playlist to stay informed and challenged on all things AI.
What did you think of today's newsletter?

Disclaimer: Lumiera is not a registered investment, legal, or tax advisor, or a broker/dealer. All investment/financial opinions expressed by Lumiera and its authors are for informational purposes only, and do not constitute or imply an endorsement of any third party's products or services. Information was obtained from third-party sources, which we believe to be reliable but not guaranteed for accuracy or completeness.