The Full Throttle Cloud Weekly: The Infrastructure Buildout Confirmed.
Well, that escalated quickly. While everyone was obsessing over the latest model releases and benchmark battles, this week delivered something far more interesting: the infrastructure layer started making the real decisions.
The Big Signal: xAI Goes for the Jugular
xAI launched Grok 4.3 at $1.25/$2.50 per million tokens—roughly 40% cheaper on input and 60% cheaper on output than its predecessor. That's not incremental pricing. That's a declaration of war.
Grok 4.3 isn't just cheap; it's strategically cheap. By positioning itself in the bottom half of all major foundation models on cost (closer to Chinese open-source offerings than to U.S. proprietary rivals), xAI is forcing a conversation every enterprise AI team has been avoiding: what happens when "good enough" costs 5x less than "state-of-the-art"?
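To make that gap concrete, here's a back-of-the-envelope sketch using the $1.25/$2.50 rates above against a hypothetical frontier model priced 5x higher. The workload volumes are illustrative assumptions, not figures from any vendor:

```python
def monthly_cost(input_tokens, output_tokens, in_rate, out_rate):
    """Monthly bill in dollars; rates are per million tokens."""
    return (input_tokens * in_rate + output_tokens * out_rate) / 1_000_000

# Assumed workload: 2B input tokens, 500M output tokens per month.
grok = monthly_cost(2_000_000_000, 500_000_000, 1.25, 2.50)
frontier = monthly_cost(2_000_000_000, 500_000_000, 6.25, 12.50)  # hypothetical 5x pricing

print(f"Grok 4.3:  ${grok:,.0f}/mo")      # $3,750/mo
print(f"Frontier:  ${frontier:,.0f}/mo")  # $18,750/mo
```

At enterprise volumes that delta is a headcount, which is exactly the conversation xAI wants procurement teams having.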
The Infrastructure Buildout Accelerates
Three signals this week show how seriously the market is taking compute scarcity:
Meta buys robotics startup to beef up its humanoid AI ambitions. Not a partnership, not an investment—an acquisition. When you're building the infrastructure for embodied AI, you don't license. You own.
Coatue has a plan to buy land near power sources for data centers, possibly for Anthropic. We're not just talking about renting compute anymore. We're talking about buying dirt and building power plants. That's vertically integrated AI at scale.
Pentagon inks deals with [Nvidia](https://www.fullthrottle.cloud/?node=Company:4:42986984-4792-43c2-bd50-9983fafe2d6f:1), Microsoft, and AWS to deploy AI on classified networks. When the military diversifies its AI vendor exposure after disputes with Anthropic, that's not just procurement policy. That's risk management for national security infrastructure.
The Earnings Confirm It: ~$700B in Capex Says So
If you wanted hard numbers behind the buildout narrative, this week delivered them. Microsoft, Alphabet, Amazon, and Meta all reported, and every single one raised their 2026 capex guidance — collectively committing somewhere north of $700 billion in capital spending this calendar year. That's larger than the GDP of Switzerland.
The receipts:
Microsoft spent $31.9 billion in Q3 capex (up 49% YoY) and now expects to invest ~$190 billion in calendar 2026 — a $25 billion bump driven by component pricing alone. Azure grew 40% on a constant-currency basis, and the AI business hit a $37 billion annual run rate, up 123% YoY.
Alphabet dropped $35.7 billion in Q1 and raised full-year guidance to $180–$190 billion. Google Cloud grew 63%, and the cloud backlog nearly doubled to $460 billion. The single most important sentence from the call, courtesy of Sundar Pichai: "We are compute constrained in the near term. Our cloud revenue would have been higher if we were able to meet the demand." CFO Anat Ashkenazi added that 2027 capex will "significantly increase" from here.
Amazon posted the largest capex print of the bunch — $43.2 billion in a single quarter — pushing trailing-twelve-month capex to $147.3 billion against a full-year target of ~$200 billion. AWS grew 28% (fastest in 15 quarters), the backlog hit $364 billion before layering in Anthropic's $100B+ multi-year commitment, and Trainium is now an annualized $20B+ run rate growing triple digits.
Meta is the outlier. It raised 2026 capex guidance to $125–$145 billion (from $115–$135B), and the market punished it — the stock dropped ~6% after-hours despite a 33% revenue beat. The signal: investors will tolerate massive infrastructure spend as long as cloud-and-AI revenue scales alongside it (Microsoft, Alphabet). When the payback story is Reels recommendations and ad targeting, patience runs thin.
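The "north of $700 billion" figure checks out against the guidance quoted above. Taking range midpoints (and mixing the 2026 guidance with the full-year targets exactly as reported here):

```python
# Capex guidance quoted in the section above, in $B.
# Microsoft and Amazon are point targets; Alphabet and Meta are ranges.
guidance = {
    "Microsoft": (190, 190),
    "Alphabet":  (180, 190),
    "Amazon":    (200, 200),
    "Meta":      (125, 145),
}

total = sum((lo + hi) / 2 for lo, hi in guidance.values())
print(f"Combined midpoint: ${total:.0f}B")  # $710B
```

Midpoints land at $710B, so "somewhere north of $700 billion" is, if anything, conservative.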
Two things to take away. First, Coatue buying dirt for Anthropic suddenly looks modest when AWS is signing $100B+ compute deals with the same company. Second, "compute constrained" coming from the firm that owns its own TPUs, its own fiber, its own data centers, and Intersect's land portfolio is not a supply chain story. It's a physics story. The hyperscalers are running out of places to plug things in faster than they're running out of capital.
The Dataset Gold Rush
This week's graph additions tell a story about who's building the moats that actually matter. Look at the new datasets that appeared: OpenAI's HumanEval, AllenAI's C4, Google's IFEval, and Nvidia's PhysicalAI datasets for autonomous vehicles and robotics.
These aren't just training resources. They're the raw materials for competitive advantage. When LlamaIndex's CEO tells us that "context is becoming the moat" and that "95% of LlamaIndex code is generated by AI," he's describing a world where the quality of your training data matters more than the cleverness of your architecture.
News That Actually Matters
AI-generated actors and scripts are now ineligible for Oscars. Hollywood just drew a line in the sand. This isn't about artistic integrity—it's about labor economics. When the entertainment industry preemptively bans AI-generated content from its highest honors, they're signaling that human creativity still has protected status, at least in some domains.
Anthropic's potential $900B+ valuation round could happen within two weeks. If this closes, we'll have our first near-trillion-dollar AI company. That's not just a valuation milestone. That's the moment AI officially becomes too big to fail.
MCP servers expose command execution flaws affecting 200,000 instances. The Model Context Protocol that everyone adopted without reading the fine print has a feature that looks suspiciously like a bug: it executes arbitrary OS commands by design. OX Security found this affects everyone, and Anthropic's response was essentially "working as intended." That's what happens when you ship protocols first and security models later.
The Deeper Pattern: Infrastructure Is Strategy
We're watching the collapse of the traditional software stack, but not in the way anyone expected. It's not that AI is replacing programmers—it's that AI is making infrastructure decisions that used to require human strategy.
When Runpod launches Flash to eliminate Docker containers from serverless development, they're not just reducing friction. They're collapsing the boundary between local development and global deployment. When Alibaba's Metis agent cuts redundant tool calls from 98% to 2%, it's not just about efficiency. It's about teaching AI when not to use tools—a metacognitive skill that most humans struggle with.
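One way to picture "teaching AI when not to use tools" is a redundancy gate: identical (tool, args) calls hit a cache instead of re-invoking the tool. This is a generic sketch of the idea, not Metis's actual mechanism:

```python
# Generic redundancy gate for agent tool calls -- a sketch, not
# Alibaba's Metis implementation. Repeated (tool, args) pairs are
# served from cache rather than re-executed.
from functools import lru_cache

calls_made = 0

@lru_cache(maxsize=1024)
def call_tool(tool: str, args: str) -> str:
    global calls_made
    calls_made += 1  # counts only real invocations, not cache hits
    return f"<result of {tool}({args})>"

# An agent loop that naively re-asks the same question 50 times...
for _ in range(50):
    call_tool("weather", "Berlin")

print(calls_made)  # 1 -- the other 49 calls were recognized as redundant
```

Caching is the crudest form of this; the harder metacognitive version is predicting that a call is unnecessary before it's ever been made.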
The companies winning this cycle aren't the ones with the smartest models. They're the ones building the infrastructure that makes everyone else's models irrelevant. Hugging Face didn't become the center of the AI ecosystem by building better transformers—they built better infrastructure for everyone else's transformers.
What to Watch Next Week
The Anthropic funding round is the obvious headline to track, but the more interesting signals will be in the infrastructure layer. Watch for more data center acquisitions, more vertical integration plays, and more "infrastructure as strategy" moves from companies that used to be pure AI plays.
Also keep an eye on the MCP security fallout. When a protocol that OpenAI, Google DeepMind, and others have adopted has a systemic security flaw that affects 200,000 servers, the patches and workarounds will tell us a lot about who actually understands the infrastructure they're building on.
The AI wars aren't being fought with benchmarks anymore. They're being fought with real estate, power contracts, and control of the pipes that everyone else depends on.
Explore the connections yourself in our graph explorer.

