Strong Convictions, Loosely Held: Day 1

11.18.2025 12:43 PM - By Michael Edgar

Day 1: We Are Not in an AI Bubble

Utilization vs. Speculation: Why This Buildout Is Different


The telecom crash of 2000 left a graveyard. 97% of the fiber-optic cable laid during the boom sat dark—unused capacity, speculative infrastructure, no real demand. Investors lost trillions betting on a future that arrived a decade too early.


Today's AI buildout is the opposite.


There are no idle GPUs. Every H100, every A100, every available unit is running at capacity. And the data reveals something even more striking: AI servers operate at 80-90% utilization while non-AI servers run below 60%. Hyperscalers are buying everything NVIDIA can ship and still can't meet internal demand. Meta's Llama 3 405B training run achieved 38% Model FLOPs Utilization (MFU) over 54 days, meaning raw compute isn't even the constraint; optimization is.


The data also shows why this matters at scale: AI workloads demand 10x the power of traditional servers. Global data center power demand currently stands at roughly 55 GW, but the projections are staggering: 84 GW by 2027, a roughly 50% increase in three years.
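The growth claim is easy to sanity-check. A quick sketch, using only the two figures cited above:

```python
# Growth implied by the cited projection: ~55 GW today to ~84 GW by 2027.
current_gw = 55
projected_gw = 84

growth = (projected_gw - current_gw) / current_gw
print(f"Projected growth: {growth:.0%}")  # prints "Projected growth: 53%"
```

The precise figure is closer to 53%, which the text rounds to 50%.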


This destroys the telecom comparison. In the late 1990s, installed fiber capacity never approached saturation. Today? Data center occupancy is moving from 85% to over 95% by late 2026. The opposite problem: the 1990s had unused fiber sitting everywhere; 2025 has waiting lists for rack space.


But here's where skeptics have a point: Tech giants invested approximately $560 billion in AI infrastructure over two years but generated only $35 billion in AI-related revenue. That's a 16:1 ratio of capex to revenue. With a recent MIT study finding 95% of AI pilot projects fail to yield meaningful results despite over $40 billion in generative AI investment, it's fair to ask: are these dollars actually being deployed into revenue-generating workloads?
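The skeptics' ratio follows directly from the two figures in the text:

```python
# Capex-to-revenue ratio from the cited figures.
capex_billion = 560   # AI infrastructure spend over two years
revenue_billion = 35  # AI-related revenue over the same period

ratio = capex_billion / revenue_billion
print(f"Capex-to-revenue ratio: {ratio:.0f}:1")  # prints "Capex-to-revenue ratio: 16:1"
```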


The answer is yes—but with a critical nuance. Infrastructure buildout always precedes revenue realization. The money spent now is building capacity that will be utilized for the next 10-15 years. The infrastructure isn't idle—it's running generative AI inference for consumer products, enterprise deployments, and emerging applications we haven't fully commercialized yet. This is utilization without yet-obvious ROI, not speculation with zero utilization.


The Demand Signal Is Global and Sustained

Sovereign AI initiatives in the UAE, Saudi Arabia, Japan, and the EU are adding billions in committed spend. Enterprises are migrating workloads. Startups are building on foundation models that require continuous inference at scale. This isn't a spike—it's a step-function shift in how compute gets deployed. And the capital deployment confirms it.


Hyperscalers collectively expect capital expenditures to exceed $380 billion in 2025, more than 1% of U.S. GDP from a single sector. This is not a typical technology capex cycle; it's an infrastructure buildout at the scale of the interstate highways or the electrification of America. Updated October 2025 earnings confirm acceleration, not moderation:

  • Amazon: targeting $125 billion (up from a prior $118B forecast)

  • Microsoft: fiscal 2026 capex exceeding $94 billion (roughly 45% growth from fiscal 2025's $64.55B)

  • Alphabet: guidance raised to $91-93 billion (up from $75B in February)

  • Meta: range narrowed to $70-72 billion from a prior $66-72B
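Tallying the four company figures cited above confirms the headline number. A sketch using midpoints where the text gives a guidance range; the U.S. GDP figure is an assumption on my part, not from the text:

```python
# Hyperscaler 2025 capex figures from the text, in $B (range midpoints used).
capex = {
    "Amazon": 125,
    "Microsoft": 94,
    "Alphabet": (91 + 93) / 2,
    "Meta": (70 + 72) / 2,
}
total = sum(capex.values())
us_gdp_billion = 29_000  # assumed ~$29T U.S. GDP; not from the text
print(f"Total: ${total:.0f}B = {total / us_gdp_billion:.1%} of U.S. GDP")
# prints "Total: $382B = 1.3% of U.S. GDP"
```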


The Unit Economics Are Unforgiving

Each GB200 NVL72 rack costs roughly $5.9 million: $3.4 million for compute hardware and $2.5 million for physical infrastructure. GPUs dominate at 39% of total capex (roughly $13.65 billion per GW), with NVIDIA's gross profit alone representing 29% of total AI data center spending at approximately 70% margins. Networking accounts for 13% and mechanical and electrical systems for 33%, while electricity for 1 GW of capacity costs about $1.3 billion per year at $0.15 per kWh.
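Two of these figures can be derived from first principles. A sketch assuming the rack runs at full 1 GW capacity around the clock:

```python
# Electricity: 1 GW drawn continuously at $0.15/kWh.
capacity_kw = 1_000_000   # 1 GW = 1,000,000 kW
hours_per_year = 8760
price_per_kwh = 0.15

annual_cost = capacity_kw * hours_per_year * price_per_kwh
print(f"Annual electricity cost: ${annual_cost / 1e9:.2f}B")
# prints "Annual electricity cost: $1.31B"

# Total capex per GW implied by GPUs being 39% of it at $13.65B.
implied_total_per_gw = 13.65 / 0.39
print(f"Implied total capex per GW: ${implied_total_per_gw:.0f}B")
# prints "Implied total capex per GW: $35B"
```

The $1.31B result matches the text's $1.3 billion per year, and the 39% GPU share implies roughly $35 billion of total capex per GW.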


What This Means

  • GPU shortages persist through 2026 minimum

  • Data center construction accelerates, not slows

  • The winners will be those who locked in power and chip supply early

  • Secondary markets (cooling, networking, power management) see sustained growth


Bottom Line

If this were a bubble, you'd see inventory piling up and utilization dropping. Instead, you see waiting lists, forward contracts stretching years out, and data center occupancy moving from 85% toward 95%. The constraint isn't demand—it's how fast the world can build the infrastructure to serve it. The infrastructure isn't sitting idle. It's being deployed into revenue-generating workloads that will compound for the next decade. This is utilization, not speculation. And unlike 2000, we have the financial data to prove it.


Ready for Day 2? We tackle the real binding constraint: Power.