There's a version of the AI story that Wall Street is telling, and there's a version that I think is actually true.

The Wall Street version: a handful of large-cap technology companies are racing to build the most powerful AI systems, and whoever wins the model race captures the most value. Buy the hyperscalers. Buy NVIDIA. Watch the multiples expand.

That's not wrong, exactly. But it misses what I consider the more compelling investment thesis - one that's sitting closer to the foundation of the whole story.

Let me explain what I mean.

The Constraint Nobody Wants to Talk About

Every AI application you interact with - every chatbot, every image generator, every recommendation engine, every coding assistant - runs on physical infrastructure. That infrastructure consists of data centers filled with specialized chips, cooling systems, networking hardware, and power delivery equipment.

And all of it runs on electricity.

Here are some numbers worth keeping in mind:

  • A single large AI data center uses 100 to 500 megawatts of power. For context, a mid-size American city of 100,000 people uses roughly 300 to 400 megawatts.

  • Microsoft has committed approximately $80 billion to AI infrastructure in a single year. Google, Amazon, and Meta have announced comparable figures.

The math is straightforward: all of that infrastructure requires power. Enormous amounts of it, on a continuous basis, in locations where the grid may not currently have the capacity to deliver it.
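To make that scale concrete, here's a quick back-of-envelope in Python. The data center range comes from the figures above; the 1.2 kW average continuous household load is my own assumed approximation, not a figure from this newsletter:

```python
# Back-of-envelope: how many average US households does one AI data
# center's continuous load equal?
# Assumption (mine, not from the newsletter): an average US household
# draws roughly 1.2 kW on a continuous basis.

AVG_HOUSEHOLD_KW = 1.2  # assumed average continuous household load, kW

def households_equivalent(datacenter_mw: float) -> int:
    """Households whose combined continuous draw matches the data center."""
    return int(datacenter_mw * 1000 / AVG_HOUSEHOLD_KW)

for mw in (100, 500):
    print(f"{mw} MW data center ~= {households_equivalent(mw):,} average households")
```

Under that assumption, a single large AI data center draws as much power as tens to hundreds of thousands of homes, continuously.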

The U.S. electrical grid has not been meaningfully expanded in decades. The major power transformer manufacturers - the companies that build the equipment needed to step voltage up and down at the scale AI data centers require - are running multi-year backlogs. Some large power transformers now carry a three-year lead time from order to delivery.

Copper, which is essential throughout the grid and data center buildout, is in structural undersupply relative to projected demand.

And nuclear power - which was effectively being written off five years ago - is experiencing a genuine renaissance, driven specifically by the need for reliable, 24/7, carbon-free baseload electricity. Microsoft signed a deal to restart the Three Mile Island nuclear plant specifically to power its data centers. Amazon and Google have announced similar nuclear agreements.

Where the Value Gap Is

This is the investment question I find most interesting right now: if AI infrastructure requires all of this, why is the pricing gap so large?

NVIDIA's valuation reflects enormous expectations. The major hyperscalers trade at premiums that price in a great deal of success.

The companies supplying the power infrastructure - the transformer manufacturers, the utilities positioned to serve new data center loads, the uranium and nuclear enablers - are at earlier stages of repricing. They haven't been re-rated the way the semiconductor names have.

That doesn't mean they're "cheap" in an absolute sense. It means the market has been slower to connect the dots between AI demand and physical infrastructure requirements. That gap tends to close. The history of commodity cycles suggests that when it does, it closes quickly.

The One Honest Risk

I want to be direct about where this thesis can break down.

If AI adoption disappoints - if the enterprise deployment of AI tools hits significant friction, if the hyperscaler capex cycle gets cut in a downturn, or if model efficiency improves faster than expected and reduces per-inference power demand - then the power demand growth story is weaker than it looks today.

Infrastructure buildouts take years. Demand may not materialize as fast as the spending cycle suggests. The grid upgrade is underway regardless of AI, but AI is the accelerant - and if it slows, some of the premium built into infrastructure plays will compress.

This is a thesis with a multi-year time horizon. It's not a momentum trade.

One Action to Take This Week

If this thesis interests you, start by understanding the power infrastructure supply chain before looking at any specific security.

A useful exercise: trace the supply chain from "AI model runs an inference" backward. GPU cluster -> data center -> power delivery -> grid -> generation source -> fuel or resource. Each link in that chain is an industry, and most of those industries have long-established public market representatives.

The most interesting names to research across that chain - not as buy recommendations but as starting points for your own due diligence - include the large industrial electrical equipment manufacturers (the companies that make transformers, switchgear, and power distribution equipment), uranium producers and nuclear power operators, and the commodity producers who supply the physical materials for grid expansion.

If you want to discuss the paid research where I go into specific names and valuation context, the link is below.

Coming up in future issues:

  • The dollar debasement playbook: what actually works when the purchasing power of cash is in structural decline, and what doesn't.

  • The commodity super-cycle: why supply-side dynamics in copper and uranium are different from past cycles, and what that implies for timing.

  • AI infrastructure, part 2: a closer look at the nuclear renaissance and why utilities are behaving differently than they did ten years ago.

If someone shared this with you and you'd like to subscribe, go to openmarketwire.com.

To go deeper with full-length research, specific company coverage, and portfolio construction context:

- Grant Calloway
Editor, Open Market Wire
