According to TheRegister.com, Turner & Townsend’s 2025-2026 Datacenter Construction Cost Index reveals critical barriers to AI infrastructure growth. The survey of 280 industry experts across 300+ projects found 48% cite power access as their biggest scheduling constraint, with US grid connection wait times stretching to seven years. OpenAI’s disclosed projects alone would consume 55.2 gigawatts—enough to power 44.2 million households, nearly triple California’s housing stock. Deloitte warned AI datacenter power needs in the US may grow 30 times within a decade, while 83% of professionals believe local supply chains can’t support advanced cooling technology. AI-optimized liquid-cooled facilities now cost 7-10% more than air-cooled designs, adding to construction challenges.
Power grid reality check
Here’s the thing nobody wants to admit: we’re trying to plug 21st-century AI infrastructure into 20th-century power grids. And it’s not working. Seven-year wait times for grid connections? That’s basically telling AI companies “maybe try again in 2032.” By then, the entire AI landscape will have shifted multiple times.
What really struck me is the sheer scale of the power demand. OpenAI’s planned projects needing 55.2 gigawatts is staggering. That’s not just competing with other datacenters—it’s competing with entire cities and manufacturing hubs for limited grid capacity. We’re talking about power demands that rival small countries.
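To put that number in perspective, here’s a quick back-of-envelope check using the two figures cited above (55.2 gigawatts and 44.2 million households), working out the average per-household draw the comparison implies:

```python
# Back-of-envelope scale check using the figures cited in the report.
total_demand_watts = 55.2e9  # OpenAI's disclosed projects: 55.2 GW, in watts
households = 44.2e6          # households that demand could reportedly power

# Average continuous draw per household implied by the comparison
watts_per_household = total_demand_watts / households
print(f"{watts_per_household:.0f} W per household")  # ≈ 1249 W, about 1.25 kW
```

That ~1.25 kW continuous draw is in the ballpark of typical average US household consumption, which is why the “44.2 million households” framing holds up.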
The cooling conundrum
But power is only half the story. The cooling requirements for these AI beasts are creating their own supply chain nightmares. When 83% of industry professionals say local supply chains can’t handle the advanced cooling tech needed, that’s a red flag waving violently.
Liquid-cooled facilities costing 7-10% more might not sound catastrophic, but when you’re talking about billion-dollar projects, that extra percentage represents real money. And it’s not like traditional datacenters are getting cheaper either—they’ve still seen a 5.5% cost-per-watt increase. Basically, everything about building for AI is getting more expensive and complicated.
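To make that premium concrete, here’s a rough illustration applying the 7-10% figure to a hypothetical build. The $1 billion project size is my assumed example for scale, not a number from the report:

```python
# Rough illustration: the 7-10% liquid-cooling premium applied to a
# hypothetical $1 billion build (project size is an assumption, not from the report).
project_cost = 1_000_000_000  # assumed baseline air-cooled build cost, USD

premium_low = project_cost * 0.07   # low end of the reported premium
premium_high = project_cost * 0.10  # high end of the reported premium
print(f"Extra cost: ${premium_low / 1e6:.0f}M to ${premium_high / 1e6:.0f}M")
# → Extra cost: $70M to $100M
```

A single-digit percentage sounds modest until it lands as a nine-figure line item.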
Solutions or band-aids?
The report recommends on-site generation and energy storage, which sounds great in theory. But let’s be real: when the report notes “renewables for energy creation though in reality this is likely to be generators driven by gas-powered turbines,” what it’s really saying is that we’ll probably fall back on fossil fuels because renewables can’t scale fast enough.
So we’re potentially looking at a situation where AI—the supposed technology of the future—ends up being powered by the energy sources of the past. There’s some irony there that shouldn’t be lost on anyone.
Bigger picture problems
Paul Barry from Turner & Townsend nailed it when he said there’s “stronger competition than ever before for power due to increased business and consumer demand.” We’re not just talking about AI versus other industries here. We’re talking about AI versus electric vehicles, versus home heating electrification, versus manufacturing reshoring.
And let’s not forget the hardware angle the article mentions. Even if we solve the power and cooling problems, can chip makers actually keep up with demand? We might be building empty shells waiting for components that never arrive.
The full Turner & Townsend report paints a pretty sobering picture. Everyone’s racing to build AI infrastructure, but the physical constraints of power grids and supply chains don’t care about Silicon Valley hype cycles. Something’s gotta give—and it might be the AI boom itself.
