When AI expert Rui Ma returned from her recent tour of China’s tech hubs, she brought back a sobering observation that should make Silicon Valley uncomfortable: “Everywhere we went, people treated energy availability as a given.”
For anyone following the AI boom in the United States, that statement hits differently. While American companies scramble to secure power for their data centers—some even building their own power plants—China has apparently solved what Goldman Sachs calls AI’s most critical bottleneck: electricity.
A Tale of Two Grids
The contrast couldn’t be starker. In China, according to electricity expert David Fishman, the question isn’t whether there’s enough power for AI infrastructure; it’s how to use the excess capacity. China maintains a nationwide reserve margin of 80-100%, meaning the country consistently operates with close to double the generating capacity its peak demand actually requires.
Meanwhile, US regional grids typically run with just a 15% reserve margin, sometimes less during extreme weather. In Texas and California, grid operators regularly issue warnings about strain on the system. The difference is like comparing a highway with multiple empty lanes to a single-lane road during rush hour.
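To put those percentages in concrete terms, here is a minimal sketch of the arithmetic, using the standard definition of reserve margin as spare capacity divided by peak demand. The demand and capacity figures below are illustrative assumptions, not reported numbers for either grid:

```python
def reserve_margin(capacity_gw: float, peak_demand_gw: float) -> float:
    """Reserve margin = spare capacity as a fraction of peak demand."""
    return (capacity_gw - peak_demand_gw) / peak_demand_gw

# Illustrative figures only: a grid whose peak demand is 100 GW.
peak = 100.0

# A China-style buildout: ~190 GW of capacity -> 90% reserve margin,
# i.e. nearly double the capacity the peak actually requires.
print(f"{reserve_margin(190.0, peak):.0%}")   # 90%

# A typical US regional grid: ~115 GW -> 15% reserve margin,
# leaving little headroom for a multi-hundred-megawatt data center campus.
print(f"{reserve_margin(115.0, peak):.0%}")   # 15%
```

On those numbers, the difference between the two grids isn’t a matter of degree: one has tens of gigawatts sitting idle, the other has a thin cushion that a single large data center campus can visibly erode.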
The numbers tell the story: China adds more new electricity demand each year than Germany’s entire annual consumption. One Chinese province alone matches India’s total electricity supply. In contrast, Ohio households are seeing their bills increase by at least $15 this summer just from new data center demand.
China’s Infrastructure Strategy: Overbuilding by Design
China’s approach is simple but effective: build first, optimize later. For decades, the country has deliberately overbuilt every layer of its power infrastructure—generation, transmission, and next-generation nuclear capacity. This isn’t accidental overcapacity; it’s strategic preparation.
The method works because China treats energy planning as a long-term, technocratic exercise. The state coordinates massive infrastructure investments before demand materializes, accepting that not every project will succeed but ensuring capacity exists when needed. It’s the infrastructure equivalent of Amazon’s “Day 1” mentality—always prepare for tomorrow’s scale.
When AI data centers emerged as major power consumers, China didn’t see a crisis. They saw an opportunity to “soak up oversupply,” as Fishman puts it. What the US views as a threatening new load, China treats as a convenient way to utilize idle capacity.
The US Constraint: Short-Term Capital Meets Long-Term Infrastructure
America’s challenge isn’t technical—it’s structural. US infrastructure development relies heavily on private investment, but most investors expect returns within 3-5 years. Power projects, however, can take a decade to build and decades to pay off. This timing mismatch creates a fundamental problem.
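As a rough illustration of that mismatch, the sketch below uses made-up round numbers (the capital cost, construction period, and annual revenue are assumptions chosen for illustration, not figures from any real project) to show how long a power project’s cumulative cash flow can take to turn positive relative to a typical 3-5 year fund horizon:

```python
# Illustrative, assumed figures: a $1B plant, 7 years of construction spend,
# then $80M/year of net revenue once it comes online.
capital_cost = 1_000      # $M, spread evenly over construction
build_years = 7
annual_net_revenue = 80   # $M/year after commissioning

cumulative = 0.0
for year in range(1, 41):
    if year <= build_years:
        cumulative -= capital_cost / build_years   # construction outlays
    else:
        cumulative += annual_net_revenue           # operating cash flow
    if cumulative >= 0:
        print(f"Project breaks even around year {year}")
        break
else:
    print("No payback within 40 years")
```

Even with these fairly generous assumptions, breakeven lands around year 20, which is why capital that needs to be returned in three to five years rarely touches this kind of asset without some form of longer-horizon backing.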
As Fishman notes, Silicon Valley has funneled billions into “the nth iteration of software-as-a-service” while energy projects struggle for funding. The same venture capital ecosystem that can rapidly scale software companies hits a wall when confronting the realities of physical infrastructure.
Meanwhile, regulatory fragmentation adds another layer of complexity. Where China can coordinate nationwide energy policy through centralized planning, the US must navigate state-by-state permitting processes, local opposition, and fragmented market rules.
The AI Implications Are Immediate
This energy infrastructure gap has direct consequences for AI development. McKinsey projects that companies worldwide need to invest $6.7 trillion in new data center capacity between 2025 and 2030. In the US, this massive buildout is already bumping up against grid limitations.
American tech companies are responding creatively but inefficiently. Google, Amazon, and Microsoft are building their own power plants rather than relying on existing grids. It’s a workaround that helps individual companies but doesn’t solve the systemic problem.
China’s AI companies face no such constraints. They can focus resources on model development, hardware optimization, and application deployment rather than basic infrastructure logistics. When your biggest infrastructure concern is how to use excess capacity rather than where to find any capacity at all, you can move faster on everything else.
Competitive Implications: Beyond Just AI
The energy advantage extends beyond immediate AI capabilities. China’s infrastructure surplus gives them flexibility to experiment with energy-intensive AI applications that might be impractical in power-constrained environments. They can afford to run larger models, process more data, and iterate faster on power-hungry research projects.
More importantly, this infrastructure gap affects the geographic distribution of AI development. If the US grid continues to struggle with data center demand, AI companies may increasingly locate their most ambitious projects overseas—not just in China, but in any region with abundant, reliable power.
The cultural framing matters too. In China, renewables are promoted as economically strategic, not morally virtuous. Coal isn’t demonized—it’s simply seen as outdated. This pragmatic approach lets policymakers focus on results rather than political positioning, enabling faster infrastructure decisions.
The Path Forward: Learning from Infrastructure Success
The US doesn’t need to copy China’s governance model, but it does need to address the fundamental mismatch between infrastructure timelines and investment horizons. This might require new financing mechanisms that can bridge the gap between private capital’s expectations and infrastructure reality.
Some solutions are already emerging. Public-private partnerships, infrastructure funds with longer time horizons, and regulatory streamlining could help. The challenge is implementing these solutions at the scale and speed the AI boom demands.
The stakes are substantial. As Fishman bluntly puts it: “The gap in capability is only going to continue to become more obvious—and grow in the coming years.” For an industry where compute capacity often determines competitive advantage, falling behind on energy infrastructure could mean falling behind on AI leadership.
China’s lesson isn’t about central planning versus market economics. It’s about the importance of building infrastructure ahead of demand rather than trying to catch up. In the AI race, the country that solves the power problem first may well determine who leads the technology that shapes the next decade.
The question for US policymakers and tech leaders is straightforward: Can America build the grid that AI needs, or will energy infrastructure become the bottleneck that limits American AI ambitions? Based on current trajectories, China seems confident they already know the answer.