AI REGS & RISKS | Help! We’re Running Out of Power


By Greg Woolf, AI RegRisk Think Tank

The AI revolution may not end with a killer robot. It may end with a blackout. 

Sam Altman, CEO of OpenAI, summed up the new industrial equation better than any economist: “If you simplify what we do… melt sand, run energy through it, and get intelligence out the other end.” The new race isn’t just about algorithms or data anymore—it’s about electricity. 


For decades, software ate the world because it was cheap to distribute and ran on someone else’s infrastructure. AI breaks that rule. Generating intelligence at scale demands vast compute power and staggering amounts of energy. What’s unfolding right now isn’t a technology story—it’s an energy story. 

From Sand to Intelligence 

OpenAI’s new partnership with Broadcom is more than a chip deal. It’s a declaration of independence from the traditional compute supply chain. Together, they’re co-designing custom silicon, networking, and full data-center systems optimized for AI inference—where models actually think, respond, and generate. 

This isn’t about speeding up chips. It’s about owning the full pipeline of intelligence. By controlling every layer—from transistor to token—OpenAI hopes to extract more intelligence per watt than any off-the-shelf GPU can deliver. It’s vertical integration on a civilizational scale. Altman called it “the biggest joint industrial project in human history.” 

If that sounds familiar, look at Tesla’s new AI5 and AI6 chips, built in partnership with foundries and external vendors and designed not just for vehicle autonomy but for compute-scale applications: converting electricity into thinking. Tesla measures progress in miles per watt. OpenAI measures it in tokens per watt. Both are playing the same game: wringing intelligence out of energy. 

The Gold Rush for Power 

The hyperscalers are rushing into chip design, data-center construction, and private electricity deals because our energy grid can’t keep up. U.S. electricity generation has barely budged since 1999. Meanwhile, China is increasing its capacity nearly fivefold, building power plants, grid lines, and solar farms at breakneck speed. That’s a geopolitical problem hiding in plain sight. 

For the first time since the dawn of the microchip, America’s bottleneck isn’t innovation—it’s infrastructure. America still leads in model quality, but it risks losing the power race that makes those models run. 

Coming Soon to You: Rolling Blackouts 

According to the Department of Energy, peak demand could rise 38 percent by 2030. Meanwhile, 104 gigawatts of power plants are scheduled for retirement and only 22 gigawatts of new firm capacity are planned, a roughly 17 percent net decline from today’s baseline of 463 gigawatts. 
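The arithmetic behind that 17 percent figure is easy to verify. A minimal sketch, using only the DOE estimates quoted above:

```python
# Back-of-envelope check of the capacity figures cited in the article.
baseline_gw = 463   # today's firm capacity baseline, in gigawatts
retiring_gw = 104   # plants scheduled for retirement by 2030
new_firm_gw = 22    # new firm capacity currently planned

net_change_gw = new_firm_gw - retiring_gw            # -82 GW
pct_decline = -net_change_gw / baseline_gw * 100     # share of baseline lost

print(f"Net change: {net_change_gw} GW ({pct_decline:.1f}% decline)")
```

Running this confirms a net loss of 82 gigawatts, or about 17.7 percent of the baseline.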

Add an estimated six to seven trillion dollars of AI-related data-center investment by the end of the decade, and you get an equation that doesn’t balance. The Department of Energy also warns that if those plants retire too quickly without reliable replacements, the country could see up to 800 hours of blackouts a year by 2030. 

Even if we decide today to build enough capacity, infrastructure takes decades. Railroads connected cities in a hundred years. The Internet connected people in thirty. How long will it take to rewire the planet for artificial intelligence? Five years? Ten? Long before the grid catches up, demand will have doubled again. 

The Hyperscaler Oligarchy 

Only a handful of companies—OpenAI, Microsoft, Google, Amazon, Meta, and Tesla—are capable of building this end-to-end infrastructure. They control the chips, the clouds, and soon, the power. The ability to convert electrons into intelligence at scale will determine who benefits from AI’s economic windfall. 

If Google’s search algorithms shaped the last twenty years of human behavior, AI infrastructure will shape the next fifty. Every breakthrough in automation, finance, drug discovery, or defense will route through a hyperscaler’s servers. 

The Hidden Cost of “Free” Intelligence 

Electricity prices are already reflecting the shift. USA Today reports household utility costs have risen 41 percent since 2020, outpacing inflation. Bloomberg found that wholesale electricity prices near major data-center clusters have jumped 267 percent in five years. Utilities are struggling to plan for unpredictable AI demand, while hyperscalers lock in bulk-rate discounts. Consumers end up subsidizing the grid expansion that powers the “free” chatbots and image generators they use. 

Local governments are starting to revolt. Counties in Arizona, Indiana, and Wisconsin have blocked or delayed new data-center projects. Oregon and New Jersey passed laws requiring hyperscalers to pay for grid upgrades instead of pushing those costs onto residents. 

Every time you ask a model to summarize an email, you’re burning a measurable amount of electricity. Multiply that by billions of users, and you begin to see the invisible cost of “free” intelligence. 
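The multiplication is simple to sketch. Note that the per-query energy figure below is an assumed round number for illustration, not a measured value; real per-query costs vary widely by model and hardware:

```python
# Rough illustration only: wh_per_query is an assumed figure for scale,
# not a measurement of any particular model.
wh_per_query = 0.3        # assumed energy per chatbot query, in watt-hours
queries_per_day = 1e9     # assumed one billion queries per day, worldwide

daily_mwh = wh_per_query * queries_per_day / 1e6   # convert Wh to MWh

print(f"~{daily_mwh:,.0f} MWh per day")
```

Under these assumptions, "free" chatbot queries draw on the order of 300 megawatt-hours every day, before counting training runs or image generation.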

Are We Breaking the Social Contract? 

Technology is supposed to make life better for everyone. The promise of AI was higher productivity, smarter healthcare, better education, faster discovery—yet the benefits increasingly flow to the companies that can afford the most compute. The rest of us just pay the power bill. 

That imbalance is unsustainable. If AI is going to consume an ever-growing share of the world’s electricity, it has to give something back—real value that improves lives, not just convenience and shareholder returns. Otherwise, the same public that once embraced the Internet as liberation will see AI as extraction. Governments will respond with regulation, communities with opposition, and consumers with resentment. 

Conclusion: The Power Lords 

The new AI economy is a vertical-integration play from silicon to civilization. Its scarce resource isn’t data or code—it’s electricity. Whoever controls the electrons controls the intelligence, the automation, and ultimately, the economy. Consumers won’t have much say in that outcome; they’ll be beholden to a few companies that literally power the future. Maybe Elon was right. If this is the new power game, I’m booking the next commercial flight to Mars. 

Author’s Note: This article was researched using an AI model, contributing in my own small way to the energy problem.


Greg Woolf is an accomplished innovator and AI strategist with over 20 years of experience in founding and leading AI and data analytics companies. Recognized for his visionary leadership, he has been honored as AI Global IT-CEO of the Year, received the FIMA FinTech Innovation Award, and was a winner of an FDIC Tech Sprint. Currently, he leads the AI Reg-Risk™ Think Tank, advising financial institutions, FinTech companies, and government regulators on leveraging AI within the financial services industry. https://airegrisk.com