OpenAI CEO Sam Altman just announced plans for artificial intelligence infrastructure that will draw a staggering 17 gigawatts of power – more electricity than New York City and San Diego use together on their busiest days. This massive demand has experts calling it a “seminal moment” that could reshape how the world thinks about AI’s impact on society.
Understanding the Massive Scale
To put 17 gigawatts in perspective, New York City typically uses about 10 gigawatts during peak summer demand, when every air conditioner is running and the subway system is at full capacity. San Diego’s record electricity demand reached just over 5 gigawatts during an intense 2024 heat wave that pushed the regional grid to its limits.
Combined, these two major American cities consume about 15 gigawatts at their absolute peak – still less than what Altman’s AI infrastructure will require continuously, 24 hours a day, 365 days a year.
Andrew Chien, a computer science professor at the University of Chicago, told Fortune that this represents roughly 20% of the entire Texas electrical grid, which normally runs around 80 gigawatts to power all the state’s refineries, factories, and households.
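Treating the figures above as rough inputs, a short back-of-the-envelope sketch shows how the comparisons line up; every value is an approximation from the reporting, not an official number:

```python
# Back-of-the-envelope comparison of the planned AI load against the
# city and grid figures cited above. All values are approximate.

AI_LOAD_GW = 17.0         # planned continuous AI infrastructure load
NYC_PEAK_GW = 10.0        # New York City peak summer demand
SAN_DIEGO_PEAK_GW = 5.0   # San Diego record demand, 2024 heat wave
TEXAS_GRID_GW = 80.0      # typical statewide Texas load cited above

city_peaks = NYC_PEAK_GW + SAN_DIEGO_PEAK_GW
print(f"NYC + San Diego at peak: {city_peaks:.0f} GW")               # 15 GW
print(f"AI load vs. city peaks:  {AI_LOAD_GW / city_peaks:.2f}x")    # ~1.13x
print(f"Share of Texas grid:     {AI_LOAD_GW / TEXAS_GRID_GW:.0%}")  # ~21%

# Because the AI load runs around the clock, its annual energy use far
# exceeds what the two cities draw for a few peak hours:
HOURS_PER_YEAR = 8760
print(f"Annual energy: {AI_LOAD_GW * HOURS_PER_YEAR / 1000:.0f} TWh")  # ~149 TWh
```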
Why AI Needs So Much Power
The enormous energy demand comes from the vast clusters of specialized chips inside the data centers needed to train and run advanced AI models. Altman’s plan involves building multiple “AI factories” that can produce a gigawatt of new AI infrastructure every single week – with each gigawatt enough to power nearly 900,000 homes.
These AI systems require thousands of high-powered graphics processing units (GPUs) working together simultaneously. Nvidia CEO Jensen Huang explained that the 10 gigawatts of Nvidia hardware planned for the project corresponds to between 4 million and 5 million GPUs running at full capacity.
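Both figures – nearly 900,000 homes per gigawatt and 4 to 5 million GPUs for 10 gigawatts – are easy to sanity-check. The sketch below uses assumed per-unit power draws; they are illustrative averages, not numbers from the announcement:

```python
# Plausibility check for the "nearly 900,000 homes" and "4-5 million
# GPUs" figures. Per-unit draws are assumptions, not official numbers.

FACTORY_OUTPUT_W = 1e9      # one gigawatt of new capacity per week, in watts
WATTS_PER_HOME = 1_100      # assumed average US household draw
print(f"Homes per weekly gigawatt: {FACTORY_OUTPUT_W / WATTS_PER_HOME:,.0f}")
# -> ~909,091, consistent with "nearly 900,000 homes"

NVIDIA_DEPLOYMENT_W = 10e9  # the 10 gigawatts cited by Jensen Huang
WATTS_PER_GPU = 2_200       # assumed draw per GPU incl. cooling/overhead
print(f"Implied GPU count: {NVIDIA_DEPLOYMENT_W / WATTS_PER_GPU / 1e6:.1f} million")
# -> ~4.5 million, in the middle of Huang's 4-5 million range
```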
Unlike conventional software, AI models like ChatGPT need continuous power to process millions of user requests simultaneously, even as OpenAI trains successor models alongside them. The computational requirements have grown steeply – ChatGPT usage has increased tenfold over the past 18 months, forcing OpenAI to scale up dramatically.
The Growing AI Energy Crisis
Computer science experts warn that AI could consume 10-12% of the world’s total electricity by 2030, up from a virtually negligible share just a few years ago. This would represent a fundamental shift in global energy consumption patterns.
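To make that percentage concrete, a rough conversion helps; the world generation figure below is an assumed approximation (about 30,000 terawatt-hours per year), not a number from the reporting:

```python
# Convert the "10-12% of world electricity" warning into absolute terms.
# World generation is an assumed approximation (~30,000 TWh/yr).

WORLD_GENERATION_TWH = 30_000
HOURS_PER_YEAR = 8760

for share in (0.10, 0.12):
    twh_per_year = WORLD_GENERATION_TWH * share
    avg_gw = twh_per_year * 1000 / HOURS_PER_YEAR  # TWh/yr -> average GW
    print(f"{share:.0%}: {twh_per_year:,.0f} TWh/yr (~{avg_gw:,.0f} GW continuous)")
# 10%: 3,000 TWh/yr (~342 GW); 12%: 3,600 TWh/yr (~411 GW) --
# roughly twenty times the 17-gigawatt plan discussed here.
```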
Fengqi You, an energy systems professor at Cornell University, noted that 17 gigawatts is roughly what it takes to power Switzerland and Portugal combined – entire countries’ worth of electricity for a single company’s AI projects.
The scale is so massive that each individual data center site costs approximately $50 billion to build, and the total $850 billion infrastructure plan represents nearly half of the $2 trillion global AI buildout forecast by major banks.
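Those cost figures are internally consistent, as a quick check using the article’s own numbers shows:

```python
# Check that the quoted cost figures hang together.

TOTAL_PLAN_USD = 850e9        # total infrastructure plan
COST_PER_SITE_USD = 50e9      # approximate cost per data center site
PLANNED_GW = 17               # headline capacity target
GLOBAL_BUILDOUT_USD = 2e12    # bank-forecast global AI buildout

print(f"Implied number of sites: {TOTAL_PLAN_USD / COST_PER_SITE_USD:.0f}")    # 17
print(f"Cost per gigawatt: ${TOTAL_PLAN_USD / PLANNED_GW / 1e9:.0f} billion")  # ~$50B
print(f"Share of global buildout: {TOTAL_PLAN_USD / GLOBAL_BUILDOUT_USD:.1%}") # 42.5%
```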
Infrastructure Challenges Ahead
Building this much AI infrastructure faces enormous practical hurdles. The electrical grid in most countries wasn’t designed to handle such concentrated power demands. Even in Texas, where Altman broke ground on his first mega data center, adding 17 gigawatts represents a significant strain on existing infrastructure.
The power requirements are so intense that experts compare them to running 17 nuclear power plants or nine Hoover Dams continuously. Finding reliable, clean sources for this much electricity poses major environmental and logistical challenges.
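Both comparisons hold up as rough arithmetic; the plant capacities in the sketch below are typical values assumed for illustration:

```python
# Rough check of the power-plant comparisons. Capacities are typical
# assumed values, not figures from the reporting.

AI_LOAD_GW = 17.0
REACTOR_GW = 1.0       # typical large nuclear reactor
HOOVER_DAM_GW = 2.08   # Hoover Dam nameplate capacity (approx.)

print(f"Equivalent nuclear plants: {AI_LOAD_GW / REACTOR_GW:.0f}")    # 17
print(f"Equivalent Hoover Dams:    {AI_LOAD_GW / HOOVER_DAM_GW:.1f}") # ~8.2
# Hoover Dam's average output runs well below its nameplate capacity,
# so matching 17 GW around the clock would take more than nine such dams.
```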
Energy experts point out that while nuclear power could theoretically meet this demand, new nuclear plants take many years to build and would represent “a slow ramp” even when approved.
What This Means Globally
For developing countries, this AI energy boom creates both opportunities and concerns. The massive infrastructure investments could drive down AI costs eventually, making advanced technology more accessible worldwide. However, the enormous power consumption raises questions about global energy equity and environmental impact.
The concentration of so much energy consumption in AI also means that countries with abundant, cheap electricity may become the new centers of technological power, potentially reshaping global economic relationships.
As one expert noted, this represents “some seminal moments for how we think about AI and its impact on society” – forcing difficult decisions about energy priorities in an era of climate change.
What do you think about AI systems consuming more electricity than entire cities? Should there be limits on how much energy artificial intelligence can use?