Sam Altman's AI empire will devour as much power as New York City and San Diego combined. Experts say it is "scary"

Imagine New York City on a sweltering summer night: every air conditioner humming, the subway cars running, the towers ablaze with light. Now add San Diego at the peak of a record heat wave, when demand topped 5,000 megawatts and the grid nearly buckled.
That is roughly the amount of electricity that, according to Sam Altman and his partners, will be devoured by their next wave of AI data centers: a single corporate project consuming more power, every day, than two American cities pushed to their limits.
The announcement is a "fundamental moment" that Andrew Chien, professor of computer science at the University of Chicago, says he has long been waiting to see materialize.
"I have been a computer scientist for 40 years, and for most of that time, IT was the smallest piece of energy consumption in our economy," Chien told Fortune. "Now it is becoming a large part of what the whole economy consumes."
He called the change both exciting and alarming.
"It's scary because IT has always been the smallest piece of energy consumption in our economy," he said. "Now it could be 10% or 12% of world power by 2030. We are at a fundamental moment in how we think about AI and its impact on society."
This week, OpenAI announced a plan with Nvidia to build AI data centers consuming up to 10 gigawatts of power, with additional projects totaling 17 gigawatts already in motion. That is roughly equivalent to powering New York City, which uses 10 gigawatts in summer, plus San Diego during the 2024 heat wave, when more than 5 gigawatts were used. Or, as one expert put it, it is close to the total electricity demand of Switzerland and Portugal combined.
"It's quite incredible," said Chien. "A year and a half ago, they were talking about five gigawatts. Now they've upped the ante to 10, 15, even 17. There is an escalation underway."
Fengqi You, a professor of energy systems at Cornell University who also studies AI, agreed.
"Ten gigawatts is more than the peak power demand of Switzerland or Portugal," he told Fortune. "Seventeen gigawatts is like powering the two countries together."
The Texas grid, where Altman broke ground on one of the projects this week, typically handles around 80 gigawatts.
"So you're talking about an amount of power comparable to 20% of the entire Texas grid," said Chien. "That grid serves every other industry: refineries, factories, households. It is a huge amount of power."
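The comparisons above come down to simple arithmetic; a quick sketch using the article's own figures (constant names are illustrative, not from any official dataset):

```python
# Back-of-the-envelope check of the grid comparisons cited in the article.
NYC_SUMMER_GW = 10      # New York City's summer demand
SAN_DIEGO_PEAK_GW = 5   # San Diego's 2024 heat-wave peak
TEXAS_GRID_GW = 80      # typical load on the Texas grid
OPENAI_PLAN_GW = 17     # announced OpenAI projects in motion

# 17 GW exceeds both cities' peaks combined (10 + 5 = 15 GW)
exceeds_cities = OPENAI_PLAN_GW >= NYC_SUMMER_GW + SAN_DIEGO_PEAK_GW

# Share of a typical Texas grid load: 17 / 80 ≈ 21%
texas_share_pct = round(OPENAI_PLAN_GW / TEXAS_GRID_GW * 100)

print(exceeds_cities)    # True
print(texas_share_pct)   # 21, consistent with Chien's "comparable to 20%"
```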
Altman has framed the buildout as necessary to keep pace with surging demand for AI.
"This is what it takes to deliver AI," he said in Texas. The use of ChatGPT, he noted, has increased tenfold in the last 18 months.
What energy source will power it?
Altman has made no secret of his favorite: nuclear. He has backed both fission and fusion startups, betting that only reactors can provide the kind of steady, concentrated output needed to sustain AI's insatiable demand.
"Compute infrastructure will be the basis of the economy of the future," he said, framing nuclear as the backbone of that future.
Chien, however, is frank about the short-term limits.
"As far as I know, the amount of nuclear energy that could be brought onto the grid before 2030 is less than a gigawatt," he said. "So when you hear 17 gigawatts, the numbers just don't add up."
For projects on the scale of OpenAI's 10 or 17 gigawatts, nuclear is "a distant path and a slow ramp, even once you get there." Instead, he expects wind, solar, natural gas, and new storage technologies to dominate.
Fengqi You, the Cornell energy-systems expert, struck a middle ground. He said nuclear power may be inevitable in the long run if AI keeps growing, but warned that "in the short term, there is simply not much spare capacity," whether fossil, renewable, or nuclear. "How can we expand this short-term capacity? It is not clear," he said.
He also warned that the timeline may be unrealistic.
"A typical nuclear power plant takes years to permit and build," he said. "In the short term, they will have to rely on renewables, natural gas, and perhaps upgrades to older plants. Nuclear won't happen fast enough."
Environmental costs
Environmental costs also loom large for these experts.
"We have to face the reality that companies promised they would be clean and net zero, and in the face of AI growth, they probably cannot be," said Chien.
Ecosystems could come under stress, Cornell's You said.
"If data centers consume all the local water or disrupt biodiversity, that creates unintended consequences," he said.
The investment figures are staggering. Each OpenAI site is estimated at around $50 billion, adding up to as much as $850 billion in expected spending. Nvidia alone has pledged up to $100 billion to support the expansion, supplying millions of its new Vera Rubin GPUs.
Chien added that a broader societal conversation is needed about the looming environmental costs of using this much electricity for AI. Beyond carbon emissions, he highlighted the hidden strains on water supplies, biodiversity, and local communities near massive data centers. Cooling alone, he noted, can consume enormous amounts of fresh water in regions already facing scarcity. And because the hardware turns over so quickly, with new Nvidia processors rolling out each year, old chips are constantly discarded, creating streams of e-waste laced with toxic chemicals.
"They told us that these data centers were going to be clean and green," said Chien. "But given the growth of AI, I don't think they can be. Now is the time to hold their feet to the fire."