Artificial intelligence (AI) is driving up energy use faster than generators can supply power, and data centres could become operationally constrained in just two years' time, analyst firm Gartner predicts.
The electricity consumption of data centres is estimated to increase by as much as 160 per cent by 2027, reaching 500 terawatt hours (TWh) per year.
This, Gartner said, is 2.6 times the level of power usage in 2023.
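For readers wanting to sanity-check those figures, they hang together: a 160 per cent increase is the same as a 2.6-times multiple, which implies data centre consumption of roughly 190 TWh in 2023, a baseline the article does not state directly. A minimal sketch of that arithmetic, assuming only the two numbers quoted above:

```python
# Back-of-envelope check of Gartner's figures (the ~192 TWh 2023 baseline is
# implied by the quoted numbers, not stated in the article).
projected_2027_twh = 500      # Gartner's 2027 projection for data centres
multiplier_vs_2023 = 2.6      # "2.6 times the level of power usage in 2023"

implied_2023_twh = projected_2027_twh / multiplier_vs_2023
increase_percent = (multiplier_vs_2023 - 1) * 100

print(f"Implied 2023 consumption: about {implied_2023_twh:.0f} TWh")  # ~192 TWh
print(f"Increase over 2023: {increase_percent:.0f} per cent")         # 160 per cent
```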
Gartner analyst Bob Johnson used the word "explosive" to describe the growth in hyperscale data centres for generative AI (GenAI), adding that it creates "an insatiable demand for power that will exceed the capability of utility providers to expand their capacity fast enough."
"New larger data centres are being planned to handle the huge amounts of data needed to train and implement the rapidly expanding large language models (LLMs) that underpin GenAI applications,” Johnson said.
“However, short-term power shortages are likely to continue for years as new power transmission, distribution and generation capacity could take years to come online and won’t alleviate current problems," he added.
Unfortunately, the rapidly increasing demand for power will hurt zero-carbon sustainability goals, Gartner said.
“The reality is that increased data centre use will lead to increased CO2 emissions to generate the needed power in the short-term,” Johnson said.
“This, in turn, will make it more difficult for data centre operators and their customers to meet aggressive sustainability goals relating to CO2 emissions," he added.
Large cloud and AI operators are already looking to nuclear power to meet energy demand, and Gartner said renewables such as wind and solar cannot meet data centres' requirement for around-the-clock power availability.
Urgently ramping up power production, somehow, isn't the only snag to hit the booming AI industry. OpenAI, which sparked the current AI frenzy by making its ChatGPT application available to the public, is reportedly seeing improvements in the technology slow down and start to plateau.
news: OpenAI's upcoming Orion model shows how GPT improvements are slowing down. It's prompting OpenAI to bake in reasoning and other tweaks after the initial model training phase. pic.twitter.com/VwD1xSbZUv
— Amir Efrati (@amir) November 9, 2024
The fix, it seems, is to bolt reasoning onto LLMs after their training is done.
Comments
HAHAHAHAHAHAHAHAHAHA
I've been pointing this out for years, here. Exponential growth (AI is merely the latest manifestation) on a finite planet was always moot about now; the last Doubling-Time beats you every time.
And the joke is that AI isn't anything useful; it is merely an additional energy demand, at the very time that energy is peaking. It will be used by those who want to kick the algorithms along, to sell us more - Bernays on steroids.
Otherwise? The predicament (collection of problems) facing humanity can be ascertained and best-addressed without even opening an old-school keyboard.
Well, what can I say? The new EV car chargers opened at Tauranga Crossing at a cost of $0.86/kWh, which is about THREE TIMES what I'm paying at home. Seriously, the only way they are going to fly is if you get your own solar system. Next up we will be having blackouts, or you will have limited charging because the national grid cannot cope. Then what?
DC chargers are expensive to build, and don't get used much compared to a petrol pump, because almost everyone charges at home, so they mostly only get used when people are travelling, or if you forget to charge several nights in a row.
So yeah, a few times a year you pay a massively marked up price for some electricity.
Enjoy your $2.50/L petrol, frequent oil changes, brake jobs, and no heating till the engine warms up. Lol.
They should have shut down the Tiwai smelter and let the big tech companies bid for the power for a large data centre.
They are willing to pay more than anyone for power. The cool location is perfect too. Now they are locked into providing cheap power to Rio for the next 20 years. Rio were very smart.
AlphaFold has deduced the structures of 200 million proteins from 1 million species, covering nearly every known protein on the planet. Humans had managed 80,000, and that took years and presumably plenty of computing resources.
It will be something when, using this data, they produce cures for cancer, reverse ageing, and so on.
singautim,
If you didn't already know about PDK's extreme views on population, then you do now. To take his logic further, all work on extending life should halt immediately. Presumably it would make sense to pass laws prohibiting all such medical research. To speed things up, perhaps all existing medicines which help prolong life should be withdrawn, such as the drugs which are keeping my cancer at bay. More extreme measures would of course be necessary to attain a global population of under 2 billion.
Indeed, I think the future will see a bunch of large nuclear plants built, surrounded by data centres. Australia should do it in a few small, low-population spots, like small towns and headlands off the Nullarbor where sea water is abundant for cooling the nuclear plants and data centres.