Packing Too Much Power? AI Puts a Huge Drain on Power Grids, Experts Say


In the rush to advance AI technology and uncover more of what it can do for the world, a concern is growing beyond the dilemma of deepfakes and erroneous content: how much power does AI actually consume?

It seems no one can, or is willing to, answer that question definitively. But it's commonly understood that AI demands energy, and a lot of it.

Estimates are hard to come by since machine-learning models can be configured in many different ways that consume different amounts of power. The question is further obscured by the tight-lipped approach of companies like Meta, Microsoft, and OpenAI, which simply aren't telling.

Says Sasha Luccioni, a researcher at the French-American company Hugging Face, one of the challenges is that companies have become more secretive as AI grows more profitable. Just a few years ago, companies like OpenAI published reports on their training regimens, but that stopped with the latest generation of models such as ChatGPT and GPT-4.

According to Luccioni, who has published several papers on AI energy use, the secrecy is likely driven partly by competition between companies and partly by a desire to avoid criticism.

For example, training a model is particularly energy intensive. For a large language model like GPT-3, the process is estimated to consume nearly 1,300 megawatt hours (MWh) of electricity, comparable to the amount used by 130 US homes in an entire year. Put another way, if streaming an hour of Netflix consumes about 0.8 kilowatt hours of electricity, you would have to watch 1,625,000 hours of Netflix to use the power needed to train GPT-3.
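For readers who want to check those figures, here is a minimal back-of-the-envelope sketch in Python. The 0.8 kilowatt-hour streaming figure comes from the paragraph above, while the roughly 10,000 kWh average annual consumption of a US home is an assumption used only for illustration.

```python
# Back-of-the-envelope check of the GPT-3 training comparisons above.
GPT3_TRAINING_MWH = 1_300        # reported estimate for training GPT-3
NETFLIX_KWH_PER_HOUR = 0.8       # streaming energy per hour (figure from the text)
US_HOME_KWH_PER_YEAR = 10_000    # assumed average annual US household use

training_kwh = GPT3_TRAINING_MWH * 1_000   # convert MWh to kWh

netflix_hours = training_kwh / NETFLIX_KWH_PER_HOUR
homes_for_a_year = training_kwh / US_HOME_KWH_PER_YEAR

print(f"Equivalent Netflix hours: {netflix_hours:,.0f}")          # ~1,625,000
print(f"Equivalent US homes for a year: {homes_for_a_year:,.0f}")  # ~130
```

The round numbers are only meant to show that the two comparisons are consistent with the 1,300 MWh estimate.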

Training is just the first part of the picture. After a system is created, it is rolled out to the public in a process called inference. Fortunately, inference is not as voracious an energy consumer as training, but it is still significant. In lay terms, according to a research paper published by Carnegie Mellon, the power needed to use AI to generate just a single image is nearly equal to the energy it takes to charge your smartphone.

Other estimates from the International Energy Agency suggest that by 2027, electricity usage from the expansion of AI and cryptocurrency could rival the demand of an entire country such as Sweden or Germany. But analysts are quick to point out that there are many variables we don't know and can't quantify definitively.

Says Luccioni, "The generative AI revolution comes with a planetary cost that is completely unknown to us, and the spread for me is particularly indicative. The truth is we just don't know."
