
Is AI’s Demand for Energy Really ‘Insatiable’?

Bloomberg and The Washington Post “claim AI power usage is dire,” writes Slashdot reader NoWayNoShapeNoForm. But Ars Technica “begs to disagree with those speculations.”

From Ars Technica’s article:

The high-profile pieces lean heavily on recent projections from Goldman Sachs and the International Energy Agency (IEA) to cast AI’s “insatiable” demand for energy as an almost apocalyptic threat to our power infrastructure. The Post piece even cites anonymous “some [people]” in reporting that “some worry whether there will be enough electricity to meet [the power demands] from any source.” Digging into the best numbers and projections available, though, it’s hard to see AI’s current and near-future environmental impact in such a dire light… While the headline focus of both Bloomberg and The Washington Post’s recent pieces is on artificial intelligence, the actual numbers and projections cited in both pieces overwhelmingly focus on the energy used by Internet “data centers” as a whole…

Bloomberg asks one source directly “why data centers were suddenly sucking up so much power” and gets back a blunt answer: “It’s AI… It’s 10 to 15 times the amount of electricity.” Unfortunately for Bloomberg, that quote is followed almost immediately by a chart that heavily undercuts the AI alarmism. That chart shows worldwide data center energy usage growing at a remarkably steady pace from about 100 TWh in 2012 to around 350 TWh in 2024. The vast majority of that energy usage growth came before 2022, when the launch of tools like DALL-E and ChatGPT largely set off the industry’s current mania for generative AI. If you squint at Bloomberg’s graph, you can almost see the growth in energy usage slowing down a bit since that momentous year for generative AI.
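(For context, and taking the chart’s endpoints at face value: growth from about 100 TWh in 2012 to about 350 TWh in 2024 works out to roughly 11 percent compounded per year, since (350/100)^(1/12) ≈ 1.11. That is steady long-term growth rather than a sudden AI-driven spike.)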

Ars Technica first cites Dutch researcher Alex de Vries’s estimate that, within a few years, the AI sector could consume between 85 and 134 TWh of electricity annually. But another study estimated in 2018 that PC gaming already accounted for 75 TWh of electricity use per year, while “the IEA estimates crypto mining ate up 110 TWh of electricity in 2022.”

More to the point, de Vries’s AI energy estimates are only a small fraction of the 620 to 1,050 TWh that data centers as a whole are projected to use by 2026, according to the IEA’s recent report. The vast majority of all that data center power will still go to the more mundane Internet infrastructure that we all take for granted (and which is not nearly as sexy a headline bogeyman as “AI”).
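(For a rough sense of scale, taking both ranges at face value: even de Vries’s high-end figure of 134 TWh would be about 22 percent of the IEA’s low-end 620 TWh projection, since 134 / 620 ≈ 0.216, and closer to 13 percent of the 1,050 TWh high end, since 134 / 1,050 ≈ 0.128.)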

The future is also hard to predict, the article concludes. “If customers don’t respond to the hype by actually spending significant money on generative AI at some point, the tech-marketing machine will largely move on, as it did very recently with the metaverse and NFTs…”

Read more of this story at Slashdot.
