New research says ChatGPT likely consumes '10 times less' energy than we initially thought, making it about the same as Google search

OpenAI logo displayed on a phone screen and ChatGPT website displayed on a laptop screen are seen in this illustration photo taken in Krakow, Poland on December 5, 2022.
(Image credit: Jakub Porzycki/NurPhoto via Getty Images)

It's easy to slate AI in all its manifestations (trust me, I should know, I do so often enough), but some recent research from Epoch AI (via TechCrunch) suggests we might be a little hasty if we're trashing its energy use. Yes, that's the same Epoch AI that recently dropped a new, difficult math benchmark for AI. According to Epoch AI, ChatGPT likely consumes just 0.3 Wh of electricity per query, "10 times less" than the popular older estimate, which put the figure at about 3 Wh.

Given that a Google search amounts to about 0.0003 kWh (0.3 Wh) of energy consumption per search, and based on the older 3 Wh estimate, Alphabet Chairman John Hennessy said two years ago that an LLM exchange would probably cost 10 times more energy than a Google search. If Epoch AI's new estimate is correct, it seems a typical GPT-4o interaction actually consumes about the same amount of energy as a Google search.
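
The arithmetic is straightforward once everything is in the same unit. Here's a quick sanity check of those figures (these are just the numbers quoted above, restated in watt-hours):

```python
# Quick sanity check of the figures above, all converted to watt-hours.
google_search_wh = 0.0003 * 1000  # 0.0003 kWh per search = 0.3 Wh
chatgpt_old_wh = 3.0              # the older per-query estimate
chatgpt_new_wh = 0.3              # Epoch AI's new per-query estimate

print(f"Old estimate vs a search: {chatgpt_old_wh / google_search_wh:.0f}x")  # 10x
print(f"New estimate vs a search: {chatgpt_new_wh / google_search_wh:.0f}x")  # 1x
```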

Server energy use isn't something that tends to cross most people's minds while using a cloud service—the 'cloud' is so far removed from our homes that it seems a little ethereal. I know I often forget there are any additional energy costs at all, other than what my own device consumes, when using ChatGPT.

Thankfully I'm not a mover or a shaker in the world of energy policy, because of course LLM interactions consume energy. Let's not forget how LLMs work: they're trained on shedloads of data (consuming shedloads of energy in the process), and even once they've been trained and are just responding to us, they still have to run a gigantic model to process even simple instructions or queries. That's the nature of the beast. And that beast needs feeding energy to keep it up and running.

It's just that, apparently, it's less energy per interaction than we might originally have thought. As Epoch AI puts it: "For context, 0.3 watt-hours is less than the amount of electricity that an LED lightbulb or a laptop consumes in a few minutes. And even for a heavy chat user, the energy cost of ChatGPT will be a small fraction of the overall electricity consumption of a developed-country resident."
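
For a rough sense of scale, here's what that comparison works out to, assuming a 10 W LED bulb and a 50 W laptop (my own ballpark wattages, not Epoch AI's):

```python
# Rough scale check on the comparison above. Wattages are my own assumptions.
led_bulb_watts = 10    # a typical household LED bulb
laptop_watts = 50      # a laptop under moderate load
query_wh = 0.3         # Epoch AI's per-query estimate

print(f"LED bulb: {query_wh / led_bulb_watts * 60:.1f} minutes to use 0.3 Wh")  # ~1.8 min
print(f"Laptop:   {query_wh / laptop_watts * 60:.2f} minutes to use 0.3 Wh")    # ~0.36 min
```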

Epoch AI explains that there are a few differences between how it worked out this new estimate and how the original 3 Wh estimate was calculated. Essentially, the new estimate uses a "more realistic assumption for the number of output tokens in a typical chatbot usage", whereas the original estimate assumed output equivalent to about 1,500 words on average (tokens are essentially units of text, such as a word or part of a word). The new estimate also assumes servers running at just 70% of peak power, with the computation performed on a newer chip (Nvidia's H100 rather than an A100).
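
To make that methodology a little more concrete, the back-of-envelope calculation has roughly this shape: estimate the floating-point operations needed to generate a reply, divide by how fast the hardware actually churns through them, and multiply by the power the server draws while doing it. Every number below is my own illustrative assumption (GPT-4o's parameter count, for one, isn't public), not Epoch AI's actual inputs; the point is the shape of the sum, not the exact figures:

```python
# A rough, illustrative per-query energy estimate. All values are assumptions
# chosen for the sake of the example, not Epoch AI's actual inputs.
active_params = 100e9        # assumed active parameters used per generated token
output_tokens = 500          # assumed length of a typical reply
flops_per_token = 2 * active_params   # ~2 FLOPs per parameter per generated token
total_flops = flops_per_token * output_tokens

h100_peak_flops = 990e12     # Nvidia H100 dense BF16 peak, FLOP/s
compute_utilization = 0.10   # assumed fraction of that peak achieved during inference
gpu_seconds = total_flops / (h100_peak_flops * compute_utilization)

server_watts = 1500          # assumed server power attributable to one H100
power_fraction = 0.70        # running at 70% of peak power, as per the new estimate

energy_wh = gpu_seconds * server_watts * power_fraction / 3600  # joules -> watt-hours
print(f"~{energy_wh:.2f} Wh per query")  # ~0.29 Wh with these assumptions
```

Swap in a bigger model, a much longer reply, or a different utilization figure and the answer moves quickly, which is exactly the uncertainty Epoch AI goes on to flag.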

All these changes, which seem reasonable to my eyes and ears, paint a picture of a much less power-hungry ChatGPT. However, Epoch AI points out that "there is a lot of uncertainty here around both parameter count, utilization, and other factors". Longer queries, for instance, could increase energy consumption "substantially to 2.5 to 40 watt-hours", it says.

It's a complicated story, but should we expect any less? In fact, let me muddy the waters a little more for us.

We also need to consider the potential benefits of AI for energy consumption. A productive technology doesn't exist in a vacuum, after all. For instance, use of AI such as ChatGPT could help bring about breakthroughs in energy production that decrease energy use across the board. And AI could increase productivity in ways that reduce energy use elsewhere; for example, a manual task that would have required you to keep your computer turned on and drawing power for 10 minutes might be done in one minute with the help of AI.

On the other hand, there's the cost of AI training to consider. But on the peculiar third hand (where did that come from?), the returns from ever-larger LLM training runs seem to be plateauing, which means there might be less large-scale training going forwards. Plus, aren't there always additional variables? With Google search, for instance, there's the presumed cost of constant web indexing and so on, not just the search interaction and results page generation.

In other words, it's a complicated picture, and as with all technologies, AI probably shouldn't be looked at in a vacuum. Apart from its place on the mathematician's paper, energy consumption is never an isolated variable. Ultimately, what we care about is the health and productivity of the entire system, the economy, society, and so on. As always, such debates require consideration of multi-multi-variate equations in a cost-benefit analysis, and it's difficult to get the full picture, especially when much of that picture depends on an uncertain future.

Which somewhat defines the march of capitalism, does it not? The back and forth 'but actually' that characterises these discussions gets trampled under the boots of the technology which marches ahead regardless.

And ultimately, while this new 0.3 Wh estimate is certainly a pleasant development, it's still just an estimate, and Epoch AI is very clear about this: "More transparency from OpenAI and other major AI companies would help produce a better estimate." More transparency would be nice, but I won't hold my breath.

Jacob Fox
Hardware Writer

Jacob got his hands on a gaming PC for the first time when he was about 12 years old. He swiftly realised the local PC repair store had ripped him off with his build and vowed never to let another soul build his rig again. With this vow, Jacob the hardware junkie was born. Since then, Jacob's led a double-life as part-hardware geek, part-philosophy nerd, first working as a Hardware Writer for PCGamesN in 2020, then working towards a PhD in Philosophy for a few years (result pending a patiently awaited viva exam) while freelancing on the side for sites such as TechRadar, Pocket-lint, and yours truly, PC Gamer. Eventually, he gave up the ruthless mercenary life to join the world's #1 PC Gaming site full-time. It's definitely not an ego thing, he assures us.
