Shockingly, ChatGPT doesn't consume as much power as previously thought — A new study reveals the stats were based on "napkin math" with the assumption that OpenAI powers next-gen models with dated GPUs


ChatGPT has been widely reported to consume approximately 3 watt-hours of power to answer a single query, roughly 10 times the energy of an average Google search. A new report by Epoch AI challenges those figures as an overestimate, indicating that the OpenAI chatbot uses far less power than previously assumed.

According to the report, ChatGPT running GPT-4o consumes only about 0.3 watt-hours per query. Speaking to TechCrunch, Joshua You, a data analyst at Epoch AI, said:

“The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car.”
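The gap between the two estimates can be sketched with the same kind of napkin math the report describes. A minimal Python sketch, noting that the per-search Google figure is simply implied by the "10x" claim rather than independently measured, and the 100-queries-per-day volume is a made-up illustration:

```python
# Napkin math comparing the old and revised per-query energy estimates.
OLD_ESTIMATE_WH = 3.0   # widely reported ChatGPT estimate, Wh per query
NEW_ESTIMATE_WH = 0.3   # Epoch AI's revised GPT-4o estimate, Wh per query

# The old estimate was said to be ~10x an average Google search,
# so the implied Google figure is an assumption derived from that ratio.
GOOGLE_SEARCH_WH = OLD_ESTIMATE_WH / 10

ratio_old_to_new = OLD_ESTIMATE_WH / NEW_ESTIMATE_WH
print(f"Old estimate is {ratio_old_to_new:.0f}x the revised figure")

# At the revised figure, a hypothetical 100 queries per day works out to:
daily_wh = 100 * NEW_ESTIMATE_WH
yearly_kwh = daily_wh / 1000 * 365
print(f"100 queries/day is about {daily_wh:.0f} Wh, "
      f"or roughly {yearly_kwh:.1f} kWh per year")
```

Even at the heavier hypothetical usage above, the yearly total is a small fraction of what a typical home appliance draws, which is the point You makes in the quote.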

You said his analysis of ChatGPT's power consumption was prompted by outdated research and inflated estimates. The widely cited figures, he explained, rested on the assumption that OpenAI was running its AI models on older, less efficient chips.

According to You:

“Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”

That said, Epoch AI's own estimate isn't set in stone either: it, too, is an approximation, and it excludes energy-intensive capabilities like the chatbot's image generation.

ChatGPT will get more power-hungry as OpenAI leans on reasoning models


Perhaps more interestingly, the data analyst indicated that while today's consumption is modest, he expects ChatGPT's power draw to rise as the models become more advanced.

This is especially true as top AI labs, including OpenAI, lean further into reasoning models, which spend more compute working through hard problems and, in turn, require more energy.

As generative AI rapidly advances and gains broad adoption, it's becoming more apparent that it demands an exorbitant amount of electricity, money, and water to run.

Over the past few years, multiple reports have indicated that Microsoft Copilot and ChatGPT consume roughly a bottle of water for cooling every time they generate a response. That followed a previous report indicating that Microsoft and Google's combined power consumption surpasses the electricity usage of over 100 countries.

More recently, a separate report detailed that OpenAI's GPT-3 model consumes four times more water than previously thought, while GPT-4 uses up to three bottles of water to generate a mere 100 words. AI models seemingly grow more power- and resource-hungry as they advance. As it now seems, however, ChatGPT might not be as power-hungry as previously thought.

Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.