A recently published report from Epoch AI contests earlier findings suggesting that ChatGPT consumes about 10 times more power per query than a Google search. The new report argues instead that ChatGPT uses far less energy than previously thought.
According to the report, generating a response with ChatGPT on GPT-4o requires just 0.3 watt-hours of energy. Joshua You, a data analyst at Epoch AI, cited this figure in an interview with TechCrunch.
He added that this level of energy use isn't something to worry too much about when set against everyday household appliances, heating and cooling a home, or driving a car.
The data analyst explained that earlier studies of ChatGPT's energy usage relied on outdated research and inflated figures. The widely accepted estimates, he suggested, assumed that OpenAI was running its AI models on aged, inefficient hardware.
According to You:
> Additionally, it was pointed out by a few of my peers that the commonly cited figure of 3 watt-hours per query might have originated from research conducted quite some time ago. A quick calculation suggests this value could potentially be overestimated.
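The "quick calculation" is simple enough to sketch. The snippet below compares the two figures from the report (the older 3 Wh estimate and Epoch AI's revised 0.3 Wh) and puts the revised figure in household terms; the 10 W LED bulb is an illustrative assumption, not something from the report.

```python
# Back-of-envelope comparison of the two ChatGPT per-query energy figures.
# 3.0 Wh is the older, commonly cited estimate; 0.3 Wh is Epoch AI's
# revised estimate, both as reported above.
OLD_ESTIMATE_WH = 3.0
NEW_ESTIMATE_WH = 0.3

overestimate_factor = OLD_ESTIMATE_WH / NEW_ESTIMATE_WH
print(f"Old figure is {overestimate_factor:.0f}x the revised estimate")

# Put 0.3 Wh in household terms using a 10 W LED bulb (assumed wattage):
# energy (Wh) / power (W) gives hours, times 60 gives minutes.
LED_BULB_WATTS = 10
minutes_equivalent = NEW_ESTIMATE_WH / LED_BULB_WATTS * 60
print(f"0.3 Wh ~ running a 10 W LED bulb for {minutes_equivalent:.1f} minutes")
```

Under these assumptions, a single ChatGPT query costs about as much energy as lighting an LED bulb for under two minutes, which is why the report treats the older 3 Wh figure as an order-of-magnitude overestimate.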
For clarity, Epoch AI's energy estimate for ChatGPT is a rough approximation rather than an exact figure, since it does not account for features beyond text generation, such as the chatbot's image generation capabilities.
ChatGPT will get more power-hungry as OpenAI leans on reasoning models
It's worth noting that the data analyst doesn't expect ChatGPT's baseline energy usage to rise; however, as these models grow in complexity, they may demand more power.
This is particularly relevant now because leading AI labs like OpenAI are shifting their focus toward reasoning models, which spend more compute working through complex problems and therefore consume more energy per response.
It’s increasingly clear that as artificial intelligence becomes widely used and sophisticated, it requires a significant amount of electricity, funds, and water to operate effectively.
For some time now, various sources have reported that systems like Microsoft Copilot and ChatGPT consume the equivalent of a bottle of water in cooling to produce a single answer. This follows earlier reports that Microsoft and Google each use more electricity than over 100 countries combined.
A more recent account found that OpenAI's GPT-3 model actually uses four times more water than initially estimated, and that GPT-4 requires the equivalent of three bottles of water to produce just 100 words. As AI models grow in sophistication, they tend to demand more power and resources. Nevertheless, ChatGPT itself may not be quite as power-hungry as once believed.
2025-02-12 21:44