Napkin Math

A recently published report from Epoch AI contests earlier findings suggesting that ChatGPT consumes about 10 times more power per query than Google Search. According to the new report, ChatGPT uses considerably less energy than previously thought.

According to the report, generating a response with ChatGPT running on GPT-4o consumes just 0.3 watt-hours of energy, a figure Joshua You, a data analyst at Epoch AI, cited in an interview with TechCrunch.

You argued that this amount of energy isn't something to worry too much about when compared with everyday household appliance use, heating and cooling a home, or driving a car.
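To put the 0.3 watt-hour figure in perspective, here is a minimal napkin-math sketch. The appliance and driving figures below are rough, assumed values chosen for illustration, not numbers from the report:

```python
# Napkin math: how 0.3 Wh per ChatGPT query compares to everyday energy use.
# All appliance/driving figures are rough assumptions for illustration only.

QUERY_WH = 0.3  # Epoch AI's estimated energy per GPT-4o query

everyday_wh = {
    "10 W LED bulb running for 1 hour": 10.0,
    "1,100 W microwave running for 5 minutes": 1100.0 * 5 / 60,
    "Driving an EV 1 mile (assuming ~250 Wh/mi)": 250.0,
}

for activity, wh in everyday_wh.items():
    print(f"{activity}: ~{wh:.0f} Wh, or roughly {wh / QUERY_WH:.0f} queries")
```

On these assumptions, an hour of a single LED bulb is worth about 33 queries, and a mile of EV driving about 830, which is the comparison You is gesturing at.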

The data analyst explained that his study of ChatGPT's energy usage was prompted by outdated research and inflated figures. The widely cited estimates of ChatGPT's power consumption, he suggested, assumed that OpenAI was running its AI models on older, less efficient hardware.

According to You:

Additionally, a few of my peers pointed out that the commonly cited figure of 3 watt-hours per query originated from fairly old research, and some quick calculations suggested it was likely an overestimate.
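The "quick calculation" in question is the kind of back-of-the-envelope estimate where the answer swings an order of magnitude depending on the hardware assumptions. The sketch below is purely illustrative; every number in it is an assumption, not a figure from Epoch AI's actual analysis:

```python
# Illustrative napkin math showing how hardware assumptions move the estimate.
# All inputs (GPU wattage, GPU count, latency, batching) are assumptions for
# demonstration, not the real figures behind either the 3 Wh or 0.3 Wh number.

def wh_per_query(gpu_power_w, num_gpus, seconds_per_query, concurrent_queries):
    """Energy attributed to one query when the hardware serves a batch."""
    total_watts = gpu_power_w * num_gpus
    joules_per_query = total_watts * seconds_per_query / concurrent_queries
    return joules_per_query / 3600  # watt-seconds -> watt-hours

# Older-style assumption: less efficient GPUs, essentially no batching.
old = wh_per_query(gpu_power_w=300, num_gpus=8,
                   seconds_per_query=5, concurrent_queries=1)

# Newer-style assumption: higher-wattage but heavily shared modern GPUs.
new = wh_per_query(gpu_power_w=700, num_gpus=8,
                   seconds_per_query=5, concurrent_queries=32)

print(f"older-style estimate: ~{old:.1f} Wh, newer-style: ~{new:.2f} Wh")
```

With these made-up inputs the older-style assumptions land near 3 Wh and the newer-style ones near 0.3 Wh, which shows how the two headline figures can both fall out of the same arithmetic.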

For clarity, Epoch AI's energy estimate for ChatGPT is a rough approximation rather than an exact figure, as it does not account for more demanding features such as the chatbot's image generation.

ChatGPT will get power-hungry as OpenAI leans more on reasoning models

It's worth noting that the data analyst does not expect ChatGPT's baseline energy usage per query to rise; however, as the models grow in complexity, they may demand more power.

This is particularly relevant now because leading AI labs like OpenAI are shifting toward reasoning models, which spend more compute working through complex problems and consequently require more energy.

It's increasingly clear that as artificial intelligence becomes more widely used and sophisticated, it requires significant amounts of electricity, funding, and water to operate.

For some time, reports have suggested that systems like Microsoft Copilot and ChatGPT require the equivalent of a bottle of water for cooling to answer a single question. That followed reports that Microsoft and Google each use more electricity than the combined power consumption of over 100 countries.

A newer account found that OpenAI's GPT-3 model actually uses four times more water than initially estimated, and that GPT-4 requires the equivalent of three water bottles to produce just 100 words. As AI models grow in sophistication, they tend to demand more power and resources. Even so, ChatGPT may not be quite as power-hungry as once believed.


2025-02-12 21:44