A new study reveals OpenAI’s dated GPT-3 uses more water than previously assumed — ChatGPT needs up to four times more water to quench its thirst for cooling data centers

What you need to know

  • A new study reveals that OpenAI’s ChatGPT running GPT-3 uses up to four times more water than previously thought.
  • A separate study indicated that OpenAI’s GPT-4 model, which has been described as “mildly embarrassing at best,” uses up to three bottles of water to generate a mere 100 words.
  • Generative AI’s high demand for electricity and water is undermining large tech corporations’ sustainability and environmental goals.

As a researcher with a keen interest in AI and sustainability, I find myself constantly torn between excitement for technological advancements and concern for our planet. The recent revelation that OpenAI’s ChatGPT uses up to four times more water for cooling than previously thought is troubling.

Although the fast-paced evolution and widespread adoption of generative AI has brought remarkable progress, it has also sparked significant concerns among users, ranging from claims of a 99.9% probability that the technology could lead to humanity’s downfall to fears of large-scale job displacement. Leading tech companies like OpenAI and Microsoft play pivotal roles in the AI field, with cutting-edge models that have significantly transformed daily life.

However, it’s still unclear whether the advantages of broad access to AI outweigh the downsides. While the technology is being integrated into workflows across medicine, education, computing, entertainment, and more, it raises equally critical concerns.

Elon Musk has suggested that we are on the brink of a groundbreaking technological leap with AI, but that there may not be sufficient electricity to power these advancements by 2025. Moreover, the amount of water generative AI consumes for cooling is concerningly high.

Last year, it emerged that both Microsoft Copilot (previously Bing Chat) and ChatGPT can use as much as one bottle of water for each query they process. A more recent study showed that OpenAI’s GPT-4 model may require up to three bottles of water to generate approximately 100 words.

Earlier research from the University of California, Riverside found that ChatGPT uses around 2 liters of water to produce 50 answers to questions. New findings suggest that the technology may actually use more water than initially estimated.

According to a study called “Making AI Less Thirsty,” set to be published in Communications of the ACM, OpenAI’s ChatGPT uses up to four times as much water for cooling as earlier estimates suggested (as reported by The Times). To be clear, the study is based on an earlier version of ChatGPT that ran on OpenAI’s GPT-3 model.
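
For a rough sense of scale, here is a back-of-the-envelope sketch combining the two figures mentioned above: the earlier estimate of roughly 2 liters of water per 50 answers and the new study’s “up to four times more” revision. It is an illustration based on this article’s numbers only, not a calculation from the study itself.

```python
# Back-of-the-envelope sketch: scale the earlier per-answer water estimate
# (~2 liters per 50 answers, cited above) by the roughly fourfold revision
# described in the new study. Illustrative only; not the study's own math.

earlier_liters = 2.0     # earlier estimate: ~2 liters of cooling water...
earlier_answers = 50     # ...to produce ~50 answers
revision_factor = 4      # "up to four times" more, per the new study

per_answer_old_ml = earlier_liters / earlier_answers * 1000   # ~40 ml
per_answer_new_ml = per_answer_old_ml * revision_factor       # ~160 ml

print(f"Earlier estimate: ~{per_answer_old_ml:.0f} ml of water per answer")
print(f"Revised estimate: ~{per_answer_new_ml:.0f} ml of water per answer")
```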

Since GPT-3’s introduction, OpenAI has rolled out newer, state-of-the-art models such as GPT-4 and GPT-4o, which add advanced capabilities like real-time audio, text, and visual processing. Because these models far surpass GPT-3, it’s reasonable to assume that the more sophisticated workloads they handle could require even more water for cooling than OpenAI’s earlier model.

Not a good look on the environmental and sustainability front

Leading tech companies such as OpenAI, Microsoft, and Google find themselves in a challenging predicament: advancing AI technology demands ever more energy and water, and meeting those growing needs is proving increasingly complex.

Beyond the sheer resources the technology requires, its development and deployment pose a significant obstacle to these companies’ sustainability and environmental goals. Notably, Microsoft aims to be carbon negative by 2030.

According to recent sustainability reporting, Microsoft accounted for 22.5% of the total water usage, followed by Google and Meta at approximately 17% each. However, it remains to be seen how these tech giants plan to address the escalating need for cooling water as AI technologies grow more complex.

OpenAI is reportedly spending around $700,000 per day to keep ChatGPT operational. While AI has become a crucial focus for many tech giants, it demands substantial resources and financial commitment, and investors are starting to voice concerns about the scale of these investments, especially since a profitable business model for the sector remains elusive.

More recently, the company behind ChatGPT secured a staggering $6.6 billion in funding from Microsoft, NVIDIA, Thrive Capital, and other prominent investors, pushing its valuation past the $157 billion mark. Despite anticipated losses of approximately $5 billion, the AI company has managed to stave off financial collapse.

As a tech enthusiast, I can’t help but be concerned about the terms attached to OpenAI’s latest funding round. There’s a crucial hurdle looming: the company must transition into a for-profit business within the next two years or risk having to return investors’ funds. That transformation might not be smooth, given potential obstacles from the board, staff, government, and regulatory bodies.
