“They’re just hiding the critical information” — Google’s latest AI efficiency claims spark backlash over hidden environmental costs

Artificial intelligence (AI) is changing how we work and think about tasks, but it is also driving up global resource consumption in a big way. Given Earth’s finite resources, that trend may not be sustainable in the long run.

By 2023, the energy and water consumption of major AI companies had become a topic of growing concern. One prominent study predicted that generative AI’s power demand could rival that of a country the size of the Netherlands by 2027.

Since then, a wealth of research has been published on the subject, with experts arguing from different perspectives and CEOs of prominent AI companies pushing back against the criticism. Given how rapidly AI has grown, it’s no surprise the conversation swings between hype and alarm.

Yes, the data centers that power AI models are thirsty and power-hungry, but to what extent?

A recently released research paper authored by several Google researchers sets out to quantify the environmental footprint of the company’s Gemini AI. The document, backed by numerous references, makes some bold claims.

According to the study, a typical (median) Gemini text prompt consumes about 0.24 watt-hours of energy, less than watching roughly nine seconds of TV. Generating a single text response also uses only about five drops, or 0.26 milliliters, of water.
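That TV comparison is easy to sanity-check. Here is a minimal back-of-the-envelope sketch in Python; the roughly 100 W television draw is my own assumption, not a figure stated in the paper:

```python
# Back-of-the-envelope check of Google's "nine seconds of TV" comparison.
# Assumption (mine, not the paper's): a modern TV draws roughly 100 W.
PROMPT_ENERGY_WH = 0.24   # energy per median Gemini text prompt, per the paper
TV_POWER_W = 100          # assumed television power draw

tv_seconds = PROMPT_ENERGY_WH / TV_POWER_W * 3600  # convert Wh into seconds of TV time
print(f"One prompt ~ {tv_seconds:.1f} seconds of TV")  # ~8.6 s, close to the ~9 s claimed
```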

The paper also claims that, over the past year, Google reduced the energy consumption of a Gemini text prompt by a factor of 33 and its carbon footprint by a factor of 44.

To its credit, the study covers a broad cross-section of the AI serving infrastructure: active accelerator use, host-system power draw during operation, energy consumed by idle machines, and the overhead incurred at the data-center level. Compared with other studies I’ve encountered, it takes a more holistic view.

That said, this is a study by Google about Google, and it has not yet been peer-reviewed. Outside experts have already questioned its transparency.

Is Google’s latest AI environmental impact study misleading?

A recent report from The Verge argues that Google’s assessment of Gemini’s environmental impact is incomplete, omitting details that would paint a less flattering picture.

Shaolei Ren, an associate professor of electrical and computer engineering at the University of California, Riverside, and an author of a paper cited in the Google report, finds Google’s claims questionable.

They’re just hiding the critical information. This really spreads the wrong message to the world.

Shaolei Ren (via The Verge)

Ren’s chief concern is that Google’s estimates leave out indirect water use. The study accounts for the water that cools the data center itself, but not the water consumed in generating the electricity that powers it.

Alex de Vries-Gao, a PhD candidate at Vrije Universiteit Amsterdam’s Institute for Environmental Studies and another researcher whose work is cited in the Google report, agrees: when it comes to water consumption, the estimates show only the “tip of the iceberg,” so to speak.

The critics also take issue with Google’s carbon-footprint figure, which is calculated using a “market-based” method. Rather than reporting the emissions of the grid actually supplying its data centers, this approach credits Google for its commitments to fund renewable-energy growth in the United States through a complex offsetting system.

As The Verge details, Ren and de Vries-Gao suggest Google could improve its emissions accounting by adopting a location-based approach instead, one that evaluates a data center’s impact on its local power grid rather than relying on the broader market-based view.
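To make the distinction concrete, here is a toy sketch of the two accounting methods. Every number in it is hypothetical, chosen only to illustrate how the same electricity use can yield very different emissions figures:

```python
# Toy illustration of location-based vs. market-based carbon accounting.
# All numbers are hypothetical; none come from Google's paper.
energy_mwh = 1_000               # electricity a data center consumes in some period
local_grid_kg_per_mwh = 400      # actual carbon intensity of the local grid
contracted_kg_per_mwh = 50       # residual intensity after renewable-energy contracts

location_based = energy_mwh * local_grid_kg_per_mwh  # what the local grid actually emitted
market_based = energy_mwh * contracted_kg_per_mwh    # what the company reports after credits
print(f"Location-based: {location_based:,} kg CO2")  # 400,000 kg CO2
print(f"Market-based:   {market_based:,} kg CO2")    #  50,000 kg CO2
```

The gap between those two numbers is exactly what the critics are pointing at: market-based accounting can make the same kilowatt-hours look far cleaner than the grid that actually delivered them.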

Ren also criticizes Google for being vague about how it arrived at its median text-prompt figure. As he puts it, “The overall figures that Google presents are, in fact, quite misleading.”

Google isn’t the only company with an AI consumption problem

The bottom line is that AI today isn’t eco-friendly. The data centers that power these models consume vast amounts of electricity and water at an alarming rate, and for now there are no viable solutions to get around that.

In 2024, a report found that Microsoft and Google were each using more electricity than over 100 individual countries, a total that includes AI and cloud services. It was later disclosed that Microsoft’s energy demands had grown by 168% because of artificial intelligence, complicating its goal of becoming carbon negative by 2030.

Microsoft has also signed a 20-year agreement to help revive the infamous Three Mile Island plant, site of a 1979 nuclear accident, to supply power to its data centers.

As for OpenAI, the company behind ChatGPT and Microsoft’s primary AI partner, its next-generation model, GPT-5, is reportedly remarkably power-hungry. Running the model for just one day is estimated to consume as much electricity as roughly 1.5 million US households use daily.

Research from the AI lab at the University of Rhode Island estimates that each GPT-5 query consumes around 18 watt-hours of energy. At an estimated 2.5 billion queries per day, that adds up to roughly 45 gigawatt-hours daily. For perspective, a modern nuclear reactor outputs between 1 and 1.6 gigawatts of power, so even running flat-out it produces at most about 38 gigawatt-hours in a day.
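The arithmetic is easy to verify. A minimal sketch using the figures reported above; the reactor comparison is my own back-of-the-envelope addition:

```python
# Sanity check of the reported GPT-5 energy figures.
WH_PER_QUERY = 18          # estimated energy per GPT-5 query (watt-hours)
QUERIES_PER_DAY = 2.5e9    # estimated daily query volume

daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9   # convert Wh to GWh
print(f"Daily demand: {daily_gwh:.0f} GWh")        # 45 GWh, matching the report

# A 1.6 GW reactor running continuously produces 1.6 GWh each hour,
# i.e. 38.4 GWh per day, so GPT-5 would need more than one such reactor.
reactor_gwh_per_day = 1.6 * 24
print(f"Reactors needed: {daily_gwh / reactor_gwh_per_day:.2f}")  # ~1.17
```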

The AI lab’s figures rest on several assumptions, including that GPT-5 can draw up to 40 Wh of energy for a medium-length output. Even allowing for that, its predecessor, GPT-4o, looks far more efficient: a 2025 estimate put the older model at just 0.3 Wh per response, roughly a sixtyfold difference against GPT-5’s 18 Wh average.

Are you concerned that AI data centers might lead to power outages and droughts? Or do you believe this concern is exaggerated? Feel free to share your thoughts in the comment section!
