Is saying “please and thank you” to ChatGPT worth it? — CEO jokes it spends “tens of millions of dollars” on polite prompts

Leading AI labs such as OpenAI, Anthropic, and Google are spending heavily to push the field forward. According to a 2023 report, OpenAI spends an estimated $700,000 per day just to keep ChatGPT running.

Recently, OpenAI CEO Sam Altman playfully revealed how much electricity ChatGPT burns when people politely say "please" and "thank you" while interacting with the company's AI models.

X user Tomieinlove posed a playful question, wondering how much OpenAI could save on its electricity bills if people said "please" and "thank you" to its models less often.

Altman replied in kind, hinting that handling those extra pleasantries costs OpenAI tens of millions of dollars in additional compute.

"tens of millions of dollars well spent – you never know," Altman posted on X on April 16, 2025.

Despite the substantial cost of keeping ChatGPT running and fielding these extra polite words, the executive appears unfazed. Indeed, Altman considers it money well spent.

A recent report found that Microsoft and Google each use more electricity than over one hundred countries. However, a new study from Epoch AI has pushed back on such figures, calling them rough "back-of-the-envelope calculations" and suggesting that ChatGPT may not consume as much energy as previously believed.

Epoch AI noted that the commonly cited figures rest on assumptions about the hardware involved, including older graphics processing units (GPUs). By Epoch's estimate, ChatGPT running GPT-4o needs only about 0.3 watt-hours to produce a response to a given question.
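To show how a per-query figure that small scales up, here is a minimal back-of-the-envelope sketch. Only the 0.3 watt-hour figure comes from Epoch AI's estimate; the daily query volume and electricity price below are assumptions introduced purely for illustration.

```python
# Back-of-the-envelope sketch: scaling Epoch AI's 0.3 Wh/query estimate.
# QUERIES_PER_DAY and USD_PER_KWH are hypothetical values, not from the article.

WH_PER_QUERY = 0.3          # Epoch AI's estimate for one GPT-4o response
QUERIES_PER_DAY = 1e9       # assumed: 1 billion queries per day (hypothetical)
USD_PER_KWH = 0.10          # assumed: $0.10 per kilowatt-hour (hypothetical)

kwh_per_day = WH_PER_QUERY * QUERIES_PER_DAY / 1000   # convert Wh to kWh
cost_per_day = kwh_per_day * USD_PER_KWH

print(f"Energy: {kwh_per_day:,.0f} kWh per day")          # ~300,000 kWh/day
print(f"Electricity cost: ${cost_per_day:,.0f} per day")  # ~$30,000/day
```

Under those assumed inputs, the electricity for answering queries works out to roughly $30,000 a day, which is why the "tens of millions of dollars" quip reads as a joke about scale rather than a line item.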

Advancing AI takes more than money: AI labs also rely on data centers for their immense computational power. Those data centers, in turn, need cooling systems to keep their servers from overheating, since excess heat can cause malfunctions and service interruptions.

A separate 2023 finding indicated that Microsoft Copilot and ChatGPT each consume the equivalent of a bottle of water in cooling for every user query they answer, and that demand for cooling water appears to be growing as the models advance.

For example, researchers have found that OpenAI's GPT-3 model uses four times as much water as previously believed, and that generating as few as 100 words with GPT-4 can require up to three full bottles of water.
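For a sense of what "three bottles per 100 words" implies per word, here is a quick sketch; the bottle size is an assumption added for illustration, not a figure from the article.

```python
# Back-of-the-envelope sketch: per-word water use implied by the
# "up to three bottles per 100 words" figure cited for GPT-4.
# LITERS_PER_BOTTLE is a hypothetical assumption (standard 500 ml bottle).

LITERS_PER_BOTTLE = 0.5       # assumed bottle size, not from the article
BOTTLES_PER_100_WORDS = 3     # upper-bound figure cited in the article

liters_per_word = BOTTLES_PER_100_WORDS * LITERS_PER_BOTTLE / 100
print(f"Up to {liters_per_word * 1000:.0f} ml of water per generated word")  # ~15 ml/word
```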
