
Google Claims Gemini AI Uses Just ‘Five Drops of Water’ Per Prompt, Sparks Debate


  • by Shan 2025-08-22 12:34:27

Why AI’s Environmental Footprint Matters

Artificial intelligence has wormed its way into daily life, from quick question answering to the more complex world of business decision making. Behind every chatbot reply is a real cost in electricity, water, and carbon emissions.

When models like Gemini, GPT, and Claude are scaled globally, they rely on enormous server fleets that require electricity to run and water for cooling. That means real-world environmental costs in energy demand and water consumption.

This is why Google’s recent report on the environmental costs of Gemini AI has created a buzz. The company states that its systems are much more efficient than previously thought. However, many researchers claim the numbers are framed in a way that downplays AI’s true impact.



Google’s Key Claims: How “Green” Is Gemini?

Google’s new study puts forward three headline numbers about a single Gemini text prompt:

  • Water use: ~5 drops (0.26 millilitres)

  • Energy use: 0.24 watt-hours (Wh) — about the same as watching TV for less than 9 seconds

  • Carbon emissions: ~0.03 grams of CO₂

On the surface, these numbers look tiny. To make them relatable, Google compared them to everyday activities:

Gemini AI Prompt | Equivalent Real-World Activity
0.24 Wh energy | Less than 9 seconds of TV time
0.03 g CO₂ | The carbon from charging a phone for 1 second
5 drops of water | Around 1 sip in every 100 prompts

Ben Gomes, Google’s Chief Technologist of Learning and Sustainability, said that efficiency has improved drastically:

  • 33x lower energy consumption per prompt compared to earlier benchmarks.

  • 44x lower carbon footprint compared to past models.

Google frames this as proof that AI can scale without overwhelming environmental costs.


The Skeptic’s View: Why Experts Disagree

Not everyone is convinced. Shaolei Ren, a researcher whose 2023 paper on AI’s water and energy use is widely cited, challenged Google’s findings.

His main counterpoints:

  1. Selective Comparisons

    • Google compared its 2025 onsite-only water usage to Ren’s highest total water number from 2023 across 18 locations.

    • This makes Gemini look far more efficient than GPT-3, even though in 8 out of 18 locations GPT-3 used less water than Gemini.

  2. Omission of Indirect Water Use

    • Google’s estimates only include onsite cooling water.

    • Ren’s paper included both onsite cooling water and offsite water used in electricity generation.

    • By excluding offsite water, Google presents a “cleaner” picture of Gemini’s footprint (a rough illustration of the difference follows below).

  3. Global vs Regional Differences

    • Google gives a global average.

    • But in practice, water and energy costs vary heavily depending on data center location.

    • In water-stressed regions (like Arizona or parts of India), even “five drops per prompt” can add up to billions of litres annually.

Ren summed it up bluntly on LinkedIn: Google’s framing risks being “misleading” because it leaves out context that matters.
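
To see concretely why the onsite-versus-total distinction in Ren’s second point matters, here is a rough back-of-the-envelope sketch in Python. The only figures taken from Google’s report are the per-prompt energy (0.24 Wh) and onsite water (0.26 mL); the water intensity of electricity generation is a placeholder assumption, since real values vary widely by grid mix and region.

```python
# Rough sketch of why onsite-only and total water accounting diverge.
# Only the per-prompt energy and onsite water figures come from Google's
# report; everything else is an illustrative assumption.

PROMPT_ENERGY_KWH = 0.24 / 1000   # Google's figure: 0.24 Wh per prompt
ONSITE_WATER_ML = 0.26            # Google's figure: ~5 drops of cooling water

# Assumed water intensity of electricity generation (litres per kWh).
# Real values vary widely by grid and region; 2.0 is a placeholder only.
ASSUMED_WATER_L_PER_KWH = 2.0

offsite_water_ml = PROMPT_ENERGY_KWH * ASSUMED_WATER_L_PER_KWH * 1000  # L -> mL
total_water_ml = ONSITE_WATER_ML + offsite_water_ml

print(f"Onsite cooling water per prompt: {ONSITE_WATER_ML:.2f} mL")
print(f"Offsite (electricity-linked) water per prompt: {offsite_water_ml:.2f} mL")
print(f"Total per prompt: {total_water_ml:.2f} mL "
      f"({total_water_ml / ONSITE_WATER_ML:.1f}x the onsite-only figure)")
```

Under that placeholder assumption, the total per-prompt water figure comes out roughly three times the onsite-only number, which is the kind of gap Ren is pointing to.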


What Do These Numbers Really Mean?

At first glance, five drops of water or nine seconds of TV seem harmless. But scale changes everything.

Let’s do the math:

  • 1 million prompts =

    • ~260 litres of water (enough for several showers)

    • ~240 kWh of energy (similar to what an average household uses in 7–8 days)

    • ~30 kilograms of CO₂ emissions

Now think about billions of queries daily across Google products. Even small numbers per prompt can grow into meaningful global impact.
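
As a quick sanity check on the arithmetic above, here is a minimal sketch that scales Google’s published per-prompt figures to one million prompts and to a hypothetical volume of one billion prompts a day; the daily volume is an assumed round number for illustration, not a published statistic.

```python
# Minimal sketch: scale Google's per-prompt figures to larger volumes.
# Per-prompt inputs are from Google's report; the daily prompt volume
# below is an assumed round number, not a published statistic.

WATER_ML_PER_PROMPT = 0.26   # ~5 drops
ENERGY_WH_PER_PROMPT = 0.24
CO2_G_PER_PROMPT = 0.03

def footprint(prompts: int) -> dict:
    """Return water (litres), energy (kWh) and CO2 (kg) for a prompt count."""
    return {
        "water_litres": prompts * WATER_ML_PER_PROMPT / 1000,
        "energy_kwh": prompts * ENERGY_WH_PER_PROMPT / 1000,
        "co2_kg": prompts * CO2_G_PER_PROMPT / 1000,
    }

million = footprint(1_000_000)
print(f"1 million prompts: {million['water_litres']:.0f} L of water, "
      f"{million['energy_kwh']:.0f} kWh, {million['co2_kg']:.0f} kg CO2")

# Hypothetical 1 billion prompts per day, accumulated over a year:
yearly = footprint(1_000_000_000 * 365)
print(f"Hypothetical yearly water use: "
      f"{yearly['water_litres'] / 1e6:.0f} million litres")
```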

This is where the debate sharpens: is Gemini really sustainable, or are the metrics designed to make the impact seem trivial?

The Bigger Picture: AI’s Real Environmental Cost

The Gemini debate isn’t happening in isolation. Researchers and industry observers have been studying AI’s environmental cost for years. Some key insights:

  • Data centers already consume ~1–1.5% of global electricity (IEA estimate, 2023).

  • AI training runs can consume millions of kWh and millions of litres of water in weeks. For example:

    • Training GPT-3 was estimated to use 700,000 litres of clean water.

    • Google’s PaLM model reportedly used water equivalent to a small town’s daily consumption.

  • Water is a local issue. Using large amounts of water in drought-prone areas (Arizona, Chile, India) can directly affect communities.

Transparency remains a challenge. Companies often highlight the lowest or most flattering figures. Independent research frequently shows higher totals once indirect usage (like electricity-linked water) is counted.

Why This Debate Matters

This isn’t just an academic argument. There are real consequences for policy, business, and society.

1. Policy & Regulation

  • Governments are beginning to ask tech companies to disclose total environmental costs, not just onsite averages.

  • The EU’s AI Act and climate-related disclosures could push for stricter reporting standards.

2. Water Scarcity & Ethics

  • Data centers in water-stressed regions amplify ethical concerns.

  • For example, if a data center in Arizona uses billions of litres yearly, “five drops per prompt” loses meaning when the community faces water shortages.

3. Investor & Consumer Awareness

  • Sustainability is becoming a key metric for investors.

  • Consumers increasingly care about whether AI growth aligns with climate goals.

  • Transparent reporting builds trust, while selective framing risks backlash.

What Can Be Done? Toward “Green AI”

So, what are practical steps the industry (and users) can take?

For AI Companies:

  • Report both onsite and total water use (including electricity generation).

  • Break down data by region to reflect local stress factors.

  • Invest in renewable-powered data centers.

  • Reuse wastewater for cooling.

For Policymakers:

  • Mandate standardized sustainability disclosures.

  • Incentivize data centers in regions with renewable energy and abundant water.

For Businesses & Users:

  • Consider carbon offsets when adopting AI at scale.

  • Use AI efficiently — optimize queries instead of running thousands unnecessarily.

  • Pressure providers to disclose clear, transparent environmental metrics.

Key Takeaways

  • According to Google: Each Gemini AI text prompt uses five drops of water, less energy than a TV on for 9 seconds, and emits 0.03 g of CO₂.

  • Potential for efficiency: 33x less energy per prompt and a 44x smaller carbon footprint compared with earlier model baselines.

  • The problem: Researcher Shaolei Ren argues the report, while impressive, is misleading because it omits indirect water usage and relies on mismatched comparisons.

  • The important point: Even small per-prompt numbers add up to significant global totals of water, energy, and carbon, as with any global industry.

  • Going forward: Transparency, standardized reporting, and investment in renewable-powered, water-efficient infrastructure will be important in delivering “green AI.”

Final Word

Google’s Gemini report shows just how far AI efficiency has come, and how far the industry still needs to go on transparency and accountability. Whether five drops of water is the full truth or merely part of the story, the debate makes at least one thing clear: AI’s environmental footprint can no longer be ignored.

The future of "green AI" will rely not just on technical advances, but also on whether companies, regulators, and communities demand full lifecycle accounting of AI's hidden costs.


