Send Link mailing list submissions to
[email protected]
To subscribe or unsubscribe via the World Wide Web, visit
https://mailman.anu.edu.au/mailman/listinfo/link
or, via email, send a message with subject or body 'help' to
[email protected]
You can reach the person managing the list at
[email protected]
When replying, please edit your Subject line so it is more specific
than "Re: Contents of Link digest..."
Today's Topics:
1. In a first, Google has released data on how much energy an AI
prompt uses (Stephen Loosley)
2. Re: The warning signs the AI bubble is about to burst
(Tom Worthington)
----------------------------------------------------------------------
Message: 1
Date: Mon, 25 Aug 2025 01:04:01 +0930
From: Stephen Loosley <[email protected]>
To: "link" <[email protected]>
Subject: [LINK] In a first, Google has released data on how much
energy an AI prompt uses
Message-ID: <[email protected]>
Content-Type: text/plain; charset="UTF-8"
In a first, Google has released data on how much energy an AI prompt uses
It's the most transparent estimate yet from one of the big AI companies, and a
long-awaited peek behind the curtain for researchers.
By Casey Crownhart August 21, 2025
https://www.technologyreview.com/2025/08/21/1122288/google-gemini-ai-energy/
MIT Technology Review | Google
Google has just released a technical report detailing how much energy its
Gemini apps use for each query.
In total, the median prompt (one that falls in the middle of the range of energy
demand) consumes 0.24 watt-hours of electricity, the equivalent of running a
standard microwave for about one second. The company also provided average
estimates for the water consumption and carbon emissions associated with a text
prompt to Gemini.
It's the most transparent estimate yet from a Big Tech company with a popular
AI product, and the report includes detailed information about how the company
calculated its final estimate.
As AI has become more widely adopted, there's been a growing effort to
understand its energy use. But public efforts attempting to directly measure
the energy used by AI have been hampered by a lack of full access to the
operations of a major tech company.
Earlier this year, MIT Technology Review published a comprehensive series on AI
and energy, at which time none of the major AI companies would reveal their
per-prompt energy usage.
Google's new publication, at last, allows for a peek behind the curtain that
researchers and analysts have long hoped for.
The study focuses on a broad look at energy demand, including not only the
power used by the AI chips that run models but also by all the other
infrastructure needed to support that hardware.
"We wanted to be quite comprehensive in all the things we included," said Jeff
Dean, Google's chief scientist, in an exclusive interview with MIT Technology
Review about the new report.
That's significant, because in this measurement, the AI chips (in this case,
Google's custom TPUs, the company's proprietary equivalent of GPUs) account for
just 58% of the total electricity demand of 0.24 watt-hours.
Another large portion of the energy is used by equipment needed to support
AI-specific hardware: The host machine's CPU and memory account for another 25%
of the total energy used. There's also backup equipment needed in case
something fails; these idle machines account for 10% of the total. The final 8%
is from overhead associated with running a data center, including cooling and
power conversion.
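As a rough sanity check, the reported percentages can be applied back to the 0.24 watt-hour total. This sketch uses only the figures quoted above; the component labels are paraphrases, not Google's terminology, and the published percentages are rounded, so they sum to 101%:

```python
# Back-of-the-envelope breakdown of the per-prompt energy figure reported
# by Google. All numbers come from the article; labels are illustrative.
TOTAL_WH = 0.24  # median Gemini text prompt, in watt-hours

shares = {
    "TPUs (AI accelerators)": 0.58,
    "Host CPU and memory": 0.25,
    "Idle backup machines": 0.10,
    "Overhead (cooling, power conversion)": 0.08,
}

for component, share in shares.items():
    print(f"{component}: {TOTAL_WH * share:.3f} Wh")

# The rounded percentages slightly overshoot: they sum to 101%.
print(f"Sum of shares: {sum(shares.values()):.2f}")
```

The TPUs alone come out to about 0.14 Wh, with the remaining ~0.10 Wh spent on the surrounding infrastructure.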
This sort of report shows the value of industry input to energy and AI
research, says Mosharaf Chowdhury, a professor at the University of Michigan
and one of the heads of the ML.Energy leaderboard, which tracks energy
consumption of AI models.
Estimates like Google's are generally something that only companies can
produce, because they run at a larger scale than researchers are able to and
have access to behind-the-scenes information.
"I think this will be a keystone piece in the AI energy field," says Jae-Won
Chung, a PhD candidate at the University of Michigan and another leader of the
ML.Energy effort. "It's the most comprehensive analysis so far."
Google's figure, however, is not representative of all queries submitted to
Gemini: The company handles a huge variety of requests, and this estimate is
calculated from a median energy demand, one that falls in the middle of the
range of possible queries.
So some Gemini prompts use much more energy than this: Dean gives the example
of feeding dozens of books into Gemini and asking it to produce a detailed
synopsis of their content. "That's the kind of thing that will probably take
more energy than the median prompt," Dean says. Using a reasoning model could
also have a higher associated energy demand because these models take more
steps before producing an answer.
This report was also strictly limited to text prompts, so it doesn't represent
what's needed to generate an image or a video. (Other analyses, including one
in MIT Technology Review's Power Hungry series earlier this year, show that
these tasks can require much more energy.)
The report also finds that the total energy used to field a Gemini query has
fallen dramatically over time. The median Gemini prompt used 33 times more
energy in May 2024 than it did in May 2025, according to Google. The company
points to advancements in its models and other software optimizations for the
improvements.
Google also estimates the greenhouse gas emissions associated with the median
prompt, which it puts at 0.03 grams of carbon dioxide. To get to this number,
the company multiplied the total energy used to respond to a prompt by the
average emissions per unit of electricity.
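Running that multiplication in reverse recovers the emissions intensity Google must be using. This is a derived sketch from the two published per-prompt figures; the g CO2 per kWh number is my calculation, not a figure stated in the report:

```python
# Recover the implied market-based emissions intensity from Google's two
# published per-prompt figures. The intensity is derived here, not stated
# in the report.
ENERGY_WH = 0.24    # median prompt energy (article figure)
EMISSIONS_G = 0.03  # median prompt CO2 emissions in grams (article figure)

intensity_g_per_kwh = EMISSIONS_G / ENERGY_WH * 1000  # convert Wh -> kWh
print(f"Implied intensity: {intensity_g_per_kwh:.0f} g CO2 per kWh")
```

That works out to 125 g CO2 per kWh, consistent with the article's statement that Google's market-based emissions per unit of electricity are roughly one-third of those on the average grid where it operates.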
Rather than using an emissions estimate based on the US grid average, or the
average of the grids where Google operates, the company instead uses a
market-based estimate, which takes into account electricity purchases that the
company makes from clean energy projects. The company has signed agreements to
buy over 22 gigawatts of power from sources including solar, wind, geothermal,
and advanced nuclear projects since 2010. Because of those purchases, Google's
emissions per unit of electricity on paper are roughly one-third of those on
the average grid where it operates.
AI data centers also consume water for cooling, and Google estimates that each
prompt consumes 0.26 milliliters of water, or about five drops.
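The "about five drops" comparison checks out under the common rule of thumb of roughly 0.05 mL per drop. The drop volume here is an assumption of mine, not a figure from Google's report:

```python
# Check the "about five drops" comparison. The 0.05 mL drop volume is a
# common rule-of-thumb assumption, not a figure from Google's report.
WATER_ML_PER_PROMPT = 0.26  # from the report
DROP_ML = 0.05              # assumed volume of one drop

print(f"{WATER_ML_PER_PROMPT / DROP_ML:.1f} drops per prompt")
```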
The goal of this work was to provide users a window into the energy use of
their interactions with AI, Dean says.
"People are using [AI tools] for all kinds of things, and they shouldn't have
major concerns about the energy usage or the water usage of Gemini models,
because in our actual measurements, what we were able to show was that it's
actually equivalent to things you do without even thinking about it on a daily
basis," he says, "like watching a few seconds of TV or consuming five drops of
water."
The publication greatly expands what's known about AI's resource usage. It
follows recent increasing pressure on companies to release more information
about the energy toll of the technology. "I'm really happy that they put this
out," says Sasha Luccioni, an AI and climate researcher at Hugging Face.
"People want to know what the cost is."
This estimate and the supporting report contain more public information than
has been available before, and it's helpful to get more information about AI
use in real life, at scale, by a major company, Luccioni adds. However, there
are still details that the company isn't sharing in this report. One major
question mark is the total number of queries that Gemini gets each day, which
would allow estimates of the AI tool?s total energy demand.
And ultimately, it's still the company deciding what details to share, and when
and how. "We've been trying to push for a standardized AI energy score,"
Luccioni says, a standard for AI similar to the Energy Star rating for
appliances. "This is not a replacement or proxy for standardized comparisons."
------------------------------
Message: 2
Date: Mon, 25 Aug 2025 09:10:57 +1000
From: Tom Worthington <[email protected]>
To: [email protected]
Subject: Re: [LINK] The warning signs the AI bubble is about to burst
Message-ID: <[email protected]>
Content-Type: text/plain; charset="utf-8"; Format="flowed"
On 8/21/25 22:41, Stephen Loosley wrote:
> The warning signs the AI bubble is about to burst
I thought the AI bubble had another six months to go. What does ChatGPT
say? ;-)
But it is heartening to see governments and investors again donating money
to speculative tech projects. The investment will not be wasted.
Just as bitcoin mining hardware was put to use doing AI, the AI hardware
can be used for whatever comes next.
--
Tom Worthington http://www.tomw.net.au
-------------- next part --------------
A non-text attachment was scrubbed...
Name: OpenPGP_signature.asc
Type: application/pgp-signature
Size: 665 bytes
Desc: OpenPGP digital signature
URL:
<https://mailman.anu.edu.au/pipermail/link/attachments/20250825/5413bc46/attachment-0001.sig>
------------------------------
Subject: Digest Footer
_______________________________________________
Link mailing list
[email protected]
https://mailman.anu.edu.au/mailman/listinfo/link
------------------------------
End of Link Digest, Vol 393, Issue 21
*************************************