For the climate-concerned, the rise of the AI-reliant internet query is a cause for alarm. Many people now turn to ChatGPT and other services for simple questions, and even a basic Google search includes an AI-generated result.
Depending on how you crunch the numbers, it’s possible to come up with a wide range of estimates for the energy usage and related climate cost of an AI query. ChatGPT’s maker, OpenAI, has estimated up to 0.34 watt-hours per prompt, equivalent to running a household lightbulb for about 20 seconds, while one set of researchers concluded that some models may use as much as 100 times more energy for longer prompts.
On Thursday, Google released its own data: the average search using Gemini, the company’s ubiquitous AI tool, uses 0.24 watt-hours. That’s equivalent to watching about nine seconds of TV. Such a search emits 0.03 grams of carbon dioxide equivalent. Perhaps more interesting is how quickly Google says Gemini text queries have become cleaner. Over the last year, energy consumption per query has decreased by around 97% while carbon emissions have decreased by 98% per query, the company said. A separate report Google released earlier in the summer showed a decoupling of data center energy consumption from the resulting emissions. (It’s worth noting, of course, that simple text queries are less intensive than other functions like image, audio, or video generation, and these figures don’t include the training of models, numbers that Google’s report leaves out given the challenge of calculating them accurately.)
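The reported figures can be sanity-checked with some back-of-envelope arithmetic. The TV and grid numbers derived below are assumptions for illustration, not figures from Google's report:

```python
# Back-of-envelope check of the per-query figures cited above.
# Assumption (not from Google's report): a typical TV draws ~100 W.

GEMINI_WH_PER_QUERY = 0.24      # Google's reported average, in watt-hours
GEMINI_G_CO2E_PER_QUERY = 0.03  # Google's reported emissions, in grams CO2e

TV_WATTS = 100  # assumed TV power draw

# Seconds of TV viewing that consume the same energy as one query
tv_seconds = GEMINI_WH_PER_QUERY / TV_WATTS * 3600
print(f"{tv_seconds:.1f} seconds of TV")  # ~8.6 s, close to "nine seconds"

# Carbon intensity implied by dividing emissions by energy
g_per_kwh = GEMINI_G_CO2E_PER_QUERY / (GEMINI_WH_PER_QUERY / 1000)
print(f"{g_per_kwh:.0f} g CO2e per kWh")
```

Under the 100 W assumption the TV comparison comes out at roughly 8.6 seconds, consistent with Google's "about nine seconds" framing.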
Read more: Some AI Prompts Can Cause 50 Times More CO2 Emissions Than Others
Whether such a downward trajectory can continue is a crucial question for anyone watching the future of energy and climate in the U.S.—with implications not just for the future of U.S. emissions but also for the hundreds of billions of dollars in power sector investments. Across a variety of related industries, leaders will need to try to thread the needle: addressing the growing demand for AI while avoiding overbuilding infrastructure as AI models grow more efficient.
Google’s progress boils down to two levers: cleaner power and more efficient chips and query crunching.
The clean energy strategy is impressive but fairly straightforward. The company buys a lot of renewable energy to power its operations, signing contracts for 8 GW of clean power last year alone. That’s equivalent to the capacity of roughly 2,400 utility-scale wind turbines, according to Department of Energy figures. Going forward, the company is also investing to help bring emerging clean technologies like nuclear fusion online.
But then there are the company’s efficiency measures. In energy circles, efficiency tends to mean simply using less energy and making energy hardware run more productively: think of better climate control or insulation. While Google has done some of that, its most impressive efficiency gains have come through the AI ecosystem rather than the energy system. The company designed its own chips, called tensor processing units, or TPUs, as an alternative to the broadly used GPUs. Those chips have become some 30 times more energy efficient since 2018, according to Google’s sustainability report. The company has also improved the efficiency of its models using techniques that process queries differently, reducing the compute power required. And a few weeks ago the company announced a program to shift data center demand to times when the electricity grid is less stressed.
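The demand-shifting idea can be sketched in a few lines. This is a hypothetical illustration of carbon-aware scheduling, not Google's actual system; the hour labels and intensity values are invented:

```python
# Hypothetical sketch of the load-shifting program described above:
# flexible (non-urgent) compute jobs are deferred to the hours when the
# grid is cleanest. Intensity values below are invented for illustration.

hourly_intensity = {            # grams CO2e per kWh, by hour of day
    "02:00": 180, "08:00": 320, "13:00": 210, "19:00": 410,
}

def pick_hours(flexible_jobs: int) -> list[str]:
    """Return the lowest-intensity hours for the given number of jobs."""
    by_cleanliness = sorted(hourly_intensity, key=hourly_intensity.get)
    return by_cleanliness[:flexible_jobs]

print(pick_hours(2))  # the two cleanest hours: ['02:00', '13:00']
```

In practice such a scheduler would use grid-operator forecasts rather than a fixed table, but the principle is the same: move work that can wait to hours when electricity is cleaner and less scarce.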
Read more: AI Could Reshape Everything We Know About Climate Change
The question—not just for Google but for any company deeply invested in AI—is whether those programs and the resulting efficiency gains can continue. Deepened efficiency gains would be a huge climate win—so long as the increase in usage doesn’t outpace the increase in efficiency.
Greater efficiency would also have significant implications across the energy sector. Right now, power companies are betting big on new sources of electricity generation on the assumption that AI will keep driving demand growth. But exactly how fast demand will grow is very hard to predict, and prospective efficiency gains are a big reason why. Google’s results should at least give anyone making those bets pause.