The Climate Cost of AI Queries
The growing use of AI-powered internet queries raises new concerns about energy consumption and climate impact. Many users now rely on ChatGPT and similar services for simple questions, and even basic Google searches often include AI-generated results. Estimates of the energy used per AI query vary widely. OpenAI, ChatGPT's maker, reports up to 0.34 watt-hours per prompt, roughly the same as a household lightbulb running for 20 seconds. Some researchers suggest that longer or more complex prompts could demand 100 times that energy.
Recently, Google shared its data for searches powered by Gemini, its AI tool. The average Gemini search consumes about 0.24 watt-hours, equivalent to watching TV for nine seconds, and produces approximately 0.03 grams of CO2 equivalent emissions. Notably, Google highlighted significant efficiency gains: energy use per query dropped by 97% and carbon emissions by 98% over the past year. This progress, however, applies mainly to text queries and excludes more resource-intensive tasks like image or audio generation and model training, which are harder to quantify.
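The appliance comparisons above can be sanity-checked with simple unit conversion. This sketch assumes typical wattages of 60 W for a household bulb and 100 W for a TV; those wattages are illustrative assumptions, not figures from either company.

```python
def seconds_of_appliance(energy_wh: float, appliance_watts: float) -> float:
    """How many seconds an appliance of the given wattage runs on `energy_wh` watt-hours."""
    return energy_wh * 3600 / appliance_watts

chatgpt_wh = 0.34  # OpenAI's reported upper estimate per prompt
gemini_wh = 0.24   # Google's reported average per Gemini query

# Assumed wattages: 60 W bulb, 100 W TV (typical values, not from the article)
print(f"0.34 Wh runs a 60 W bulb for {seconds_of_appliance(chatgpt_wh, 60):.0f} s")
print(f"0.24 Wh runs a 100 W TV for {seconds_of_appliance(gemini_wh, 100):.0f} s")
```

Under those assumed wattages, the arithmetic lands close to the quoted figures of 20 seconds and nine seconds.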
The Future of Energy and AI
The key question is whether this downward trend in energy use can continue as AI demand grows. This has major implications for future U.S. emissions and for power sector investments. Industry leaders face a challenge: meeting the rising demand for AI while avoiding unnecessary infrastructure expansion as AI models become more efficient.
Google’s gains rely on two main strategies: using cleaner power and improving efficiency in chips and query processing. The clean energy approach involves large purchases of renewable energy—last year alone, Google contracted for 8 gigawatts of clean power, roughly equivalent to 2,400 utility-scale wind turbines. The company is also investing in emerging clean technologies like nuclear fusion.
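The turbine comparison is easy to verify. A typical utility-scale wind turbine has a nameplate capacity of roughly 3 to 4 megawatts; the 3.3 MW value below is an assumption for illustration, not a figure from Google.

```python
# Rough check of the wind-turbine equivalence for 8 GW of contracted clean power.
contracted_gw = 8.0
turbine_mw = 3.3  # assumed typical utility-scale turbine capacity

turbines = contracted_gw * 1000 / turbine_mw  # GW -> MW, then divide by per-turbine MW
print(f"~{turbines:.0f} turbines")  # on the order of 2,400
```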
Efficiency, meanwhile, extends beyond typical energy-saving measures. Google designs its own chips, called Tensor Processing Units (TPUs), which have become about 30 times more efficient since 2018. The company also optimizes its AI models to reduce the computation each query requires. Additionally, Google recently started shifting data center loads to times when the electricity grid is less stressed, helping balance supply and demand.
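A 30-fold gain compounds to a striking annual rate. This sketch assumes a seven-year window (2018 to 2025); the article gives only the endpoints, so the window is an assumption.

```python
# Implied compound annual improvement if TPU efficiency rose 30x over 7 years.
total_gain = 30.0
years = 7  # assumed window: 2018-2025

annual = total_gain ** (1 / years) - 1  # geometric mean growth rate per year
print(f"~{annual:.0%} efficiency gain per year")
```

That works out to roughly a 60 percent improvement per year, sustained for most of a decade.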
The Implications of Efficiency Gains
The pressing issue for Google and others in AI is whether these efficiency improvements can keep pace with growing usage. Sustained efficiency gains would benefit the climate, provided total usage does not grow faster than efficiency improves. For the energy sector, this uncertainty complicates planning. Power companies are investing heavily in new generation capacity in anticipation of AI-driven demand growth, but efficiency gains could temper that demand.
Google's experience suggests that predicting energy demand from AI is complex. Stakeholders should consider both the potential for continued efficiency improvements and the uncertainties in future usage patterns.