
Google Reveals AI’s Energy Cost Per Prompt for the First Time

Summary

– Google has published a detailed study revealing the energy usage of AI operations, providing previously undisclosed data that researchers have sought.
– The analysis includes not only AI-specific hardware energy but also all supporting infrastructure, offering a comprehensive view of total energy demand.
– Google’s custom AI chips account for only 58% of the total 0.24 watt-hours per prompt, with the rest consumed by CPUs, memory, backup systems, and data center overhead.
– Experts highlight that such precise energy estimates are only possible from industry sources due to their scale and access to internal operational data.
– The reported energy figure is a median value and does not represent all queries, as Google handles a wide variety of requests with varying energy demands.

For the first time, Google has provided a detailed look at the energy consumption of artificial intelligence systems per prompt, offering unprecedented transparency into the environmental footprint of AI operations. This disclosure marks a significant step forward in understanding the true energy costs associated with running large-scale AI models, a topic that has long been shrouded in corporate secrecy.

Earlier this year, major AI firms remained tight-lipped about their energy usage metrics, leaving researchers and analysts to rely on estimates. Google’s newly published data changes that, revealing not only the power drawn by the specialized chips that execute AI tasks but also the substantial energy demands of the surrounding infrastructure. According to Jeff Dean, Google’s chief scientist, the company aimed for a thorough assessment, capturing every component involved in AI processing.

The findings are revealing: custom TPUs—Google’s proprietary AI accelerators—account for just 58% of the total energy consumed, which averages 0.24 watt-hours per prompt. Supporting hardware plays a surprisingly large role, with the host CPU and memory contributing another 25%. Backup systems, kept idle in case of failure, use 10%, while data center overheads like cooling and power conversion make up the remaining 8%.
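As a back-of-the-envelope sketch, the per-component figures can be derived from the reported median and the percentages quoted above (note that the quoted shares sum to 101%, presumably due to rounding in the source):

```python
# Rough breakdown of Google's reported median energy per prompt,
# using the total and percentage shares quoted in the article.
TOTAL_WH_PER_PROMPT = 0.24  # median watt-hours per prompt

shares = {
    "TPUs (custom AI accelerators)": 0.58,
    "Host CPU and memory": 0.25,
    "Idle backup systems": 0.10,
    "Data center overhead (cooling, power conversion)": 0.08,
}

for component, share in shares.items():
    wh = TOTAL_WH_PER_PROMPT * share
    print(f"{component}: {wh:.3f} Wh ({share:.0%})")

# The shares sum to 101%, a rounding artifact in the published figures.
print(f"Sum of shares: {sum(shares.values()):.0%}")
```

At roughly 0.14 Wh for the TPUs themselves, the supporting infrastructure accounts for nearly half of the total, which is the study's central point.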

Experts in the field have welcomed the report as a critical contribution. Mosharaf Chowdhury, a professor at the University of Michigan involved in tracking AI energy use, emphasized the value of industry-led data, noting that companies operate at scales beyond academic reach. Jae-Won Chung, a PhD candidate coordinating the same effort, described the analysis as a “keystone piece” that sets a new benchmark for comprehensiveness.

It’s important to note that Google’s energy estimate represents a median value and may not reflect every type of query handled by its Gemini model. The company processes an enormous variety of requests, each with differing computational demands, meaning actual energy use can vary significantly across tasks. Still, this disclosure provides a foundational reference point for future research and policy discussions around sustainable AI development.

(Source: Technology Review)
