Google’s AI Chief: AI Not Major Culprit for Data Center Energy Use

At Fortune’s Brainstorm Tech conference, Google’s Chief Scientist Jeff Dean offered insights into AI’s contribution to data center energy use, dismissing concerns that AI is a major driver of rising emissions.

AI’s Energy Consumption in Context

Dean, who oversees Google DeepMind and Google Research, noted that AI-related energy usage is still relatively small compared to the overall consumption of data centers. He stressed that several factors contribute to overall energy growth, not just AI. Dean reiterated Google’s pledge to run entirely on clean energy by 2030. Although achieving this target involves complex, long-term projects, he expressed optimism about their potential to significantly lower carbon footprints.

Boosting Efficiency and Partnerships

Improving operational efficiency is a key strategy for Google, according to Dean. The company is enhancing the efficiency of its data centers and collaborating with renewable energy providers to meet its sustainability commitments.

Dean also discussed Project Astra, a DeepMind initiative aimed at creating a “universal AI agent” that can interact with users’ surroundings. The system is scheduled for testing later this year. He pointed to the need for breakthroughs in AI algorithms to improve factual accuracy and reasoning: beyond scaling up data and computing resources, algorithmic innovation is crucial to making AI more robust and dependable.

Energy Consumption Forecasts for Data Centers

Projections by the Electric Power Research Institute suggest that data centers could double their electricity consumption, potentially accounting for up to 9 percent of U.S. power usage by 2030. AI applications currently account for roughly 10 to 20 percent of that consumption, a share expected to grow. Between 2019 and 2023, Google’s greenhouse gas emissions rose 48 percent, despite its goal of achieving net-zero emissions by 2030.
Dean emphasized that Google’s custom AI chips, TPUs, are substantially more efficient than competing hardware; since their debut in 2016, TPUs have improved in efficiency by a factor of 30 to 80. Marking 25 years at Google, Dean maintained a balanced stance on generative AI’s risks and advantages, highlighting its transformative potential in areas like education and healthcare while noting the unpredictable nature of future developments.
