AI’s energy impact is still small—but how we handle it is huge
With seemingly no limit to the demand for artificial intelligence, everyone in the energy, AI, and climate fields is justifiably worried. Will there be enough clean electricity to power AI and enough water to cool the data centers that support this technology? These are important questions with serious implications for communities, the economy, and the environment.

This story is part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

But the question about AI’s energy usage points to even bigger issues about what we need to do to address climate change over the next several decades. If we can’t work out how to handle this, we won’t be able to handle broader electrification of the economy, and the climate risks we face will increase.

Innovation in IT got us to this point. The graphics processing units that power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved those projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use.

In the late 2010s, however, the trends that had saved us began to break. As the accuracy of AI models dramatically improved, the electricity needed for data centers started increasing faster; they now account for 4.4% of total US electricity demand, up from 1.9% in 2018. Data centers consume more than 10% of the electricity supply in six US states. In Virginia, which has emerged as a hub of data center activity, that figure is 25%.
Projections about the future demand for energy to power AI are uncertain and range widely, but in one study, Lawrence Berkeley National Laboratory estimated that data centers could represent 6% to 12% of total US electricity use by 2028. Communities and companies will notice this type of rapid growth in electricity demand. It will put pressure on energy prices and on ecosystems. The projections have resulted in calls to build lots of new fossil-fired power plants or bring older ones out of retirement. In many parts of the US, the demand will likely result in a surge of natural-gas-powered plants.

It’s a daunting situation. Yet when we zoom out, the projected electricity use from AI is still pretty small. The US generated about 4,300 billion kilowatt-hours last year. We’ll likely need another 1,000 billion to 1,200 billion or more in the next decade—a 24% to 29% increase. Almost half the additional electricity demand will be from electrified vehicles. Another 30% is expected to be from electrified technologies in buildings and industry. Innovation in vehicle and building electrification also advanced in the last decade, and this shift will be good news for the climate, for communities, and for energy costs.
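For readers who want to check the arithmetic, here is a minimal sketch using the article’s rounded figures. The “almost half” share for vehicles is approximated at 48%, and because the inputs are rounded, the computed increase comes out slightly below the cited 24% to 29% range.

```python
# Rough back-of-the-envelope check of the demand figures above.
# All inputs are the article's rounded numbers; "almost half" is approximated as 48%.

US_GENERATION_TWH = 4_300            # ~4,300 billion kWh generated last year
ADDED_DEMAND_TWH = (1_000, 1_200)    # extra annual demand expected within a decade

# Approximate split of the new demand described in the article
shares = {
    "electrified vehicles": 0.48,    # "almost half"
    "buildings and industry": 0.30,
    "AI and data centers": 0.22,     # the remainder, discussed below
}

for added in ADDED_DEMAND_TWH:
    growth = added / US_GENERATION_TWH
    print(f"+{added} TWh -> roughly a {growth:.0%} increase over current generation")
    for sector, share in shares.items():
        print(f"  {sector}: ~{added * share:.0f} TWh")
```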
The remaining 22% of new electricity demand is estimated to come from AI and data centers. While it represents a smaller piece of the pie, it’s the most urgent one. Because of their rapid growth and geographic concentration, data centers are the electrification challenge we face right now—the small stuff we have to figure out before we’re able to do the big stuff like vehicles and buildings.

We also need to understand what the energy consumption and carbon emissions associated with AI are buying us. While the impacts from producing semiconductors and powering AI data centers are important, they are likely small compared with the positive or negative effects AI may have on applications such as the electricity grid, the transportation system, buildings and factories, or consumer behavior. Companies could use AI to develop new materials or batteries that would better integrate renewable energy into the grid. But they could also use AI to make it easier to find more fossil fuels. The claims about potential benefits for the climate are exciting, but they need to be continuously verified and will need support to be realized.

This isn’t the first time we’ve faced challenges coping with growth in electricity demand. In the 1960s, US electricity demand was growing at more than 7% per year. In the 1970s that growth was nearly 5%, and in the 1980s and 1990s it was more than 2% per year. Then, starting in 2005, we had basically a decade and a half of flat electricity growth. Most projections for the next decade put expected growth in electricity demand at around 2% again—but this time we’ll have to do things differently.

To manage these new energy demands, we need a “Grid New Deal” that leverages public and private capital to rebuild the electricity system for AI, with enough capacity and intelligence for decarbonization. New clean energy supplies, investment in transmission and distribution, and strategies for virtual demand management can cut emissions, lower prices, and increase resilience. Data centers that bring clean electricity and distribution system upgrades could be given a fast lane to connect to the grid. Infrastructure banks could fund new transmission lines or pay to upgrade existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce development in the clean energy sector, and open data transparency from data center operators about their energy use, so that communities can understand and measure the impacts.

In 2022, the White House released a Blueprint for an AI Bill of Rights that provided principles to protect the public’s rights, opportunities, and access to critical resources from being restricted by AI systems. To the AI Bill of Rights, we humbly offer a climate amendment, because ethical AI must be climate-safe AI. It’s a starting point to ensure that the growth of AI works for everyone: that it doesn’t raise people’s energy bills, that it adds more clean power to the grid than it uses, that it increases investment in the power system’s infrastructure, and that it benefits communities while driving innovation.

By grounding the conversation about AI and energy in the context of what is needed to tackle climate change, we can deliver better outcomes for communities, ecosystems, and the economy. The growth of electricity demand for AI and data centers is a test case for how society will respond to the demands and challenges of broader electrification. If we get this wrong, the likelihood of meeting our climate targets will be extremely low.
This is what we mean when we say the energy and climate impacts from data centers are small, but they are also huge.

Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University. Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Ramayya Krishnan is dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.