![Cityscape illustration](https://eu-images.contentstack.com/v3/assets/blt69509c9116440be8/bltca48b977c6843f6b/67a3b63932968967142ee519/cityscape_TithiLuadthong-AlamyStockPhoto.jpg)
The Cost of AI: Power Hunger -- Why the Grid Can't Support AI
www.informationweek.com
Joao-Pierre S. Ruth, Senior Editor
February 10, 2025 | 7 Min Read
Tithi Luadthong via Alamy Stock Photo

Remember when plans to use geothermal energy from volcanoes to power bitcoin mining turned heads as an example of skyrocketing, tech-driven power consumption? If it possessed feelings, AI would probably say that was cute as it gazes hungrily at the power grid.

InformationWeek's The Cost of AI series previously explored how energy bills might rise with demand from artificial intelligence, but what happens if the grid cannot meet escalating needs? Would regions be forced to ration power with rolling blackouts? Will companies have to wait their turn for access to AI and the power needed to drive it? Will more sources of power come online fast enough to absorb demand?

Answers to those questions might not be as simple as adding windmills, solar panels, and more nuclear reactors to the grid. Experts from KX, GlobalFoundries, and Infosys shared their perspectives on AI's energy demands and the power grid's struggle to accommodate this escalation.

"I think the most interesting benchmark to talk about is the Stargate [project] that was just announced," says Thomas Barber, vice president, communications infrastructure and data center at GlobalFoundries. The multiyear Stargate effort, announced in late January, is a $500 billion plan to build AI infrastructure for OpenAI with data centers in the United States. "You're talking about building upwards of 50 to 100 gigawatts of new IT capacity every year for the next seven to eight years, and that's really just one company."

That is in addition to Microsoft and Google developing their own data center buildouts, he says. "The scale of that, if you think about it, is the Hoover Dam generates two gigawatts per year. You need 50 new Hoover Dams per year to do it."

The Stargate site planned for Abilene, Texas, would include power from green energy sources, Barber says. "It's wind and solar power in West Texas that's being used to supply power for that."

Business Insider reported that developers also filed permits to operate natural gas turbines at Stargate's site in Abilene.

Barber says that as power gets allocated to data centers, in a broad sense, some efforts to go green are being applied. "It depends on whether or not you consider nuclear green," he says. "Nuclear is one option, which is not carbon centric. There's a lot of work going into co-located data centers in areas where solar is available, where wind is available."

Barber says very few exponentials, such as Moore's Law on microchips, last, but AI is now on the upslope of the performance curve of these models. Even as AI gets tested against more difficult problems, these are still the early training days in the technology's development.

When AI moves from training and more into inference -- where AI draws conclusions -- Barber says demand could be significantly greater, perhaps even 10 times greater, than with training. "Right now, the slope is driven by training," he says. "As these models roll out, as people start adopting them, the demand for inference is going to pick up and the capacity is going to go into serving inference."

A Nuclear Scale Matter

The world already sees very hungry AI models, says Neil Kanungo, vice president of product-led growth for KX, and that demand is expected to rise. "There is a component of power generation that's growing exponentially, and that is data center usage," he says. "I believe it's currently around 2% to 3% of all power produced in the US."
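The scale Barber and Kanungo describe is easier to grasp as back-of-envelope arithmetic. The sketch below is purely illustrative: it reuses the round numbers quoted in this article (a roughly 2 GW Hoover Dam, 50 to 100 GW of new capacity a year, a 2% to 3% data center share of US generation, and the doubling pace Kanungo cites next), not any measured data or forecast.

```python
# Back-of-envelope arithmetic using the round figures quoted in this article.
# All numbers are illustrative assumptions taken from the quotes, not
# measured data or forecasts.

HOOVER_DAM_GW = 2                       # capacity Barber attributes to Hoover Dam
NEW_CAPACITY_GW_PER_YEAR = (50, 100)    # Barber's range for new IT capacity per year
BUILDOUT_YEARS = 8                      # "the next seven to eight years"

for gw in NEW_CAPACITY_GW_PER_YEAR:
    dams_per_year = gw / HOOVER_DAM_GW
    print(f"{gw} GW/year is about {dams_per_year:.0f} Hoover Dam equivalents per year, "
          f"or roughly {dams_per_year * BUILDOUT_YEARS:.0f} over {BUILDOUT_YEARS} years")

# Kanungo's share estimate: data centers at roughly 2-3% of US power today,
# with usage projected to double every couple of years. Compounding that
# doubling shows how steep the curve is, and why it cannot run for long
# before it saturates.
share = 0.025                           # midpoint of the 2-3% range
for year in range(0, 11, 2):
    projected = share * 2 ** (year / 2) # one doubling every two years
    print(f"year {year:2d}: about {projected:.0%} of US generation, if the doubling held")
```

The last few lines also echo Barber's caveat that very few exponentials last: a share that doubles every couple of years saturates quickly.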
"The US has more data centers than anywhere in the world," Kanungo adds, "but it's projected to double every couple of years for the next 10 years."

While AI training drives high power consumption, Kanungo says the ubiquity of AI inference makes its draw on power significant as well. One way to improve efficiency, he says, would be to remove the transmission side of power from the equation by placing data centers closer to power plants. "You get huge efficiency gains by cutting inefficiency out, where you're having over 30% losses traditionally in power generation," Kanungo says.

He is also a proponent of nuclear power, considering its energy load and land usage impact. "The ability to put these data centers near nuclear power plants and what you're transmitting out is not power," he says. "You're transmitting data out. You're not having losses on data transmission."

Nuclear power development in the United States, he says, has seen some stalling due to negative perceptions of safety and potential environmental concerns. Rising energy demands might be a catalyst to revisit such conversations. "This might be the right time to switch those perceptions," Kanungo says, "because you have tech giants that are willing to take the risks and handle the waste, and go through the red tape, and make this a profitable endeavor."

He believes these are still the very early stages of AI adoption, and as more agents are used with LLMs -- with agents completing tasks such as shopping for users, filling out tabular data, or doing deep research -- more computation is needed. "We're just at the tip of the iceberg of agents," Kanungo says. "The use cases for these transformer-based LLMs are so great, I think the demand for them is going to continue to go up, and therefore we should be investing power to ensure that you're not jeopardizing residential power, you're not having blackouts, you're not stealing base load."

Energy Hungry GPUs

There is an unprecedented load being put on the grid, according to Ashiss Kumar Dash, executive vice president and global head of services, utilities, resources, energy and sustainability for Infosys. He says the power conundrum as it relates to AI is three-pronged.

"The increase in demand for electricity, the increase in demand for energy, is unprecedented," Dash says. "No other general-purpose technology has put this much demand in the past. They say a ChatGPT query consumes 10 times the energy that a Google search would."

Dash also cited a CNBC documentary that posited that training an LLM today would effectively emit as much carbon dioxide as five gas-fueled cars over their entire lifetimes. "There is this dimension of unprecedented load," he says. "There are energy-hungry GPUs, energy-hungry data centers, and the cloud infrastructure that it needs."

The second part of the problem, Dash says, is that data centers tend to be concentrated geographically. "If you look at the global data centers, we have about 8,000 data centers in the world, but you can pretty much name where the data centers are," he says. "Seventy percent of the world's internet traffic goes through Virginia. And the Data Center Alley in Virginia consumes almost 30% of the state's entire electricity demand."

That grid must obviously serve residents and local commercial businesses as well, he says. "When you concentrate the demand like this, it's very difficult for the local grid to manage," Dash says. "Same thing in Europe -- Ireland. Seventeen or 18% of Ireland's electricity demand is on data centers."

The third aspect of the problem, he says, is load growth.
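That pace gap is essentially a compounding problem. The sketch below is an illustration only, comparing the planning assumption and the observed growth rate Dash quotes next; the rates are assumptions taken from this article, not utility data.

```python
# Illustrative comparison of compound load growth: the 2-3% annual increase
# Dash says grid planning typically assumes versus the roughly 20% year-on-year
# growth he says some US regions now see. Rates come from the article's quotes,
# not utility data; the load index starts at an arbitrary 100.

BASE_LOAD_INDEX = 100.0

def grow(rate: float, years: int, base: float = BASE_LOAD_INDEX) -> float:
    """Compound a load index at a fixed annual growth rate."""
    return base * (1 + rate) ** years

for years in (1, 3, 5, 10):
    planned = grow(0.03, years)    # upper end of the planning assumption
    observed = grow(0.20, years)   # the year-on-year growth Dash cites
    print(f"after {years:2d} years: planned-for load {planned:6.1f}, "
          f"20%-growth load {observed:6.1f} "
          f"({observed / planned:.1f}x the planning assumption)")
```

Even over a five-year horizon the two curves diverge by multiples, which is the mismatch Dash describes.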
"Utility companies tend to base their grid resiliency models on 2% to 3% maximum growth on a yearly basis," Dash says. "That's how the funding works. That's how the rate cases are built. But now we're talking, in some parts of the US, 20% growth year-on-year. Portland is going to see massive growth. California is seeing the demand."

The grid and utility models are not designed to handle such fast growth, he says. "For them to invest in the infrastructure and to build up transmission lines and substations and transformers is going to be a big challenge." That does not include recurring spikes in energy load in parts of the country, Dash says. "If you have the data centers running at 20% higher energy demand and summer peak hits, the grid is not going to survive -- it's going to go down."

However, there is some hope such outages might be avoided. AI companies, energy companies, and multiple partners are building an ecosystem to think about the problem, he says. There was even a discussion at the International Energy Agency conference in December, he says, on using AI to work on AI's energy needs. "It was good to hear tech companies, regulators, energy companies, oil and gas, and utilities equally."

Dash says he sees encouragement in redesigning and rethinking the grid, for example with the advent of the power usage effectiveness (PUE) metric, which can help drive more efficiency in data centers. "I look at the reports and I find that quite a few organizations are able to optimize their power usage to a level where the power used for IT or tech is almost similar to the power used for the entire operations of the company," he says.

Initiatives such as the creation of coolants that are more energy efficient, the creation of renewable microgrids close to data centers, and AI modeling to help utilities envision load growth are also encouraging, Dash says. "It's AI solving the problem AI created."

About the Author

Joao-Pierre S. Ruth, Senior Editor

Joao-Pierre S. Ruth covers tech policy, including ethics, privacy, legislation, and risk; fintech; code strategy; and cloud and edge computing for InformationWeek. He has been a journalist for more than 25 years, reporting on business and technology first in New Jersey, then covering the New York tech startup community, and later as a freelancer for such outlets as TheStreet, Investopedia, and Street Fight.