Recent updates
-
The Download: meet Cathy Tie, and Anthropic’s new AI models
This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.

Meet Cathy Tie, Bride of “China’s Frankenstein”

Since the Chinese biophysicist He Jiankui was released from prison in 2022, he has sought to make a scientific comeback and to repair his reputation after a three-year incarceration for illegally creating the world’s first gene-edited children.

One area of visible success on his comeback trail has been his X.com account. Over the past few years, his account has evolved from sharing mundane images of his daily life to spreading outrageous, antagonistic messages. This has left observers unsure what to take seriously.

Last month, in reply to MIT Technology Review’s questions about who was responsible for the account’s transformation into a font of clever memes, He emailed us back: “It’s thanks to Cathy Tie.”

Tie is no stranger to the public spotlight. A former Thiel fellow, she is a partner in a project that promised to create glow-in-the-dark pets. Over the past several weeks, though, the Canadian entrepreneur has drawn more and more attention as He Jiankui’s new wife. Read the full story.
—Caiwei Chen & Antonio Regalado
Anthropic’s new hybrid AI model can work on tasks autonomously for hours at a time

Anthropic has announced two new AI models that it claims represent a major step toward making AI agents truly useful.

AI agents trained on Claude Opus 4, the company’s most powerful model to date, raise the bar for what such systems are capable of by tackling difficult tasks over extended periods of time and responding more usefully to user instructions, the company says.

They’ve achieved some impressive results: Opus 4 created a guide for the video game Pokémon Red while playing it for more than 24 hours straight. The company’s previous most powerful model was capable of playing for just 45 minutes. Read the full story.

—Rhiannon Williams

The FDA plans to limit access to covid vaccines. Here’s why that’s not all bad.

This week, two new leaders at the US Food and Drug Administration announced plans to limit access to covid vaccines, arguing that there is not much evidence to support the value of annual shots in healthy people. New vaccines will be made available only to the people who are most vulnerable—namely, those over 65 and others with conditions that make them more susceptible to severe disease.

The plans have been met with fear and anger in some quarters. But they weren’t all that shocking to me. In the UK, where I live, covid boosters have been offered only to vulnerable groups for a while now. And the immunologists I spoke to agree: The plans make sense. Read the full story.
—Jessica Hamzelou

This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.

The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.

1 Thousands of Americans are facing extreme weather
But help from the federal government may never arrive. (Slate $)
+ States struck by tornadoes and floods are begging the Trump administration for aid. (Scientific American $)

2 Spain’s grid operator has accused power plants of not doing their job
It claims they failed to control the system’s voltage shortly before the blackout. (FT $)
+ Did solar power cause Spain’s blackout? (MIT Technology Review)

3 Google is facing a DoJ probe over its AI chatbot deal
Investigators will examine whether Google’s deal with Character.AI gives it an unfair advantage. (Bloomberg $)
+ It may not lead to enforcement action, though. (Reuters)

4 DOGE isn’t bad news for everyone
These smaller US government IT contractors say it’s good for business—for now. (WSJ $)
+ It appears that DOGE used a Meta AI model to review staff emails, not Grok. (Wired $)
+ Can AI help DOGE slash government budgets? It’s complex. (MIT Technology Review)

5 Google’s new shopping tool adds breasts to minors
Try it On distorts uploaded photos to clothing models’ proportions, even when they’re children. (The Atlantic $)
+ It feels like this could have easily been avoided. (Axios)
+ An AI companion site is hosting sexually charged conversations with underage celebrity bots. (MIT Technology Review)

6 Apple is reportedly planning a smart glasses product launch
By the end of next year. (Bloomberg $)
+ It’s playing catchup with Meta and Google, among others. (Engadget)
+ What’s next for smart glasses. (MIT Technology Review)

7 What it’s like to live in Elon Musk’s corner of Texas
Complete with an ugly bust and furious locals. (The Guardian)
+ West Lake Hills residents are pushing back against his giant fences. (Architectural Digest $)

8 Our solar system may contain a hidden ninth planet
A possible dwarf planet has been spotted orbiting beyond Neptune. (New Scientist $)

9 Wikipedia does swag now
How else will you let everyone know you love the open web? (Fast Company $)

10 One of the last good apps is shutting down
Mozilla is closing Pocket, its article-saving app, and the internet is worse for it. (404 Media)
+ Parent company Mozilla said the way people use the web has changed. (The Verge)

Quote of the day
“This is like the Mount Everest of corruption.”

—Senator Jeff Merkley, protesting outside Donald Trump’s exclusive dinner for the highest-paying customers of his personal cryptocurrency, as reported by the New York Times.

One more thing
The iPad was meant to revolutionize accessibility. What happened?

On April 3, 2010, Steve Jobs debuted the iPad. What for most people was basically a more convenient form factor was something far more consequential for non-speakers: a life-changing revolution in access to a portable, powerful communication device for just a few hundred dollars.

But a piece of hardware, however impressively designed and engineered, is only as valuable as what a person can do with it. After the iPad’s release, the flood of new, easy-to-use augmentative and alternative communication apps that users were in desperate need of never came.

Today, there are only around half a dozen apps, each retailing for $200 to $300, that ask users to select from menus of crudely drawn icons to produce text and synthesized speech. It’s a depressingly slow pace of development for such an essential human function. Read the full story.

—Julie Kim

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)

+ Dive into the physics behind the delicate frills of Tête de Moine cheese shavings.
+ Our capacity to feel moved by music is at least partly inherited, apparently.
+ Kermit the Frog has delivered a moving commencement address at the University of Maryland.
+ It’s a question as old as time: are clowns sexy? 🤡
-
The Download: the desert data center boom, and how to measure Earth’s elevations
This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.

The data center boom in the desert

In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities nearby. Meanwhile, Microsoft has acquired more than 225 acres of undeveloped property, and Apple is expanding its existing data center just across the Truckee River from the industrial park.

The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—and it’s just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny. Read the full story.
—James Temple

This story is part of Power Hungry: AI and our energy future—our new series shining a light on the energy demands and carbon costs of the artificial intelligence revolution. Check out the rest of the package here.
A new atomic clock in space could help us measure elevations on Earth

In 2003, engineers from Germany and Switzerland began building a bridge across the Rhine River simultaneously from both sides. Months into construction, they found that the two sides did not meet. The German side hovered 54 centimeters above the Swiss one. The misalignment happened because the two countries measured elevation from sea level differently.

To prevent such costly construction errors, in 2015 scientists in the International Association of Geodesy voted to adopt the International Height Reference Frame, or IHRF, a worldwide standard for elevation. Now, a decade after its adoption, scientists are looking to update the standard—by using the most precise clock ever to fly in space. Read the full story.

—Sophia Chen
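How can a clock measure elevation? Through gravitational time dilation: clocks deeper in Earth's gravity well tick more slowly, so comparing clock rates at two sites reveals their height difference. As a rough back-of-the-envelope sketch (our addition, not from the story), the fractional frequency shift for a height difference near the surface is

\[
\frac{\Delta\nu}{\nu} \approx \frac{g\,\Delta h}{c^{2}}
\approx \frac{9.8\ \mathrm{m\,s^{-2}} \times 1\ \mathrm{m}}{\left(3\times10^{8}\ \mathrm{m\,s^{-1}}\right)^{2}}
\approx 1.1\times10^{-16}\ \text{per meter of elevation.}
\]

So a clock with fractional-frequency accuracy near \(10^{-18}\), compared against a common reference in orbit, could in principle resolve height differences at the centimeter level, far finer than the 54-centimeter mismatch that plagued the Rhine bridge.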
Three takeaways about AI’s energy use and climate impacts

—Casey Crownhart

This week, we published Power Hungry, a package all about AI and energy. At the center of this package is the most comprehensive look yet at AI’s growing power demand, if I do say so myself.

This data-heavy story is the result of over six months of reporting by me and my colleague James O’Donnell (and the work of many others on our team). Over that time, with the help of leading researchers, we quantified the energy and emissions impacts of individual queries to AI models and tallied what it all adds up to, both right now and for the years ahead.

There’s a lot of data to dig through, and I hope you’ll take the time to explore the whole story. But in the meantime, here are three of my biggest takeaways from working on this project. Read the full story.

This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

MIT Technology Review Narrated: Congress used to evaluate emerging technologies. Let’s do it again.

Artificial intelligence comes with a shimmer and a sheen of magical thinking. And if we’re not careful, politicians, employers, and other decision-makers may accept at face value the idea that machines can and should replace human judgment and discretion. One way to combat that might be resurrecting the Office of Technology Assessment, a Congressional think tank that detected lies and tested tech until it was shuttered in 1995.

This is our latest story to be turned into an MIT Technology Review Narrated podcast, which we’re publishing each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it’s released.
The must-reads

I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 OpenAI is buying Jony Ive’s AI startup
The former Apple design guru will work with Sam Altman to design an entirely new range of devices. (NYT $)
+ The deal is worth a whopping $6.5 billion. (Bloomberg $)
+ Altman gave OpenAI staff a preview of its AI ‘companion’ devices. (WSJ $)
+ AI products to date have failed to set the world alight. (The Atlantic $)

2 Microsoft has blocked employee emails containing ‘Gaza’ or ‘Palestine’
Although the term ‘Israel’ does not trigger such a block. (The Verge)
+ Protest group No Azure for Apartheid has accused the company of censorship. (Fortune $)

3 DOGE needs to do its work in secret
That’s what the Trump administration is claiming to the Supreme Court, at least. (Ars Technica)
+ It’s trying to avoid being forced to hand over internal documents. (NYT $)
+ DOGE’s tech takeover threatens the safety and stability of our critical data. (MIT Technology Review)

4 US banks are racing to embrace cryptocurrency
Ahead of new stablecoin legislation. (The Information $)
+ Attendees at Trump’s crypto dinner paid over $1 million for the privilege. (NBC News)
+ Bitcoin has surged to an all-time peak yet again. (Reuters)

5 China is making huge technological leaps
Thanks to the billions it’s poured into narrowing the gap between it and the US. (WSJ $)
+ Nvidia’s CEO has branded America’s chip curbs on China ‘a failure.’ (FT $)
+ There can be no winners in a US-China AI arms race. (MIT Technology Review)

6 Disordered eating content is rife on TikTok
But a pocket of creators are dedicated to debunking the worst of it. (Wired $)

7 The US military is interested in the world’s largest aircraft
The gigantic WindRunner plane will have an 80-metre wingspan. (New Scientist $)
+ Phase two of military AI has arrived. (MIT Technology Review)

8 How AI is shaking up animation
New tools are slashing the costs of creating episodes by up to 90%. (NYT $)
+ Generative AI is reshaping South Korea’s webcomics industry. (MIT Technology Review)

9 Tesla’s Cybertruck is a flop
Sorry, Elon. (Fast Company $)
+ The vehicles’ resale value is plummeting. (The Daily Beast)

10 Google’s new AI video generator loves this terrible joke
Which appears to originate from a Reddit post. (404 Media)
+ What happened when 20 comedians got AI to write their routines. (MIT Technology Review)

Quote of the day

“It feels like we are marching off a cliff.”

—An unnamed software engineering vice president jokes that future developer conferences will be attended by the AI agents companies like Microsoft are racing to deploy, Semafor reports.

One more thing

What does GPT-3 “know” about me?

One of the biggest stories in tech is the rise of large language models that produce text that reads like a human might have written it. These models’ power comes from being trained on troves of publicly available human-created text hoovered up from the internet. If you’ve posted anything even remotely personal in English on the internet, chances are your data might be part of some of the world’s most popular LLMs.

Melissa Heikkilä, MIT Technology Review’s former AI reporter, wondered what data these models might have on her—and how it could be misused. So she put OpenAI’s GPT-3 to the test. Read about what she found.

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)

+ Don’t shoot the messenger, but it seems like there’s a new pizza king in town 🍕
+ Ranked: every Final Destination film, from worst to best.
+ Who knew that jelly could help to preserve coral reefs? Not I.
+ A new generation of space archaeologists are beavering away to document our journeys to the stars.
-
AI could keep us dependent on natural gas for decades to come
The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers and after that to other industries that might want to occupy more than a thousand acres just off the interstate.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing the 5,000 construction jobs and 500 data center jobs it is expected to create and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution!

The AI data center also promises to transform the state’s energy future. Stretching more than a mile in length, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone. When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity.
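To put that "decent-size city" comparison in perspective, here is some rough arithmetic (our addition, using the EIA's figure of roughly 10,500 kWh of annual electricity use per US household):

\[
2\ \mathrm{GW}\times 8{,}760\ \mathrm{h/yr} \approx 17.5\ \mathrm{TWh/yr},
\qquad
\frac{17.5\ \mathrm{TWh/yr}}{10{,}500\ \mathrm{kWh/yr\ per\ household}} \approx 1.7\ \text{million households.}
\]

That assumes the computing load runs flat out around the clock, which is precisely the point: this is a city-size customer that never sleeps.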
To power the data center, Entergy, the utility that serves the region, aims to spend billions of dollars to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and to upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center.

Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel.

The development has sparked interest from the US Congress. Last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project” are “vague and offer little reassurance.”
The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon.

The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost, and you know how to scale it and get it approved,” says Victor. “Even for companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.”

The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65% to 85% of projected load growth.

“It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.”

But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power.

Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.

The reign of natural gas

Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, when fracking was just starting to make it possible to extract large quantities of gas from shale, natural gas was selling at historically high prices; last year, it averaged the lowest annual price ever reported, according to the US Energy Information Administration.
Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit.

Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.

But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify.
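For context, some approximate reference figures (our addition, drawn from typical EIA emissions factors and IPCC global-warming potentials, not from the article):

\[
\text{coal generation}\approx 1.0\ \mathrm{kg\,CO_2/kWh},
\qquad
\text{combined-cycle gas}\approx 0.4\ \mathrm{kg\,CO_2/kWh},
\]
\[
\mathrm{GWP_{100}(CH_4)}\approx 30\times \mathrm{CO_2},
\qquad
\mathrm{GWP_{20}(CH_4)}\approx 80\times \mathrm{CO_2}.
\]

On numbers like these, if even a few percent of the gas moving through the supply chain leaks as methane, much of gas's carbon advantage over coal can be eroded on a 20-year horizon, which is why that hard-to-quantify leak rate matters so much.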
Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035, and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035, relative to a future in which the use of the fossil fuel gradually winds down.

Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.

But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up?

Times of stress

AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of energy-efficient AI, but it certainly raises the possibility that such advances are possible.

Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out, and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China, and clues that the same might be beginning to happen in the US.
Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power.

There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.
The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity.

Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, utilities could handle some 76 gigawatts of additional demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029.

“The prevailing assumption is that data centers are 100% inflexible,” says Norris—that is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models, can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress.

The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.”

AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI, a nonprofit energy R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.”

Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.
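The peak-shaving logic behind Norris's "headroom" argument is easy to see in a toy model. The sketch below (our illustration, with made-up numbers, not the Duke methodology) builds a synthetic hourly load curve for one year and compares how much flat data-center load a grid can host with and without the data center cutting to 50% power during its most stressed hours:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic hourly system load for one year (MW): seasonal and daily
# cycles plus weather noise, averaging ~55% of installed capacity.
hours = np.arange(8760)
load = (
    55_000
    + 15_000 * np.sin(2 * np.pi * hours / 8760)       # seasonal swing
    + 12_000 * np.sin(2 * np.pi * (hours % 24) / 24)  # daily swing
    + rng.normal(0, 2_000, 8760)                      # weather noise
)
capacity = 100_000  # MW of installed generation

def max_hostable(load, capacity, curtail_frac=1.0, max_stress_hours=0):
    """Largest flat data-center load (MW) the grid can host.

    curtail_frac: fraction of the data center's load still drawing power
        during stress hours (1.0 = no flexibility, 0.5 = cut in half).
    max_stress_hours: hours per year curtailment is allowed.
    """
    lo, hi = 0.0, float(capacity)
    for _ in range(60):  # bisect on the data-center size
        dc = (lo + hi) / 2
        stress = load + dc > capacity  # hours that would need curtailment
        ok = (
            stress.sum() <= max_stress_hours
            and np.all(load + curtail_frac * dc <= capacity)
        )
        lo, hi = (dc, hi) if ok else (lo, dc)
    return lo

rigid = max_hostable(load, capacity)  # must fit under capacity every hour
flexible = max_hostable(load, capacity, curtail_frac=0.5, max_stress_hours=100)

print(f"Average utilization without data centers: {load.mean() / capacity:.0%}")
print(f"Hostable load, 100% inflexible:          {rigid:,.0f} MW")
print(f"Hostable load, 50% cut for <=100 h/yr:   {flexible:,.0f} MW")
```

The exact megawatt figures are meaningless, but the shape of the result mirrors the Duke finding: because hours of true system stress are rare, tolerating a brief 50% cut unlocks far more hostable load than the rigid case allows.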
Footing the bill

Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants.

“The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation.

In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission (LPSC), who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and its need for full 24/7 power leave no alternative to natural gas.

In the application it filed last fall with the LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.
Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is sayingneeds around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to takeword for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.” In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but "as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants but didn't respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over million to support the Richland Parish data centers with new infrastructure, including roads and water systems. Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayerssmall businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. 
In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tellwhat they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come.
#could #keep #dependent #natural #gasAI could keep us dependent on natural gas for decades to comeThe thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturersand after that to other industries that might want to occupy more than a thousand acres just off the interstate. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution! The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone. When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity. To power the data center, Entergy aims to spend billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center. Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works issued a letter to Meta that called out the company's plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions "by funding carbon capture and a solar project are vague and offer little reassurance.” The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. 
While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon. The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost, and you know how to scale it and get it approved,” says Victor. “Even forcompanies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.” The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina and Georgia—is coming from data centers; in those 3 states data centers account for around 65 to 85% of projected load growth. “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.” But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands. The reign of natural gas Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for per million Btu; last year, it averaged just the lowest annual priceever reported, according to the US Energy Information Administration. Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit. 
Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal. But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify. Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035, relative to a future in which the use of fossil fuel gradually winds down. Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestrationat power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology. But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up? Times of stress AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US. 
Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave. The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke's Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models, can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI, a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. 
But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says. Footing the bill Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas. In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much. Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is sayingneeds around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to takeword for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.” In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but "as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. 
AI could keep us dependent on natural gas for decades to come

The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state.
For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing the 5,000 construction jobs and 500 data center jobs expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution!

The AI data center also promises to transform the state’s energy future. Stretching more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity.
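For a rough sense of what a steady two-gigawatt computational load means, here is a minimal back-of-envelope sketch in Python. The average-household figure is an outside assumption (roughly the reported US average), not a number from this story:

```python
# Rough scale check: a steady 2 GW computational load in household terms.
# The ~10,500 kWh/year average US household figure is an assumption for
# illustration, not a number from this article.

load_gw = 2.0
hours_per_year = 8760

annual_twh = load_gw * hours_per_year / 1000   # GWh -> TWh
print(f"Annual consumption: ~{annual_twh:.1f} TWh")  # ~17.5 TWh

avg_household_kwh = 10_500                     # assumed US average
households = annual_twh * 1e9 / avg_household_kwh
print(f"Equivalent to ~{households / 1e6:.1f} million homes")  # ~1.7 million
```

On those assumptions, the computation load alone is on the order of a sizable city’s residential consumption, in line with the story’s comparison.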
To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and to upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center.

Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel.

The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.”

The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon.

The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.”

The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65% to 85% of projected load growth.

“It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.”

But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future or how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.

The reign of natural gas

Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for $13 per million Btu (a measure of thermal energy); last year, it averaged just $2.21, the lowest annual price (adjusting for inflation) ever reported, according to the US Energy Information Administration (EIA).

Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big gas producer, that share is some 72%, according to a recent audit.

Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide.
In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.

But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify.

Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035, and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas would lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of the fossil fuel gradually winds down.

Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology. But the transition to clean natural gas is a hope that will take decades to realize.

Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up?

Times of stress

AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China, and clues that the same may be beginning to happen in the US.
Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power.

There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available, on average, around the country and on most regional grids. On average, utilities use only about 53% of their available capacity over the course of a year, according to a Duke study. The problem is that they must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.

The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours a year, utilities could handle some 76 gigawatts of additional demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029.

“The prevailing assumption is that data centers are 100% inflexible,” says Norris—that is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity, shift their computation loads to other data centers around the country, or even ramp up their own backup power during times when a grid is under stress.

The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve from the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.”
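The arithmetic behind the flexibility argument is easy to check. Below is a minimal sketch using only figures quoted in this story; the 90-hour stress window is an assumed value, chosen to be consistent with the 80-to-90-hour estimate cited just below:

```python
# Back-of-envelope check of the Duke flexibility argument, using the
# figures quoted in this story. The stress-hours value is an assumption
# (consistent with the 80-to-90-hour estimate cited below).

headroom_gw = 76        # demand utilities could absorb if data centers curtail
projected_new_gw = 65   # predicted new data-center demand by 2029

stress_hours = 90       # assumed hours per year a grid is under stress
share_of_year = stress_hours / 8760
print(f"Curtailment window: ~{share_of_year:.1%} of the year")   # ~1.0%

# Energy a data center would forgo by halving its load in that window,
# as a share of its annual consumption at full power:
forgone = 0.5 * stress_hours / 8760
print(f"Energy forgone: ~{forgone:.2%} of annual use")           # ~0.51%

print("Headroom covers projected demand:", headroom_gw >= projected_new_gw)
```

The point the Duke authors make is that the energy given up is tiny, on the order of half a percent of annual consumption, even though the capacity freed up is large.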
AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technical questions about how to make it happen. Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but the concept must be shown to work “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.

Footing the bill

Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insight into actual electricity needs. And it will take aggressive regulators to make sure utilities rigorously evaluate the power requirements of data centers rather than just reflexively building more natural-gas plants.

“The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation.

In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission (LPSC), who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center, and its need for full 24/7 power, leave no alternative to natural gas. In the application it filed with the LPSC last fall, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage, it said, could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.

Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta about its load needs and timeline.

“Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”

In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta but that, “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate.

The February motion also raised concerns over who will end up paying for the new gas plants.
Entergy says Meta has signed a 15-year supply contract for the electricity, meant to help cover the costs of building and running the power plants, but it didn’t respond to requests from MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems.

Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director.

Indeed, consumers across the country have good reason to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and the lack of transparency in the accounting, they say, make verifying this claim “all but impossible.”

The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way that was unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. New AI data centers like the one in Richland Parish could in fact be a major economic boon, providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana’s residents will have to live with—and possibly pay for—the changes in the decades to come.
AI’s energy impact is still small—but how we handle it is huge
With seemingly no limit to the demand for artificial intelligence, everyone in the energy, AI, and climate fields is justifiably worried. Will there be enough clean electricity to power AI and enough water to cool the data centers that support this technology? These are important questions with serious implications for communities, the economy, and the environment.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

But the question about AI’s energy usage portends even bigger issues about what we need to do to address climate change over the next several decades. If we can’t work out how to handle this, we won’t be able to handle the broader electrification of the economy, and the climate risks we face will increase.

Innovation in IT got us to this point. Graphics processing units (GPUs), which power the computing behind AI, have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved those projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use.

In the late 2010s, however, the trends that had saved us began to break. As the accuracy of AI models dramatically improved, the electricity needed for data centers also started increasing faster; they now account for 4.4% of total US electricity demand, up from 1.9% in 2018. Data centers consume more than 10% of the electricity supply in six US states. In Virginia, which has emerged as a hub of data center activity, that figure is 25%.
Projections about the future demand for energy to power AI are uncertain and range widely, but in one study, Lawrence Berkeley National Laboratory estimated that data centers could represent 6% to 12% of total US electricity use by 2028. Communities and companies will notice this type of rapid growth in electricity demand. It will put pressure on energy prices and on ecosystems. The projections have resulted in calls to build lots of new fossil-fired power plants or bring older ones out of retirement. In many parts of the US, the demand will likely result in a surge of natural-gas-powered plants. It’s a daunting situation.

Yet when we zoom out, the projected electricity use from AI is still pretty small. The US generated about 4,300 billion kilowatt-hours last year. We’ll likely need another 1,000 billion to 1,200 billion or more in the next decade—a 24% to 29% increase. Almost half the additional electricity demand will be from electrified vehicles. Another 30% is expected to be from electrified technologies in buildings and industry. Innovation in vehicle and building electrification also advanced in the last decade, and this shift will be good news for the climate, for communities, and for energy costs.

The remaining 22% of new electricity demand is estimated to come from AI and data centers. While it represents a smaller piece of the pie, it’s the most urgent one. Because of their rapid growth and geographic concentration, data centers are the electrification challenge we face right now—the small stuff we have to figure out before we’re able to do the big stuff like vehicles and buildings.
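Those growth figures are straightforward to check. Here is a minimal sketch using only numbers from this essay; the 48% share is an approximation of “almost half,” and the slight mismatch with the quoted 24% to 29% presumably reflects rounding or a different baseline:

```python
# Sanity check of the demand-growth figures cited above. All inputs are
# taken from the essay; small differences come down to rounding.

us_generation_twh = 4300            # approximate US generation last year
added_twh_range = (1000, 1200)      # projected new demand over the next decade

for added in added_twh_range:
    print(f"+{added} TWh -> {added / us_generation_twh:.0%} growth")
# prints roughly 23% and 28%, close to the essay's 24% to 29%

# The essay's attribution of that new demand:
shares = {
    "electrified vehicles": 0.48,   # "almost half" (approximation)
    "buildings and industry": 0.30,
    "AI and data centers": 0.22,
}
for source, share in shares.items():
    lo, hi = (round(share * x) for x in added_twh_range)
    print(f"{source}: ~{lo}-{hi} TWh")
```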
We also need to understand what the energy consumption and carbon emissions associated with AI are buying us. While the impacts from producing semiconductors and powering AI data centers are important, they are likely small compared with the positive or negative effects AI may have on applications such as the electricity grid, the transportation system, buildings and factories, or consumer behavior. Companies could use AI to develop new materials or batteries that would better integrate renewable energy into the grid. But they could also use AI to make it easier to find more fossil fuels. The claims about potential benefits for the climate are exciting, but they need to be continuously verified and will need support to be realized.

This isn’t the first time we’ve faced challenges coping with growth in electricity demand. In the 1960s, US electricity demand was growing at more than 7% per year. In the 1970s that growth was nearly 5%, and in the 1980s and 1990s it was more than 2% per year. Then, starting in 2005, we basically had a decade and a half of flat electricity growth. Most projections for the next decade put our expected growth in electricity demand at around 2% again—but this time we’ll have to do things differently.

To manage these new energy demands, we need a “Grid New Deal” that leverages public and private capital to rebuild the electricity system for AI with enough capacity and intelligence for decarbonization. New clean energy supplies, investment in transmission and distribution, and strategies for virtual demand management can cut emissions, lower prices, and increase resilience. Data centers bringing clean electricity and distribution system upgrades could be given a fast lane to connect to the grid. Infrastructure banks could fund new transmission lines or pay to upgrade existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce development in the clean energy sector, and open data transparency from data center operators about their energy use so that communities can understand and measure the impacts.

In 2022, the White House released a Blueprint for an AI Bill of Rights that provided principles to protect the public’s rights, opportunities, and access to critical resources from being restricted by AI systems. To the AI Bill of Rights, we humbly offer a climate amendment, because ethical AI must be climate-safe AI. It’s a starting point to ensure that the growth of AI works for everyone—that it doesn’t raise people’s energy bills, adds more clean power to the grid than it uses, increases investment in the power system’s infrastructure, and benefits communities while driving innovation.

By grounding the conversation about AI and energy in context about what is needed to tackle climate change, we can deliver better outcomes for communities, ecosystems, and the economy. The growth of electricity demand for AI and data centers is a test case for how society will respond to the demands and challenges of broader electrification. If we get this wrong, the likelihood of meeting our climate targets will be extremely low.
This is what we mean when we say the energy and climate impacts from data centers are small, but they are also huge.

Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University. Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Ramayya Krishnan is dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.
#ais #energy #impact #still #smallbutAI’s energy impact is still small—but how we handle it is hugeWith seemingly no limit to the demand for artificial intelligence, everyone in the energy, AI, and climate fields is justifiably worried. Will there be enough clean electricity to power AI and enough water to cool the data centers that support this technology? These are important questions with serious implications for communities, the economy, and the environment. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. But the question about AI’s energy usage portends even bigger issues about what we need to do in addressing climate change for the next several decades. If we can’t work out how to handle this, we won’t be able to handle broader electrification of the economy, and the climate risks we face will increase. Innovation in IT got us to this point. Graphics processing unitsthat power the computing behind AI have fallen in cost by 99% since 2006. There was similar concern about the energy use of data centers in the early 2010s, with wild projections of growth in electricity demand. But gains in computing power and energy efficiency not only proved these projections wrong but enabled a 550% increase in global computing capability from 2010 to 2018 with only minimal increases in energy use. In the late 2010s, however, the trends that had saved us began to break. As the accuracy of AI models dramatically improved, the electricity needed for data centers also started increasing faster; they now account for 4.4% of total demand, up from 1.9% in 2018. Data centers consume more than 10% of the electricity supply in six US states. In Virginia, which has emerged as a hub of data center activity, that figure is 25%. Projections about the future demand for energy to power AI are uncertain and range widely, but in one study, Lawrence Berkeley National Laboratory estimated that data centers could represent 6% to 12% of total US electricity use by 2028. Communities and companies will notice this type of rapid growth in electricity demand. It will put pressure on energy prices and on ecosystems. The projections have resulted in calls to build lots of new fossil-fired power plants or bring older ones out of retirement. In many parts of the US, the demand will likely result in a surge of natural-gas-powered plants. It’s a daunting situation. Yet when we zoom out, the projected electricity use from AI is still pretty small. The US generated about 4,300 billion kilowatt-hours last year. We’ll likely need another 1,000 billion to 1,200 billion or more in the next decade—a 24% to 29% increase. Almost half the additional electricity demand will be from electrified vehicles. Another 30% is expected to be from electrified technologies in buildings and industry. Innovation in vehicle and building electrification also advanced in the last decade, and this shift will be good news for the climate, for communities, and for energy costs. The remaining 22% of new electricity demand is estimated to come from AI and data centers. While it represents a smaller piece of the pie, it’s the most urgent one. Because of their rapid growth and geographic concentration, data centers are the electrification challenge we face right now—the small stuff we have to figure out before we’re able to do the big stuff like vehicles and buildings. 
We also need to understand what the energy consumption and carbon emissions associated with AI are buying us. While the impacts from producing semiconductors and powering AI data centers are important, they are likely small compared with the positive or negative effects AI may have on applications such as the electricity grid, the transportation system, buildings and factories, or consumer behavior. Companies could use AI to develop new materials or batteries that would better integrate renewable energy into the grid. But they could also use AI to make it easier to find more fossil fuels. The claims about potential benefits for the climate are exciting, but they need to be continuously verified and will need support to be realized. This isn’t the first time we’ve faced challenges coping with growth in electricity demand. In the 1960s, US electricity demand was growing at more than 7% per year. In the 1970s that growth was nearly 5%, and in the 1980s and 1990s it was more than 2% per year. Then, starting in 2005, we basically had a decade and a half of flat electricity growth. Most projections for the next decade put our expected growth in electricity demand at around 2% again—but this time we’ll have to do things differently. To manage these new energy demands, we need a “Grid New Deal” that leverages public and private capital to rebuild the electricity system for AI with enough capacity and intelligence for decarbonization. New clean energy supplies, investment in transmission and distribution, and strategies for virtual demand management can cut emissions, lower prices, and increase resilience. Data centers bringing clean electricity and distribution system upgrades could be given a fast lane to connect to the grid. Infrastructure banks could fund new transmission lines or pay to upgrade existing ones. Direct investment or tax incentives could encourage clean computing standards, workforce development in the clean energy sector, and open data transparency from data center operators about their energy use so that communities can understand and measure the impacts. In 2022, the White House released a Blueprint for an AI Bill of Rights that provided principles to protect the public’s rights, opportunities, and access to critical resources from being restricted by AI systems. To the AI Bill of Rights, we humbly offer a climate amendment, because ethical AI must be climate-safe AI. It’s a starting point to ensure that the growth of AI works for everyone—that it doesn’t raise people’s energy bills, adds more clean power to the grid than it uses, increases investment in the power system’s infrastructure, and benefits communities while driving innovation. By grounding the conversation about AI and energy in context about what is needed to tackle climate change, we can deliver better outcomes for communities, ecosystems, and the economy. The growth of electricity demand for AI and data centers is a test case for how society will respond to the demands and challenges of broader electrification. If we get this wrong, the likelihood of meeting our climate targets will be extremely low. This is what we mean when we say the energy and climate impacts from data centers are small, but they are also huge. Costa Samaras is the Trustee Professor of Civil and Environmental Engineering and director of the Scott Institute for Energy Innovation at Carnegie Mellon University. 
Emma Strubell is the Raj Reddy Assistant Professor in the Language Technologies Institute in the School of Computer Science at Carnegie Mellon University. Ramayya Krishnan is dean of the Heinz College of Information Systems and Public Policy and the William W. and Ruth F. Cooper Professor of Management Science and Information Systems at Carnegie Mellon University.
The data center boom in the desert
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well.

The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny. Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.

The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects. But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.”
Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade. That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe.
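For scale, a small sketch of the arithmetic; the Empire State Building’s roughly 2.77 million square feet of floor space is an outside figure, used here as an assumption:

```python
# Scale checks for the construction and capacity figures above.
construction_sqft = 13_000_000   # square feet under construction, per Thompson
esb_sqft = 2_770_000             # approx. Empire State Building floor area (assumption)
print(f"~{construction_sqft / esb_sqft:.1f} Empire State Buildings of floor space")

requested_gw = 6                 # capacity requested in NV Energy's filings
print(f"{requested_gw} GW running around the clock is ~{requested_gw * 8760 / 1000:.0f} TWh a year")
```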
It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.

Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.

The build-out of a dense cluster of energy- and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.

“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah. “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”

Luring data centers

In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.
He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner.

In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt.

On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.

Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.

After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.
Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend millions of dollars to expand the data center campus along with another one in Las Vegas.

Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park.

When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part.
“We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there are the generous tax policies.

In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park.

Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities.

Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development.

The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.

The rain shadow

The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes. But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake. Along the way, an engineered system of reservoirs, canals, and treatment plants diverts, stores, and releases water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.

Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought. About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.

It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades. “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education.

That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California. These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.

Thirsty data centers

Data centers suck up water in two main ways.
As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside.

What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study.

Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid.
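Those totals imply a rough per-unit water intensity. Here is a minimal sketch in Python, assuming the requested six gigawatts ran around the clock (our simplification; real load factors would be somewhat lower):

```python
# Water intensity implied by Ren's estimates for the Nevada projects above.
mwh_per_year = 6 * 1000 * 8760          # 6 GW around the clock ≈ 52.6 million MWh/year

direct_low, direct_high = 860e6, 5.7e9  # direct consumption range, gallons/year
indirect = 15.5e9                       # indirect (power plant) drain, gallons/year

print(f"direct: {direct_low / mwh_per_year:.0f}-{direct_high / mwh_per_year:.0f} gal/MWh")
print(f"indirect: ~{indirect / mwh_per_year:.0f} gal/MWh")
# Note: the EIA's 2,803 gal/MWh figure for gas plants, cited below, counts
# withdrawals (water taken in and mostly returned), not consumption.
```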
The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration. Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research. But here too, the water usage varies depending on the type of geothermal plant in question.

Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do.

Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power, and installing another gigawatt of battery storage. It’s also forging ahead with a transmission project costing more than a billion dollars. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.

NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.

“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement.

An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds. Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”

Securing supplies

On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction. We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.

Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants.

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.

Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits for water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert.
Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system. But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company.

As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline. Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise.

Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis. When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”
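The conversion behind that total is the standard factor of about 325,851 gallons per acre-foot; a quick check of the district’s arithmetic:

```python
# Checking the general improvement district's water arithmetic above.
GALLONS_PER_ACRE_FOOT = 325_851   # standard conversion factor

sources_af = {
    "groundwater permits": 5300,
    "on-site treatment": 2000,
    "Truckee River": 1000,
    "effluent pipeline": 4000,
}
total_af = sum(sources_af.values())              # 12,300 acre-feet per year
total_gal = total_af * GALLONS_PER_ACRE_FOOT
print(f"{total_af:,} acre-feet ≈ {total_gal / 1e9:.2f} billion gallons per year")
# prints ≈ 4.01 billion gallons, matching "more than 4 billion gallons"
```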
Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.

The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services.

“It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.”

Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake.

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre-feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated. More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre-feet of water to the river.

Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”

During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”

Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center.

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.
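Those two figures make the withdrawal-versus-consumption distinction concrete; a one-step sketch of the arithmetic:

```python
# Withdrawal vs. consumption at Google's air-cooled Storey County facility,
# using the 2023 figures cited above.
withdrawn_gal = 1_900_000
consumed_gal = 200_000
print(f"consumed (lost to the system): {consumed_gal / withdrawn_gal:.0%}")
print(f"returned to the water system: {(withdrawn_gal - consumed_gal) / withdrawn_gal:.0%}")
```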
The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.

Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region. “While the land purchase is public knowledge, we have not disclosed specific details about our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center.

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling.

Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling. But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.

Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.
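Ren’s point can be made concrete with a hedged sketch. The grid intensity of roughly 295 gallons per megawatt-hour comes from the indirect estimate earlier in this story; the 16 to 108 gal/MWh range is the on-site intensity implied by Ren’s direct figures; and the roughly 10% energy penalty for air cooling is Google’s figure, cited just below. None of these are measurements for any specific facility:

```python
# Does switching to air cooling save water overall? A rough sketch under
# the stated assumptions (all figures approximate, none facility-specific).
grid_gal_per_mwh = 295       # indirect water intensity of the regional grid
air_cooling_penalty = 0.10   # extra electricity drawn by air cooling

extra_indirect = air_cooling_penalty * grid_gal_per_mwh   # ~30 gal/MWh upstream

for onsite_avoided in (16, 108):   # evaporative use avoided on site, gal/MWh
    net = onsite_avoided - extra_indirect
    outcome = "saves" if net > 0 else "adds"
    print(f"avoiding {onsite_avoided} gal/MWh on site: air cooling {outcome} "
          f"{abs(net):.0f} gal/MWh overall")
```

At the low end of the range, the extra upstream water exceeds the on-site savings, which is exactly the handoff Ren describes.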
Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers.

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.

“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center. Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park.

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters. The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate.
That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely. “We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert. Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”
#data #center #boom #desertThe data center boom in the desertIn the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny. Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.EMILY NAJERA The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects. But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.” Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade. That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe. It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story. Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.EMILY NAJERA The build-out of a dense cluster of energy and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. 
Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils. “Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah. “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.” Luring data centers In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills. He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner. In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt. On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center. Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.GREGG SEGAL After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space. Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend million to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park. When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. 
NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part. “We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there’s the generous tax policies.In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities. Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains. The rain shadow The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes. But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average. The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake. Along the way, an engineered system of reservoirs, canals, and treatment plants divert, store, and release water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained. The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.EMILY NAJERA Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought. 
About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground. It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades. “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California. These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.Thirsty data centers Data centers suck up water in two main ways. As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside.What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity.The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. 
The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration. Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research. But here too, the water usage varies depending on the type of geothermal plant in question. Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do. Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It's also forging ahead with a more than billion transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume. NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.EMILY NAJERA “NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement. An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds. Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.” Securing supplies On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction. We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip. 
Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants. EMILY NAJERA

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.

Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system. But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company.

As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline. Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise.

Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis. When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects.
We wouldn’t approve a project if it didn’t have those water resources.”

Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.

The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services. “It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.”

Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake. EMILY NAJERA

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre-feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated. More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre-feet of water to the river.

Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline from [the Truckee Meadows Water Authority] to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there.
He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.” During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”

Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center. GOOGLE

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.

The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.

Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region. “While the land purchase is public knowledge, we have not disclosed specific details [of] our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center. EMILY NAJERA

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling.
Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling. But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year. Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says. Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term. “If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.
Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.” During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters. The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely. “We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert. Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”
We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
Everything you need to know about estimating AI’s energy and emissions burden
When we set out to write a story on the best available estimates for AI’s energy and emissions burden, we knew there would be caveats and uncertainties to these numbers. But, we quickly discovered, the caveats are the story too.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Measuring the energy used by an AI model is not like evaluating a car’s fuel economy or an appliance’s energy rating. There’s no agreed-upon method or public database of values. There are no regulators who enforce standards, and consumers don’t get the chance to evaluate one model against another. Despite the fact that billions of dollars are being poured into reshaping energy infrastructure around the needs of AI, no one has settled on a way to quantify AI’s energy usage. Worse, companies are generally unwilling to disclose their own piece of the puzzle. There are also limitations to estimating the emissions associated with that energy demand, because the grid hosts a complicated, ever-changing mix of energy sources. It’s a big mess, basically. So, that said, here are the many variables, assumptions, and caveats that we used to calculate the consequences of an AI query.

Measuring the energy a model uses

Companies like OpenAI, dealing in “closed-source” models, generally offer access to their systems through an interface where you input a question and receive an answer. What happens in between—which data center in the world processes your request, the energy it takes to do so, and the carbon intensity of the energy sources used—remains a secret, knowable only to the companies. There are few incentives for them to release this information, and so far, most have not. That’s why, for our analysis, we looked at open-source models. They serve as a very imperfect proxy, but they’re the best one we have.
The best resources for measuring the energy consumption of open-source AI models are AI Energy Score, ML.Energy, and MLPerf Power. The team behind ML.Energy assisted us with our text and image model calculations, and the team behind AI Energy Score helped with our video model calculations.

Text models

AI models use up energy in two phases: when they initially learn from vast amounts of data, called training, and when they respond to queries, called inference. When ChatGPT was launched a few years ago, training was the focus, as tech companies raced to keep up and build ever-bigger models. But now, inference is where the most energy is used.

The most accurate way to understand how much energy an AI model uses in the inference stage is to directly measure the amount of electricity used by the server handling the request. Servers contain all sorts of components—powerful chips called GPUs that do the bulk of the computing, other chips called CPUs, fans to keep everything cool, and more. Researchers typically measure the amount of power the GPU draws and estimate the rest. To do this, we turned to PhD candidate Jae-Won Chung and associate professor Mosharaf Chowdhury at the University of Michigan, who lead the ML.Energy project. Once we collected figures for different models’ GPU energy use from their team, we had to estimate how much energy is used for other processes, like cooling. We examined research literature, including a 2024 paper from Microsoft, to understand how much of a server’s total energy demand GPUs are responsible for. It turns out to be about half. So we took the team’s GPU energy estimate and doubled it to get a sense of total energy demands.

The ML.Energy team uses a batch of 500 prompts from a larger dataset to test models. The hardware is kept the same throughout; the GPU is a popular Nvidia chip called the H100. We decided to focus on models of three sizes from the Meta Llama family: small, medium, and large. We also identified a selection of prompts to test. We compared these with the averages for the entire batch of 500 prompts.

Image models

Stable Diffusion 3 from Stability AI is one of the most commonly used open-source image-generating models, so we made it our focus. Though we tested multiple sizes of the text-based Meta Llama model, we focused on one of the most popular sizes of Stable Diffusion 3, with 2 billion parameters. The team uses a dataset of example prompts to test a model’s energy requirements. Though the energy used by large language models is determined partially by the prompt, this isn’t true for diffusion models. Diffusion models can be programmed to go through a prescribed number of “denoising steps” when they generate an image or video, with each step being an iteration of the algorithm that adds more detail to the image. For a given step count and model, all images generated have the same energy footprint. The more steps, the higher quality the end result—but the more energy used. Numbers of steps vary by model and application, but 25 is pretty common, and that’s what we used for our standard quality. For higher quality, we used 50 steps.
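As a minimal sketch of that bookkeeping, assuming hypothetical meter readings rather than ML.Energy’s published figures:

    # Scale a measured GPU figure up to whole-server energy using the ~50%
    # GPU share discussed above. For diffusion models, energy grows with the
    # prescribed number of denoising steps. All values are placeholders.
    def server_energy_wh(gpu_energy_wh: float, gpu_share: float = 0.5) -> float:
        """Whole-server energy from measured GPU energy (a doubling at 50%)."""
        return gpu_energy_wh / gpu_share

    text_prompt_gpu_wh = 0.5                     # hypothetical H100 reading
    print(server_energy_wh(text_prompt_gpu_wh))  # -> 1.0 Wh for the server

    image_wh_at_25_steps = 1.2                   # hypothetical, standard quality
    image_wh_at_50_steps = image_wh_at_25_steps * (50 / 25)  # assumes linear scaling
    print(image_wh_at_50_steps)                  # -> 2.4 Wh at higher quality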
We mentioned that GPUs are usually responsible for about half of the energy demands of large language model requests. There is not sufficient research to know how this changes for diffusion models that generate images and videos. In the absence of a better estimate, and after consulting with researchers, we opted to stick with this 50% rule of thumb for images and videos too.

Video models

Chung and Chowdhury do test video models, but only ones that generate short, low-quality GIFs. We don’t think the videos these models produce mirror the fidelity of the AI-generated video that many people are used to seeing. Instead, we turned to Sasha Luccioni, the AI and climate lead at Hugging Face, who directs the AI Energy Score project. She measures the energy used by the GPU during AI requests. We chose two versions of the CogVideoX model to test: an older, lower-quality version and a newer, higher-quality one. We asked Luccioni to use her tool, called Code Carbon, to test both and measure the results of a batch of video prompts we selected, using the same hardware as our text and image tests to keep as many variables as possible the same. She reported the GPU energy demands, which we again doubled to estimate total energy demands.

Tracing where that energy comes from

After we understand how much energy it takes to respond to a query, we can translate that into the total emissions impact. Doing so requires looking at the power grid from which data centers draw their electricity. Nailing down the climate impact of the grid can be complicated, because it’s both interconnected and incredibly local. Imagine the grid as a system of connected canals and pools of water. Power plants add water to the canals, and electricity users, or loads, siphon it out. In the US, grid interconnections stretch all the way across the country. So, in a way, we’re all connected, but we can also break the grid up into its component pieces to get a sense for how energy sources vary across the country.

Understanding carbon intensity

The key metric to understand here is called carbon intensity, which is basically a measure of how many grams of carbon dioxide pollution are released for every kilowatt-hour of electricity that’s produced. To get carbon intensity figures, we reached out to Electricity Maps, a Danish startup that gathers data on grids around the world. The team collects information from sources including governments and utilities and uses it to publish historical and real-time estimates of the carbon intensity of the grid. You can find more about their methodology here.
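As a toy illustration of the definition: a grid’s carbon intensity is total emissions divided by total generation for whatever mix happens to be running. The plant figures below are invented for the example.

    # Carbon intensity = grams of CO2e emitted per kilowatt-hour generated,
    # across the whole mix. All numbers here are made up for illustration.
    mix = [
        ("natural gas", 6e6, 450),  # (source, kWh generated, g CO2e per kWh)
        ("solar",       3e6,  40),
        ("geothermal",  1e6,  80),
    ]
    total_emissions_g = sum(kwh * g for _, kwh, g in mix)
    total_generation_kwh = sum(kwh for _, kwh, _ in mix)
    print(total_emissions_g / total_generation_kwh)  # -> 290.0 g CO2e per kWh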
The company shared with us historical data from 2024, both for the entire US and for a few key balancing authorities. After discussions with Electricity Maps founder Olivier Corradi and other experts, we made a few decisions about which figures we would use in our calculations. One way to measure carbon intensity is to simply look at all the power plants that are operating on the grid, add up the pollution they’re producing at the moment, and divide that total by the electricity they’re producing. But that doesn’t account for the emissions that are associated with building and tearing down power plants, which can be significant. So we chose to use carbon intensity figures that account for the whole life cycle of a power plant.
We also chose to use the consumption-based carbon intensity of energy rather than production-based. This figure accounts for imports and exports moving between different parts of the grid and best represents the electricity that’s being used, in real time, within a given region. For most of the calculations you see in the story, we used the average carbon intensity for the US for 2024, according to Electricity Maps, which is 402.49 grams of carbon dioxide equivalent per kilowatt-hour.

Understanding balancing authorities

While understanding the picture across the entire US can be helpful, the grid can look incredibly different in different locations. One way we can break things up is by looking at balancing authorities. These are independent bodies responsible for grid balancing in a specific region. They operate mostly independently, though there’s a constant movement of electricity between them as well. There are 66 balancing authorities in the US, and we can calculate a carbon intensity for the part of the grid encompassed by a specific balancing authority.

Electricity Maps provided carbon intensity figures for a few key balancing authorities, and we focused on several that play the largest roles in data center operations. ERCOT (which covers most of Texas) and PJM (a cluster of states on the East Coast, including Virginia, Pennsylvania, and New Jersey) are two of the regions with the largest burden of data centers, according to research from the Harvard School of Public Health. We added CAISO (in California) because it covers the most populated state in the US. CAISO also manages a grid with a significant number of renewable energy sources, making it a good example of how carbon intensity can change drastically depending on the time of day. (In the middle of the day, solar tends to dominate, while natural gas plays a larger role overnight, for example.)

One key caveat here is that we’re not entirely sure where companies tend to send individual AI inference requests. There are clusters of data centers in the regions we chose as examples, but when you use a tech giant’s AI model, your request could be handled by any number of data centers owned or contracted by the company. One reasonable approximation is location: It’s likely that the data center servicing a request is close to where it’s being made, so a request on the West Coast might be most likely to be routed to a data center on that side of the country.

Explaining what we found

To better contextualize our calculations, we introduced a few comparisons people might be more familiar with than kilowatt-hours and grams of carbon dioxide. In a few places, we took the amount of electricity estimated to be used by a model and calculated how long that electricity would be able to power a standard microwave, as well as how far it might take someone on an e-bike. For the e-bike, we assumed an efficiency of 25 watt-hours per mile, which falls in the range of frequently cited efficiencies for a pedal-assisted bike. For the microwave, we assumed an 800-watt model, which falls within the average range in the US. We also introduced a comparison to contextualize greenhouse gas emissions: miles driven in a gas-powered car. For this, we used data from the US Environmental Protection Agency, which puts the weighted average emissions of vehicles in the US in 2022 at 393 grams of carbon dioxide equivalent per mile.
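To show how these unit conversions chain together, here is a minimal sketch using the conversion factors quoted above. The 3 Wh query energy in the example is a hypothetical placeholder, not one of our measured figures.

```python
# Convert an estimated query energy into emissions and everyday equivalents,
# using the conversion factors cited in the text. The 3 Wh input is a
# hypothetical placeholder, not a measured figure.

US_CARBON_INTENSITY_G_PER_KWH = 402.49  # Electricity Maps, US average, 2024
MICROWAVE_WATTS = 800                   # typical US microwave
EBIKE_WH_PER_MILE = 25                  # pedal-assisted e-bike efficiency
CAR_G_CO2E_PER_MILE = 393               # EPA weighted US average, 2022

def contextualize(query_energy_wh: float) -> dict:
    grams_co2e = query_energy_wh / 1000 * US_CARBON_INTENSITY_G_PER_KWH
    return {
        "emissions_g_co2e": round(grams_co2e, 2),
        "microwave_seconds": round(query_energy_wh / MICROWAVE_WATTS * 3600, 1),
        "ebike_miles": round(query_energy_wh / EBIKE_WH_PER_MILE, 3),
        "car_miles": round(grams_co2e / CAR_G_CO2E_PER_MILE, 4),
    }

print(contextualize(3.0))  # a hypothetical 3 Wh response: ~1.2 gCO2e, ~13.5 s of microwave time
```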
Predicting how much energy AI will use in the future

After measuring the energy demand of an individual query and the emissions it generated, it was time to estimate how all of this added up to national demand. There are two ways to do this. In a bottom-up analysis, you estimate how many individual queries there are, calculate the energy demands of each, and add them up to determine the total. For a top-down look, you estimate how much energy all data centers are using by looking at larger trends. (A toy version of the bottom-up arithmetic is sketched at the end of this article.)

Bottom-up is particularly difficult because, once again, closed-source companies do not share such information and declined to talk specifics with us. While we can make some educated guesses to give us a picture of what might be happening right now, looking into the future is perhaps better served by taking a top-down approach.

This data is scarce as well. The most important report was published in December by the Lawrence Berkeley National Laboratory, which is funded by the Department of Energy, and the report’s authors noted that it’s only the third such report released in the last 20 years. Academic climate and energy researchers we spoke with said it’s a major problem that AI is not considered its own economic sector for emissions measurements and that there aren’t rigorous reporting requirements. As a result, it’s difficult to track AI’s climate toll. Still, we examined the report’s results, compared them with other findings and estimates, and consulted independent experts about the data. While much of the report was about data centers more broadly, we drew out data points that were specific to the future of AI.

Company goals

We wanted to contrast these figures with the amounts of energy that AI companies themselves say they need. To do so, we collected reports by leading tech and AI companies about their plans for energy and data center expansions, as well as the dollar amounts they promised to invest. Where possible, we fact-checked the promises made in these claims. (Meta’s and Microsoft’s pledges to use more nuclear power, for example, would indeed reduce the companies’ carbon emissions, but it will take years, if not decades, for these additional nuclear plants to come online.)

Requests to companies

We submitted requests to Microsoft, Google, and OpenAI to have data-driven conversations about their models’ energy demands for AI inference. None of the companies made executives or leadership available for on-the-record interviews about their energy usage.

This story was supported by a grant from the Tarbell Center for AI Journalism.
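Here is the toy bottom-up aggregation referred to above: assume a daily query volume and a per-query energy for each modality, multiply, and sum. Every input below is a made-up placeholder; the point is the shape of the calculation, not the totals.

```python
# Toy bottom-up estimate of national AI inference energy demand.
# Every query volume and per-query energy below is a made-up placeholder.

DAILY_QUERIES = {"text": 1_000_000_000, "image": 50_000_000, "video": 1_000_000}
WH_PER_QUERY = {"text": 0.5, "image": 3.0, "video": 100.0}  # whole-server Wh

daily_kwh = sum(DAILY_QUERIES[m] * WH_PER_QUERY[m] for m in DAILY_QUERIES) / 1000
annual_gwh = daily_kwh * 365 / 1_000_000
print(f"{daily_kwh:,.0f} kWh per day, ~{annual_gwh:,.0f} GWh per year")
```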
The Download: CRISPR in court, and the police’s ban-skirting AI
This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.
A US court just put ownership of CRISPR back in play
The CRISPR patents are back in play.
Yesterday, the US Court of Appeals for the Federal Circuit said scientists Jennifer Doudna and Emmanuelle Charpentier will get another chance to show they ought to own the key patents on what many consider the defining biotechnology invention of the 21st century.
The pair shared a 2020 Nobel Prize for developing the gene-editing system, which is already being used to treat various disorders.
But when US patent rights were granted in 2014 to Feng Zhang of the Broad Institute of MIT and Harvard, the decision set off a bitter dispute in which hundreds of millions of dollars—as well as scientific bragging rights—are at stake.
Read the full story.
—Antonio Regalado
To read more about CRISPR, why not take a look at:
+ Charpentier and Doudna announced they wanted to cancel their own CRISPR patents in Europe last year. Read the full story.
+ How CRISPR will help the world cope with climate change. Read the full story.
+ The US has approved CRISPR pigs for food. Pigs whose DNA makes them resistant to a virus could be the first big consumer product using gene editing. Read the full story.
+ CRISPR will get easier and easier to administer. What does that mean for the future of our species?

Police tech can sidestep facial recognition bans now
—James O'Donnell

Six months ago I attended the largest gathering of chiefs of police in the US to see how they’re using AI.
I found some big developments, like officers getting AI to write their reports.
Now, I’ve published a new story that shows just how far AI for police has developed since then.
It’s about a new method police are using to track people: an AI tool that uses attributes like body size, gender, hair color and style, clothing, and accessories instead of faces.
It offers a way around laws curbing the use of facial recognition, which are on the rise. Here’s what this tells us about the development of police tech and what rules, if any, these departments are subject to in the age of AI.
Read the full story.
This story originally appeared in The Algorithm, our weekly newsletter on AI.
To get stories like this in your inbox first, sign up here.
The must-reads
I’ve combed the internet to find you today’s most fun/important/scary/fascinating stories about technology.
1 Two Trump officials were denied access to the US Copyright Office Their visit came days after the administration fired the office’s head. (Wired $)
+ Shira Perlmutter oversaw a report raising concerns about training AI with copyrighted materials. (WP $)
2 Google knew it couldn’t monitor how Israel might use its cloud technology But it went ahead with Project Nimbus anyway. (The Intercept)
3 Spain still doesn’t know what caused its massive power blackout Investigators are examining generators’ cyber defences for weaknesses. (FT $)
+ Could solar power be to blame? (MIT Technology Review)
4 Apple is considering hiking the price of iPhones The company doesn’t want to blame tariffs, though. (WSJ $)
+ Apple boss Tim Cook had a call with Trump following the tariff rollback news. (CNBC)
+ It’s reportedly developing an AI tool to extend phones’ battery life. (Bloomberg $)
5 Venture capitalists aren’t 100% sure what an AI agent is That isn’t stopping companies from sinking millions into them. (TechCrunch)
+ Google is working on its own agent ahead of its I/O conference. (The Information $)
+ What AI assistants can—and can’t—do. (Vox)
+ Check out our AI agent explainer. (MIT Technology Review)
6 Scammers are stealing the identities of death row inmates And prisoners are unlikely to see correspondence alerting them to the fraud. (NBC News)
7 Weight-loss drugs aren’t always enough You need long-term changes in health, not just weight. (The Atlantic $)
+ How is Trump planning to lower drug costs, exactly? (NY Mag $)
+ Drugs like Ozempic now make up 5% of prescriptions in the US. (MIT Technology Review)
8 China’s e-commerce giants are racing to deliver goods within an hour As competition has intensified, companies are fighting to be the quickest. (Reuters)
9 This spacecraft will police satellites’ orbits And hunt them down where necessary. (IEEE Spectrum)
+ The world’s biggest space-based radar will measure Earth’s forests from orbit. (MIT Technology Review)
10 Is your beard trimmer broken? Simply 3D-print a new part. Philips is experimenting with letting its customers create their own replacements. (The Verge)

Quote of the day
“We usually set it up so that our team doesn’t get to creep in.”
—Angie Saltman, founder and president of tech company Saltmedia, explains how her company helps store Indigenous data securely away from the Trump administration, the Verge reports.
One more thing
Meet the radio-obsessed civilian shaping Ukraine’s drone defense
Drones have come to define the brutal conflict in Ukraine that has now dragged on for more than three years.
And most rely on radio communications—a technology that Serhii “Flash” Beskrestnov has obsessed over since childhood.
While Flash is now a civilian, the former officer has still taken it upon himself to inform his country’s defense in all matters related to radio.
Once a month, he studies the skies for Russian radio transmissions and tries to learn about the problems facing troops in the fields and in the trenches. In this race for survival—as each side constantly tries to best the other, only to start all over again when the other inevitably catches up—Ukrainian soldiers need to develop creative solutions, and fast.
As Ukraine’s wartime radio guru, Flash may just be one of their best hopes for doing that.
Read the full story.
—Charlie Metcalfe

We can still have nice things
A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)
+ Tune in at any time to the Coral City Camera, an underwater camera streaming live from an urban coral reef in Miami.
+ Inhuman Resources, which mixes gaming, reading, and listening, sounds nuts.
+ This compilation of 331 film clips to recreate Eminem’s Lose Yourself is spectacular.
+ Questions I never thought I’d ask: what if Bigfoot were British?
Source: https://www.technologyreview.com/2025/05/13/1116357/the-download-crispr-in-court-and-the-polices-ban-skirting-ai/
The business of the future is adaptive

Manufacturing is in a state of flux. From supply chain disruptions to rising costs, tougher environmental regulations, and a changing consumer market, the sector faces a series of competing challenges. But a new way of operating offers a way to tackle these complexities head-on: adaptive production hardwires flexibility and resilience into the enterprise, drawing on powerful tools like artificial intelligence, digital twins, and robotics. Taking automation a step further, adaptive production allows manufacturers to respond in real time to demand fluctuations, adapt to supply chain disruptions, and autonomously optimize operations. It also facilitates an unprecedented level of personalization and customization for regional markets.

Time to adapt

The journey to adaptive production is not just about addressing today’s pressures, like rising costs and supply chain disruptions—it’s about positioning businesses for long-term success in a world of constant change. “In the coming years,” says Jana Kirchheim, director of manufacturing for Microsoft Germany, “I expect that new key technologies like copilots, small language models, high-performance computing, or the adaptive cloud approach will revolutionize the shop floor and accelerate industrial automation by enabling faster adjustments and re-programming for specific tasks.” These capabilities make adaptive production a transformative force, enhancing responsiveness and opening doors to systems with increasing autonomy—designed to complement human ingenuity rather than replace it.

These advances enable more than technical upgrades—they drive fundamental shifts in how manufacturers operate. John Hart, professor of mechanical engineering and director of MIT’s Center for Advanced Production Technologies, explains that automation is “going from a rigid high-volume, low-mix focus”—where factories make large quantities of very few products—“to more flexible high-volume, high-mix, and low-volume, high-mix scenarios”—where many product types can be made in custom quantities. These new capabilities demand a fundamental shift in how value is created and captured.

Download the full report.

This content was produced by Insights, the custom content arm of MIT Technology Review. It was not written by MIT Technology Review’s editorial staff. This content was researched, designed, and written entirely by human writers, editors, analysts, and illustrators. This includes the writing of surveys and collection of data for surveys. AI tools that may have been used were limited to secondary production processes that passed thorough human review.
-
WWW.TECHNOLOGYREVIEW.COM
A long-abandoned US nuclear technology is making a comeback in China
China has once again beaten everyone else to a clean energy milestone—its new nuclear reactor is reportedly one of the first to use thorium instead of uranium as a fuel, and the first of its kind that can be refueled while it’s running. It’s an interesting (if decidedly experimental) development out of a country that’s edging toward becoming the world leader in nuclear energy. China has now surpassed France in terms of generation, though not capacity; it still lags behind the US in both categories. But one recurring theme in media coverage about the reactor struck me, because it’s so familiar: this technology was invented decades ago, and then abandoned. You can basically copy and paste that line into countless stories about today’s advanced reactor technology. Molten-salt cooling systems? Invented in the mid-20th century but never commercialized. Same for several alternative fuels, like TRISO. And, of course, there’s thorium. This one research reactor in China running on an alternative fuel says a lot about this moment for nuclear energy technology: many groups are looking to the past for technologies, with a new appetite for building them.
First, it’s important to note that China is the hot spot for nuclear energy right now. While the US still has the most operational reactors in the world, China is catching up quickly. The country is building reactors at a remarkable clip and currently has more reactors under construction than any other country by far. Just this week, China approved 10 new reactors, totaling over $27 billion in investment. China is also leading the way on some advanced reactor technologies. (That category includes basically anything that deviates from the standard blueprint of what’s on the grid today: large reactors that use enriched uranium for fuel and high-pressure water to keep the reactor cool.) High-temperature reactors that use gas as a coolant are one major area of focus for China—a few reactors that use this technology have recently started up, and more are in the planning stages or under construction.
Now, Chinese state media is reporting that scientists in the country have reached a milestone with a thorium-based reactor. The reactor came online in June 2024, but researchers say it recently went through refueling without shutting down. (Conventional reactors generally need to be stopped to replenish the fuel supply.) The project’s lead scientists shared the results during a closed meeting at the Chinese Academy of Sciences. I’ll emphasize here that this isn’t some massive power plant: this reactor is tiny. It generates just two megawatts of heat—less than the research reactor on MIT’s campus, which rings in at six megawatts. (To be fair, MIT’s is one of the largest university research reactors in the US, but still … it’s small.) Regardless, progress is progress for thorium reactors, as the world has been almost entirely focused on uranium for the last 50 years or so.
Much of the original research on thorium came out of the US, which pumped resources into all sorts of different reactor technologies in the 1950s and ’60s. A reactor at Oak Ridge National Laboratory in Tennessee that ran in the 1960s used uranium-233 fuel, which can be bred by bombarding thorium with neutrons. Eventually, though, the world more or less settled on a blueprint for nuclear reactors: ones fueled with uranium enriched in the fissile isotope uranium-235 and cooled by water at high pressure.
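For the curious, that breeding step has a compact textbook form: thorium-232 captures a neutron and then beta-decays twice into fissile uranium-233. This is standard nuclear-physics notation, not a detail from the Chinese project itself:

$$
{}^{232}_{90}\mathrm{Th} + n \;\longrightarrow\; {}^{233}_{90}\mathrm{Th} \;\xrightarrow{\ \beta^-,\ \approx 22\ \mathrm{min}\ }\; {}^{233}_{91}\mathrm{Pa} \;\xrightarrow{\ \beta^-,\ \approx 27\ \mathrm{d}\ }\; {}^{233}_{92}\mathrm{U}
$$

The slow protactinium step is one reason managing the fuel cycle is tricky in practice, which is consistent with the conversion hurdle mentioned below.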
One reason for the historical focus on uranium for energy tech? The research could also be applied to nuclear weapons. But now there’s renewed interest in alternative nuclear technologies, and the thorium-fueled reactor is just one of several examples. A prominent one we’ve covered before: Kairos Power is building small nuclear reactors that use molten salt as a coolant—also a technology invented and developed in the 1950s and ’60s before being abandoned. Another old-but-new concept is using high-temperature gas to cool reactors, as X-energy is aiming to do in its proposed power station at a chemical plant in Texas. (That reactor will be able to be refueled while it’s running, like the new thorium reactor.)
Some of the problems that contributed to these technologies being abandoned decades ago will still need to be dealt with today. In the case of molten-salt reactors, for example, it can be tricky to find materials that can withstand the corrosive properties of super-hot salt. For thorium reactors, the process of transforming thorium into uranium-233 fuel has historically been one of the hurdles. But as this early progress shows, the archives could provide fodder for new commercial reactors, and revisiting these old ideas could give the nuclear industry a much-needed boost.
This article is from The Spark, MIT Technology Review’s weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.
-
WWW.TECHNOLOGYREVIEW.COM
Building better cities
Clara Brenner, MBA ’12, arrived in Cambridge on the lookout for a business partner. She wanted to start her own company—and never have to deal with a boss again. She would go it alone if she had to, but she hoped to find someone whose skills would complement her own. It’s a common MBA tale. Many people attend business school with hopes of finding the one. Building that relationship is so important to a company’s foundation that it’s been described in romantic terms: Networking is akin to dating around, and some view settling down with a business partner as a marriage of sorts.
Brenner didn’t have to look for long. She met her match—Julie Lein, MBA ’12—soon after arriving at Sloan more than a decade ago. But their first encounter wasn’t exactly auspicious. In fact, their relationship began with an expletive.
Lein was sitting at a card table in a hallway in E52, glumly selling tickets to a fashion show featuring work-appropriate clothes for women—at that time, the marquee event for Sloan’s Women in Management Club, and one that both Lein and Brenner thought was patently absurd. Lein had no interest in attending, but she wanted to support the club’s mission of boosting women in business. “She looked very miserable,” says Brenner. Lein asked if she wanted to buy a ticket, Brenner recalls, and “I think I said, ‘F*** no.’” “We both bonded over the fact that this was such a stupid idea,” says Lein. (The fashion show has since been retired, in part thanks to Lein and Brenner’s lobbying.)
Today, the two run the Urban Innovation Fund, a San Francisco–based venture capital firm that has raised $212 million since 2016 and invested in 64 startups addressing the most pressing problems facing cities. It has supported businesses like Electriphi, a provider of EV charging and fleet management software, which was acquired by one of the biggest names in the auto industry. And it funds companies focused on helping kids learn to code, providing virtual tutoring services, offering financing for affordable housing, and more. The companies in its portfolio have a total value of $5.3 billion, and at least eight have been acquired thus far.
Though Brenner and Lein hit it off quickly, they weren’t an obvious fit as business partners. Brenner arrived at Sloan after weathering an early career in commercial real estate just after the 2008 financial crash. She hoped to start her own company in that industry. Lein, on the other hand, had worked in political polling and consulting. She initially planned to get an advanced policy degree, until a mentor suggested an MBA. She hoped to start her own political polling firm after graduation.
Ultimately, though, their instant kinship became more important than their subject matter expertise. Brenner, says Lein, is “methodical” and organized, while she “just goes and executes” without overthinking. Their relationship—in business, and still as close friends—is rooted in trust and a commitment to realizing the vision they’ve created together. “We were able to see that ... our skills and style were very complementary, and we just were able to do things better and faster together,” says Brenner.
In 2012, the two teamed up to run Sloan’s second Women in Management Conference, which they had helped found the year before. It was then, they say, that they knew they would work together after graduation. Still, they had trouble agreeing on the type of venture that made the most sense.
Their initial talks involved a tug-of-war over whose area of expertise would win—real estate or policy. But in the summer of 2011, they’d both happened to land internships at companies focused on challenges in cities—companies that would now be called “urban-tech startups,” says Brenner, though that term was not used at the time. The overlap was fortuitous: When they compared notes, they agreed that it made sense to investigate the potential for companies in that emerging space. Lyft was just getting its start, as was Airbnb. After exploring the idea further, the two concluded there was some “there” there. “We felt like all these companies had a lot in common,” says Brenner. “They were solving very interesting community challenges in cities, but in a very scalable, nontraditional way.” They were also working in highly regulated areas that VC firms were often hesitant to touch, even though these companies were attracting significant attention. To Brenner and Lein, some of that attention was the wrong kind; companies like Uber were making what they saw as obvious missteps that were landing in the news. “No one was helping [these companies] with, like, ‘You should hire a lobbyist’ or ‘You should have a policy team,’” says Brenner. The two saw an opportunity to fund businesses that could make a measurable positive impact on urban life—and to help them navigate regulatory and policy environments as they grew from startups to huge companies.
Upon graduating in 2012, they launched Tumml, an accelerator program for such startups. The name was drawn from the Yiddish word tummler, often used by Brenner’s grandmother to describe someone who inspires others to action. At the time, Brenner says, “world-positive investing” was “not cool at all” among funders because it was perceived as yielding lower returns, even though growing numbers of tech companies were touting their efforts to improve society. In another unusual move, the partners structured their startup accelerator as a nonprofit evergreen fund, allowing them to invest in companies continuously without setting a fixed end date. By the end of their third year, they were supporting 38 startups.
Tumml found success by offering money, mentorship, and guidance, but the pair realized that relying solely on fickle philanthropic funding meant the model had a ceiling. To expand their work, they retired Tumml and launched the Urban Innovation Fund in 2016 with $24.5 million in initial investments. While Tumml had offered relatively small checks and support to companies at the earliest stages, UIF would allow Brenner and Lein to supercharge their funding and involvement.
Their focus has remained on startups tackling urban problems in areas such as public health, education, and transportation. The types of companies they look for are those that drive economic vitality in cities, make urban areas more livable, or make cities more sustainable. As Tumml did, UIF provides not just funding but also consistent support in navigating regulatory challenges. And, like Tumml, UIF has taken on industries or companies that other investors may see as risky. When it was raising its first fund, Lein remembers, they pitched a large institution on its vision, which includes investing in companies that work on climate and energy.
The organization, burned by the money it lost when the first cleantech bubble burst, was extremely wary—it wasn’t interested in a fund that emphasized those areas. But Lein and Brenner pressed on. Today, climate tech remains one of the fund’s largest areas, accounting for more than a sixth of its portfolio of 64 companies (see “Urban innovation in action”). In addition to Electriphi, they have invested in Public Grid, a company that gives households access to affordable clean energy, and Optiwatt, an app that helps EV drivers schedule charging at times of day when it is cheaper or cleaner.
“They took risks in areas, [including] mobility and transportation, where other people might not play because of policy and regulation risk. And they were willing to think about the public-private partnerships and what might be needed,” says Rachel Sheinbein, MBA ’04, SM ’04, a Bay Area–based angel investor who has worked with the Urban Innovation Fund on investments. “They weren’t afraid to take that on.”
Lein and Brenner have also invested in health companies like Cleancard, which is working to provide at-home testing for cancers, and startups creating workflow tools, like KarmaSuite, which has built software to help nonprofits track grants. Meanwhile, they have cast a wide net and built a portfolio rich in companies that happen to be led by entrepreneurs from underrepresented groups: Three-quarters of the companies in UIF’s current portfolio were founded by women or people of color, and nearly 60% include an immigrant on their founding team.
When it comes to selecting companies, Brenner says, they make “very calculated decisions” based in part on regulatory factors that may affect profits. But they’re still looking for the huge returns that drive other investors. “It’s a very, very small subset of companies that can both work on a problem that, at least in our minds, really matters and be an enormous business,” she says. “Those are really the companies that we’re looking for.”
One of the most obvious examples of that winning combination is Electriphi. When Brenner and Lein invested in the company, in 2019, the Biden administration hadn’t mandated the electrification of federal auto fleets, and the Inflation Reduction Act, which included financial incentives for clean energy, hadn’t yet been drafted. And California had yet to announce its intention to completely phase out gas-powered cars. “It was not a hot space,” says Brenner. But after meeting with Electriphi’s team, both Brenner and Lein felt there was something there. The partners tracked the startup for months, saw it achieving its goals, and ended up offering it the largest investment, by several orders of magnitude, that their fund had ever made. Less than two years later, Ford acquired it for an undisclosed sum.
“When we were originally talking about Electriphi, a lot of people were like, ‘Eh, it’s going to take too long for fleets to transition, and we don’t want to make a bet at this time,’” Sheinbein recalls. But she says the partners at Urban Innovation Fund were willing to take on an investment that other people were “still a little bit hesitant” about. Sheinbein also invested in the startup.
Impact investing has now taken root in the building where Lein and Brenner first met. What was once an often overlooked investing area, says Bill Aulet, SM ’94, managing director of the Martin Trust Center for MIT Entrepreneurship, is now a core element of how Sloan teaches entrepreneurship.
Aulet sees Urban Innovation Fund’s social-enterprise investing strategy as very viable in the current market. “Will it outperform cryptocurrency? Not right now,” he says, but he adds that many people want to put their money toward companies with the potential to improve the world. Lein, who worked as Aulet’s teaching assistant at Sloan for a class now known as Entrepreneurship 101, helped establish the mold at Sloan for a social-impact entrepreneur—that is, someone who sees doing good as a critical objective, not just a marketing strategy. “Entrepreneurs don’t just have to found startups,” says Aulet. “You can also be what we call an entrepreneurship amplifier,” which he defines as “someone who helps entrepreneurship thrive.”
When they make investments, VCs tend to prioritize such things as the need for a company’s products and the size of its potential market. Brenner and Lein say they pay the most attention to the team when deciding whether to make a bet: Do they work together well? Are they obsessive about accomplishing their goals? Those who have watched UIF grow say Brenner and Lein’s partnership fits that profile itself. “I can just tell when a team really respects each other and [each] sees the value in the other one’s brain,” says Sheinbein. For Lein and Brenner, she says, their “mutual respect and admiration for each other” is obvious.
“We went to Sloan, we spent a bunch of money, but we found each other,” says Lein. “We couldn’t agree on a new urban-tech startup to start,” she adds, so instead, they built an ecosystem of them—all in the name of improving cities for the people who live there.
-
WWW.TECHNOLOGYREVIEW.COM
Unleashing the potential of qubits, one molecule at a time
It all began with a simple origami model. As an undergrad at Harvard, Danna Freedman went to a professor’s office hours for her general chemistry class and came across an elegant paper model that depicted the fullerene molecule. The intricately folded representation of chemical bonds and atomic arrangements sparked her interest, igniting a profound curiosity about how the structure of molecules influences their function. She stayed and chatted with the professor after the other students left, and he persuaded her to drop his class so she could instead dive immediately into the study of chemistry at a higher level. Soon she was hooked. After graduating with a chemistry degree, Freedman earned a PhD at the University of California, Berkeley, did a postdoc at MIT, and joined the faculty at Northwestern University. In 2021, she returned to MIT as the Frederick George Keyes Professor of Chemistry.
Freedman’s fascination with the relationship between form and function at the molecular level laid the groundwork for a trailblazing career in quantum information science, eventually leading her to be honored with a 2022 MacArthur fellowship—and the accompanying “genius” grant—as one of the leading figures in the field. Today, her eyes light up when she talks about the “beauty” of chemistry, which is how she sees the intricate dance of atoms that dictates a molecule’s behavior. At MIT, Freedman focuses on creating novel molecules with specific properties that could revolutionize the technology of sensing, leading to unprecedented levels of precision.
Designer molecules
Early in her graduate studies, Freedman noticed that many chemistry research papers claimed to contribute to the development of quantum computing, which exploits the behavior of matter at extremely small scales to deliver much more computational power than a conventional computer can achieve. While the ambition was clear, Freedman wasn’t convinced. When she read these papers carefully, she found that her skepticism was warranted. “I realized that nobody was trying to design magnetic molecules for the actual goal of quantum computing!” she says. Such molecules would be suited to acting as quantum bits, or qubits, the basic unit of information in quantum systems. But the research she was reading about had little to do with that.
Nevertheless, that realization got Freedman thinking—could molecules be designed to serve as qubits? She decided to find out. Her work made her among the first to use chemistry in a way that demonstrably advanced the field of quantum information science, which she describes as a general term encompassing the use of quantum technology for computation, sensing, measurement, and communication.
Unlike traditional bits, which can only equal 0 or 1, qubits are capable of “superposition”—simultaneously existing in multiple states. This is why quantum computers made from qubits can solve large problems faster than classical computers. Freedman, however, has always been far more interested in tapping into qubits’ potential to serve as exquisitely precise sensors. Qubits encode information in quantum properties—such as spin and energy—that can be easily disrupted.
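A brief aside on notation: in standard quantum-information textbooks (general background, not something specific to Freedman’s work), that superposition is written as a weighted combination of the two classical bit values,

$$|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1,$$

where a measurement returns 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$. The fragility of those amplitudes is exactly what makes the quantum properties described here so easily disrupted.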
The delicacy of these quantum properties makes qubits hard to control, but it also makes them especially sensitive and therefore very useful as sensors. Harnessing the power of qubits is notoriously tricky, though. For example, two of the most common types—superconducting qubits, which are often made of thin aluminum layers, and trapped-ion qubits, which use the energy levels of an ion’s electrons to represent 1s and 0s—must be kept at temperatures approaching absolute zero (–273 °C). Maintaining special refrigerators to keep them cool can be costly and difficult. And while researchers have made significant progress recently, both types of qubits have historically been difficult to connect into larger systems.
Eager to explore the potential of molecular qubits, Freedman has pioneered a unique “bottom-up” approach to creating them: She designs novel molecules with specific quantum properties to serve as qubits targeted for individual applications. Instead of focusing on a general goal such as maximizing coherence time (how long a qubit can preserve its quantum state), she begins by asking what kinds of properties are needed for, say, a sensor meant to measure biological phenomena at the molecular level. Then she and her team set out to create molecules that have these properties and are suitable for the environment where they’d be used.
To determine the precise structure of a new molecule, Freedman’s team uses software to analyze and process visualizations of data collected by an x-ray diffractometer. One such diagram depicts an organometallic Cr(IV) complex made of a central chromium atom and four hydrocarbon ligands. COURTESY OF DANNA FREEDMAN
Made of a central metallic atom surrounded by hydrocarbon atoms, molecular qubits store information in their spin. The encoded information is later translated into photons, which are emitted to “read out” the information. These qubits can be tuned with laser precision—imagine adjusting a radio dial—by modifying the strength of the ligands, or bonds, connecting the hydrocarbons to the metal atom. These bonds act like tiny tuning forks; by adjusting their strength, the researchers can precisely control the qubit’s spin and the wavelength of the emitted photons. That emitted light can be used to provide information about atomic-level changes in electrical or magnetic fields.
While many researchers are eager to build reliable, scalable quantum computers, Freedman and her group devote most of their attention to developing custom molecules for quantum sensors. These ultrasensitive sensors contain particles in a state so delicately balanced that extremely small changes in their environments unbalance them, causing them to emit light differently. For example, one qubit designed in Freedman’s lab, made of a chromium atom surrounded by four hydrocarbon molecules, can be customized so that tiny changes in the strength of a nearby magnetic field will change its light emissions in a particular way.
A key benefit of using such molecules for sensing is that they are small enough—just a nanometer or so wide—to get extremely close to the thing they are sensing. That can offer an unprecedented level of precision when measuring something like the surface magnetism of two-dimensional materials, since the strength of a magnetic field decays with distance.
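That closing point can be made quantitative. For a dipole-like magnetic source, field strength falls off roughly with the cube of distance—a textbook scaling law, not a figure from Freedman’s group—so closing the gap by a factor of ten boosts the signal by roughly a thousandfold:

$$B(r) \propto \frac{1}{r^{3}} \quad\Longrightarrow\quad \frac{B(r/10)}{B(r)} = 10^{3}.$$

This is why “losing an order of magnitude of distance,” as Freedman puts it next, matters so much.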
A molecular quantum sensor “might not be more inherently accurate than a competing quantum sensor,” says Freedman, “but if you can lose an order of magnitude of distance, that can give us a lot of information.” Quantum sensors’ ability to detect electric or magnetic changes at the atomic level and make extraordinarily precise measurements could be useful in many fields, such as environmental monitoring, medical diagnostics, geolocation, and more.
When designing molecules to serve as quantum sensors, Freedman’s group also factors in the way they can be expected to act in a specific sensing environment. Creating a sensor for water, for example, requires a water-compatible molecule, and a sensor for use at very low temperatures requires molecules that are optimized to perform well in the cold. By custom-engineering molecules for different uses, the Freedman lab aims to make quantum technology more versatile and widely adaptable.
Embracing interdisciplinarity
As Freedman and her group focus on the highly specific work of designing custom molecules, she is keenly aware that tapping into the power of quantum science depends on the collective efforts of scientists from different fields. “Quantum is a broad and heterogeneous field,” she says. She believes that attempts to define it narrowly hurt collective research—and that scientists must welcome collaboration when the research leads them beyond their own field. Even in the seemingly straightforward scenario of using a quantum computer to solve a chemistry problem, you would need a physicist to write a quantum algorithm, engineers and materials scientists to build the computer, and chemists to define the problem and identify how the quantum computer might solve it.
MIT’s collaborative environment has helped Freedman connect with researchers in different disciplines, which she says has been instrumental in advancing her research. She’s recently spoken with neurobiologists who proposed problems that quantum sensing could potentially solve and provided helpful context for building the sensors. Looking ahead, she’s excited about the potential applications of quantum science in many scientific fields. “MIT is such a great place to nucleate a lot of these connections,” she says. “As quantum expands, there are so many of these threads which are inherently interdisciplinary.”
Inside the lab
Freedman’s lab in Building 6 is a beehive of creativity and collaboration. Against a backdrop of colorful flasks and beakers, researchers work together to synthesize molecules, analyze their structures, and unlock the secrets hidden within their intricate atomic arrangements. “We are making new molecules and putting them together atom by atom to discover whether they have the properties we want,” says Christian Oswood, a postdoctoral fellow. Some sensitive molecules can only be made in the lab’s glove box, a nitrogen-filled transparent container that protects chemicals from oxygen and water in the ambient air. An example is an organometallic solution synthesized by one of Freedman’s graduate students, David Ullery, which takes the form of a vial of purple liquid. (“A lot of molecules have really pretty colors,” he says.)
Once synthesized, the molecules are taken to a single-crystal x-ray diffractometer a few floors below the Freedman lab.
There, x-rays are directed at crystallized samples, and from the diffraction pattern, researchers can deduce their molecular structure—how the atoms connect. Studying the precise geometry of these synthesized molecules reveals how the structure affects their quantum properties, Oswood explains.
Researchers and students at the lab say Freedman’s cross-disciplinary outlook played a big role in drawing them to it. With a chemistry background and a special interest in physics, for example, Ullery joined because he was excited by the way Freedman’s research bridges those two fields.
Crystals of an organometallic Cr(IV) complex. Freedman’s lab designed a series of molecules like this one to detect changes in a magnetic field. COURTESY OF DANNA FREEDMAN
Others echo this sentiment. “The opportunity to be in a field that’s both new and expanding like quantum science, and attacking it from this specific angle, was exciting to me both intellectually and professionally,” says Oswood. Another graduate student, Cindy Serena Ngompe Massado, says she enjoys being part of the lab because she gets to collaborate with scientists in other fields. “It allows you to really approach scientific challenges in a more holistic and productive way,” she says.
Though the researchers spend most of their time synthesizing and analyzing molecules, fun infuses the lab too. Freedman checks in with everyone frequently, and conversations often drift beyond just science. She’s just as comfortable chatting about Taylor Swift and Travis Kelce as she is discussing research. “Danna is very personable and very herself with us,” Ullery says. “It adds a bit of levity to being in an otherwise stressful grad school environment.”
Bringing textbook chemistry to life
In the classroom, Freedman is a passionate educator, dedicated to demystifying the complexities of chemistry for her students. Aware that many of them find the subject daunting, she strives to go beyond textbook equations. For each lecture in her advanced inorganic chemistry classes, she introduces the “molecule of the day,” which is always connected to the lesson plan. When teaching about bimetallic molecules, for example, she showcased the potassium rubidium molecule, citing active research at Harvard aimed at entangling its nuclear spins. For a lecture on superconductors, she brought a sample of the superconducting material yttrium barium copper oxide that students could handle. Chemistry students often think “This is painful” or “Why are we learning this?” Freedman says. Making the subject matter more tangible and showing its connection to ongoing research spark students’ interest and underscore the material’s relevance.
Freedman sees frustrating research as an opportunity to discover new things. “I like students to work on at least one ‘safer’ project along with something more ambitious,” she says. M. SCOTT BRAUER/MIT NEWS OFFICE
Freedman believes this is an exceptionally exciting time for budding chemists. She emphasizes the importance of curiosity and encourages them to ask questions. “There is a joy to being able to walk into any room and ask any question and extract all the knowledge that you can,” she says. In her own research, she embodies this passion for the pursuit of knowledge, framing challenges as stepping stones to discovery. When she was a postdoc, her research on electron spins in synthetic materials hit what seemed to be a dead end that ultimately led to the discovery of a new class of magnetic material.
So she tells her students that even the most difficult aspects of research are rewarding because they often lead to interesting findings. That’s exactly what happened to Ullery. When he designed a molecule meant to be stable in air and water and emit light, he was surprised that it didn’t—and that threw a wrench into his plan to develop the molecule into a sensor that would emit light only under particular circumstances. So he worked with theoreticians in Giulia Galli’s group at the University of Chicago, developing new insights on what drives emission, and that led to the design of a new molecule that did emit light. “Frustrating research is almost fun to deal with,” says Freedman, “even if it doesn’t always feel that way.”
-
WWW.TECHNOLOGYREVIEW.COM
Inside-out learning
When the prison doors first closed behind him more than 50 years ago, Lee Perlman, PhD ’89, felt decidedly unsettled. In his first job out of college, as a researcher for a consulting company working on a project for the US Federal Bureau of Prisons, he had been tasked with interviewing incarcerated participants in a drug rehab program. Once locked inside, he found himself alone in a room with a convicted criminal. “I didn’t know whether I should be scared,” he recalls.
Since then, he has spent countless hours in such environments in his role as a teacher of philosophy. He’s had “very, very few experiences” where he felt unsafe in prisons over the years, he says. “But that first time you go in, you do feel unsafe. I think that’s what you should feel. That teaches you something about what it feels like for anybody going into prison.”
As a lecturer in MIT’s Experimental Study Group (ESG) for more than 40 years, Perlman has guided numerous MIT students through their own versions of that passage through prison doors. He first began teaching in prisons in the 1980s, when he got the idea of bringing his ESG students studying nonviolence into the Massachusetts Correctional Institution at Norfolk to talk with men serving life sentences. The experience was so compelling that Perlman kept going back, and since the early 2000s he has been offering full courses behind bars. In 2018, Perlman formalized these efforts by cofounding the Educational Justice Institute (TEJI) at MIT with Carole Cafferty, a former corrections professional. Conceived both to provide college-level education with technology access to incarcerated individuals and to foster empathy and offer a window into the criminal justice system for MIT students, TEJI creates opportunities for the two groups to learn side by side.
“We believe that there are three fundamental components of education that everybody should have, regardless of their incarceration status: emotional literacy, digital literacy, and financial literacy,” says Cafferty. TEJI offers incarcerated students classes in the humanities, computer science, and business, the credits from which can be applied toward degrees from private universities and community colleges. The emotional literacy component, featuring Perlman’s philosophy courses, is taught in an “inside-out” format, with a mixed group of incarcerated “inside” students and “outside” classmates (from MIT and other universities where TEJI courses are sometimes cross-listed).
“I’ve been really torn throughout my life,” Perlman says, “between this part of me that would like to be a monk and sit in a cave and read books all day long and come out and discuss them with other monks, and this other half of me that wants to do some good in the world, really wants to make a difference.” Behind prison walls, the concepts he relishes discussing—love, authenticity, compassion—have become his tools for doing that good.
TEJI also serves as a convener of people from academia and the criminal justice system. Within MIT, it works with the Sloan School of Management, the Music and Theater Arts Section, the Priscilla King Gray Public Service Center, and others on courses and special prison-related projects.
And by spearheading broader initiatives like the Massachusetts Prison Education Consortium and the New England Commission on the Future of Higher Education in Prison, TEJI has helped lay the groundwork for significant shifts in how incarcerated people across the region and beyond prepare to rejoin society. “Lee and I both share the belief that education can and should be a transformative force in the lives of incarcerated people,” Cafferty says. “But we also recognize that the current system doesn’t offer a lot of opportunities for that.” Through TEJI, they’re working to create more.
Perlman didn’t set out to reform prison education. “There’s never been any plan,” he says. “Before I was an academic I was a political organizer, so I have that political organizer brain. I just look for … where’s the opening you can run through?” Before earning his PhD in political philosophy, Perlman spent eight years making his mark on Maryland’s political scene. At age 28, he came up short by a few hundred votes in a primary for the state senate. In the late 1970s, Perlman says, he was named one of 10 rising stars in Maryland politics by the Baltimore Sun and one of the state’s most feared lobbyists by Baltimore Magazine because he got lawmakers to “do things they’d be perfectly willing to leave alone,” as he puts it, like pass election reform bills. The legislators gave him the nickname Wolfman, “probably just because I had a beard,” he says, “but it kind of grew to mean other things.” Perlman still has the beard. Working in tandem with Cafferty and others, he’s also retained his knack for nudging change forward.
Lee Perlman, PhD ’89, and Philip Hutchful, an incarcerated student, take part in the semester’s final meeting of Perlman’s “inside-out” class Nonviolence as a Way of Life at the Boston Pre-Release Center. JAY DIAS/MASSACHUSETTS DEPARTMENT OF CORRECTION
Cafferty understands, better than most, how difficult that can be in the prison system. She held numerous roles in her 25-year corrections career, ultimately serving as superintendent of the Middlesex Jail and House of Correction, where she oversaw the introduction of the first tablet-based prison literacy program in New England. “I used to say someday when I write a book, it’s going to be called Swimming Against the Tide,” she says. In a correctional environment, “safety and security come first, always,” she explains. “Programming and education are much further down the list of priorities.”
TEJI’s work pushes against a current in public opinion that takes a punitive rather than rehabilitative view of incarceration. Some skeptics see educating people in prison as rewarding bad deeds. “Out in the world I’ve had people say to me, ‘Maybe I should commit a crime so I can get a free college education,’” says Perlman. “My general response is, well, you really have one choice here: Do you want more crime or less crime? There’s hard data that there’s nothing that works like education to cut recidivism, to change the atmosphere within a prison so prisons become less violent places. Also, do you want to spend more or do you want to spend less money on this problem? For every dollar we spend on prison education and similar programs, we save five dollars.”
The research to which Perlman refers includes a 2018 RAND study, which found that participants in correctional education programs in the US were 28% less likely to reoffend than their counterparts who did not participate.
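To make those two figures concrete—the RAND study’s 28% relative reduction and Perlman’s dollar-in, five-dollars-out claim—here is a rough back-of-envelope sketch. The baseline reoffense rate and cohort size are illustrative assumptions, not numbers from the story:

```python
# Back-of-envelope illustration of the 2018 RAND finding and Perlman's claim.
# ASSUMPTIONS (not from the article): 40% baseline reoffense rate,
# cohort of 1,000 program participants.

baseline_rate = 0.40        # assumed reoffense rate without education (hypothetical)
relative_reduction = 0.28   # RAND (2018): participants 28% less likely to reoffend
cohort = 1_000              # hypothetical number of program participants

educated_rate = baseline_rate * (1 - relative_reduction)
avoided = cohort * (baseline_rate - educated_rate)

print(f"Implied reoffense rate with education: {educated_rate:.1%}")        # 28.8%
print(f"Reoffenses avoided per {cohort:,} participants: {avoided:.0f}")     # 112

# Perlman's cost claim, read as a ratio: every $1 spent saves $5,
# i.e. $4 of net savings per dollar invested.
savings_ratio = 5.0
print(f"Net savings per dollar spent: ${savings_ratio - 1:.2f}")
```

Under those assumed inputs, a thousand participants would mean on the order of a hundred fewer reoffenses; the exact figure shifts with the baseline rate, which varies widely by jurisdiction.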
It’s a powerful number, considering that roughly 500,000 people are released from custody each year. Perlman has such statistics at the ready, as he must. But talk to him for any amount of time and the humanity behind the numbers is what stands out. “There is a sizable group of people in prison who, if society was doing a better job, would have different lives,” he says, noting that “they’re smart enough and they have character enough” to pull it off: “We can make things happen in prison that will put them on a different path.” “Most of the people I teach behind bars are people that have had terrible experiences with education and don’t feel themselves to be very capable at all,” he says. So he sometimes opens his class by saying: “Something you probably wouldn’t guess about me is that I failed the 11th grade twice and dropped out of high school. And now I have a PhD from MIT and I’ve been teaching at MIT for 40 years. So you never know where life’s gonna lead you.” Though Perlman struggled to find his motivation in high school, he “buckled down and learned how much I loved learning,” as he puts it, when his parents sent him to boarding school to finish his diploma. He went on to graduate from St. John’s College in Annapolis, Maryland. Growing up in Michigan in the 1960s, he’d learned about fair housing issues because his mother was involved with the civil rights movement, and he lived for a time with a Black family that ran a halfway house for teenage girls. By the time he took that first job interviewing incarcerated former drug addicts, he was primed to understand their stories within the context of poverty, discrimination, and other systemic factors. He began volunteering for a group helping people reenter society after incarceration, and as part of his training, he spent a night booked into jail. “I didn’t experience any ill treatment,” he says, “but I did experience the complete powerlessness you have when you’re a prisoner.” Jocelyn Zhu ’25 took a class with Perlman in the fall of 2023 at the Suffolk County House of Correction, and entering the facility gave her a similar sense of powerlessness. “We had to put our phones away, and whatever we were told to do we would have to do, and that’s not really an experience that you’re in very often as a student at MIT,” says Zhu. “There was definitely that element of surrender: ‘I’m not in charge of my environment.’” On the flip side, she says, “because you’re in that environment, the only thing you’re doing while you’re there is learning—and really focusing in on the discussion you’re having with other students.” “I call them the ‘philosophical life skills’ classes,” says Perlman, “because there are things in our lives that everybody should sit down and think through as well as they can at some point.” He says that while those classes work fine with just MIT students, being able to go into a prison and talk through the same issues with people who have had very different life experiences adds a richness to the discussion that would be hard to replicate in a typical classroom. He recalls the first time he broached the topic of forgiveness in a prison setting. Someone serving a life sentence for murder put things in a way Perlman had never considered. He remembers the man saying: “What I did was unforgivable. If somebody said ‘I forgive you for taking my child’s life,’ I wouldn’t even understand what that meant. 
For me, forgiveness means trying, at least … to regard me as somebody who’s capable of change … giving me the space to show you that I’m not the person who did that anymore.’” Perlman went home and revised his lecture notes. “I completely reformulated my conception of forgiveness based on that,” he says. “And I tell that story every time I teach the class.”
The meeting room at the minimum-security Boston Pre-Release Center is simply furnished: clusters of wooden tables and chairs, a whiteboard, some vending machines. December’s bare branches are visible through a row of windows that remain closed even on the warmest of days (“Out of Bounds,” warns a sign taped beside them). This afternoon, the room is hosting one of Perlman’s signature classes, Nonviolence as a Way of Life. To close the fall 2024 semester, he has asked his students to creatively recap four months of Thursdays together. Before long, the students are enmeshed in a good-natured showdown, calling out letters to fill in the blanks in a mystery phrase unfolding on the whiteboard. Someone solves it (“An eye for an eye makes the world go blind”) and scores bonus points for identifying its corresponding unit on the syllabus (Restorative Justice). “It’s still anybody’s game!” announces the presenting student, Jay Ferran, earning guffaws with his spot-on TV host impression.
Ferran and the other men in the room wearing jeans are residents of the Pre-Release Center. They have shared this class all semester with undergrad and grad students from MIT and Harvard (who are prohibited from wearing jeans by the visitor dress code). Before they all part ways, they circle up their chairs one last time. “Humor can be a defense mechanism, but it never felt that way in here,” says Isabel Burney, a student at the Harvard Graduate School of Education. “I really had a good time laughing with you guys.” “I appreciate everyone’s vulnerability,” says Jack Horgen ’26. “I think that takes a lot of grace, strength, and honesty.” “I’d like to thank the outside students for coming in and sharing as well,” says Ferran. “It gives a bit of freedom to interact with students who come from the outside. We want to get on the same level. You give us hope.”
After the room has emptied out, Ferran reflects further on finding himself a college student at this stage in his life. Now in his late 40s, he dropped out of high school when he became a father. “I always knew I was smart and had the potential, but I was a follower,” he says. As Ferran approaches the end of his sentence, he’s hoping to leverage the college credits he’s earned so far into an occupation in counseling and social work. His classmate Philip Hutchful, 35, is aiming for a career in construction management. Access to education in prison “gives people a second chance at life,” Hutchful says. “It keeps your mind busy, rewires your brain.”
MIT undergrads Denisse Romero Cruz ’25, Jack Horgen ’26, and Alor Sahoo ’26 at the final session of Perlman’s Nonviolence as a Way of Life class at the Boston Pre-Release Center. JAY DIAS/MASSACHUSETTS DEPARTMENT OF CORRECTION
Along with about 45% of the Boston Pre-Release Center’s residents, Ferran and Hutchful are enrolled in the facility’s School of Reentry, which partners with MIT and other local colleges and universities to provide educational opportunities during the final 12 to 18 months of a sentence.
“We have seen a number of culture shifts for our students and their families, such as accountability, flexible thinking, and curiosity,” says the program’s executive director, Lisa Millwood. There are “students who worked hard just so they can proudly be there to support their grandchildren, or students who have made pacts with their teenage children who are struggling in school to stick with it together.”
Ferran and Hutchful had previously taken college-level classes through the School of Reentry, but the prospect of studying alongside MIT and Harvard students raised new qualms. “These kids are super smart—how can I compete with them? I’m going to feel so stupid,” Ferran remembers thinking. “In fact, it wasn’t like that at all.” “We all had our own different types of knowledge,” says Hutchful. Both Ferran and Hutchful say they’ve learned skills that they’ll put to use in their post-release lives, from recognizing manipulation to fostering nonviolent communication. Hutchful especially appreciates the principle that “you need to attack the problem, not the person,” saying, “This class teaches you how to deal with all aspects of people—angry people, impatient people. You’re not being triggered to react.”
Perlman has taught Nonviolence as a Way of Life nearly every semester since TEJI launched. Samuel Tukua ’25 took the class a few years ago. Like Hutchful, he has applied its lessons. “I wouldn’t be TAing it for the third year now if it didn’t have this incredible impact on my life,” Tukua says. Meeting incarcerated people did not in itself shift Tukua’s outlook; their stories didn’t surprise him, given his own upbringing in a low-income neighborhood near Atlanta. But watching learners from a range of backgrounds find common ground in big philosophical ideas helped convince him of those ideas’ validity. For example, he started to notice undercurrents of violence in everyday actions and speech. “It doesn’t matter whether you came from a highly violent background or if you came from a privileged, less violent background,” he says he realized. “That kind of inner violence or that kind of learned treatment exists inside all of us.”
Marisa Gaetz ’20, a fifth-year PhD candidate in math at MIT, has stayed in TEJI’s orbit in the seven years since its founding—first as a student, then as a teaching assistant, and now by helping to run its computer science classes. Limitations on in-person programming imposed by the covid-19 pandemic led Gaetz and fellow MIT grad student Martin Nisser, SM ’19, PhD ’24, to develop remote computer education classes for incarcerated TEJI students. In 2021, she and Nisser (now an assistant professor at the University of Washington) joined with Emily Harburg, a tech access advocate, to launch Brave Behind Bars, which partners closely with TEJI to teach Intro to Python, web development, and game design in both English and Spanish to incarcerated people across the US and formerly incarcerated students in Colombia and Mexico.
Since many inside students have laptop access only during class time, the remote computer courses typically begin with a 30-minute lecture followed by Zoom breakouts with teaching assistants. A ratio of one TA for every three or four students ensures that “each student feels supported, especially with coding, which can be frustrating if you’re left alone with a bug for too long,” Gaetz says. Gaetz doesn’t always get to hear how things work out for her students, but she’s learned of encouraging outcomes.
One Brave Behind Bars TA who got his start in their classes is now a software engineer. Another group of alums founded Reentry Sisters, an organization for formerly incarcerated women. “They made their own website using the skills that they learned in our class,” Gaetz says. “That was really amazing to see.”
Although the pandemic spurred some prisons to expand use of technology, applying those tools to education in a coordinated way requires the kind of bridge-building TEJI has become known for since forming the Massachusetts Prison Education Consortium (MPEC) in 2018. “I saw there were a bunch of colleges doing various things in prisons and we weren’t really talking to each other,” says Perlman. TEJI secured funding from the Mellon Foundation and quickly expanded MPEC’s membership to more than 80 educational institutions, corrections organizations, and community-based agencies. Millwood says the School of Reentry has doubled its capacity and program offerings thanks to collaborations developed through MPEC.
At the regional level, TEJI teamed up with the New England Board of Higher Education in 2022 to create the New England Commission on the Future of Higher Education in Prison. Its formation was prompted in part by the anticipated increase in demand for high-quality prison education programs thanks to the FAFSA Simplification Act, which as of 2023 reversed a nearly three-decade ban on awarding federal Pell grants to incarcerated people. Participants included leaders from academia and correctional departments as well as formerly incarcerated people. One, Daniel Throop, cochaired a working group called “Career, Workforce, and Employer Connections” just a few months after his release. “I lived out a reentry while I was on the commission in a way that was very, very powerful,” Throop says. “I was still processing in real time.”
During his incarceration in Massachusetts, Throop had revived the long-defunct Norfolk Prison Debating Society, which went head-to-head with university teams including MIT’s. Credits from his classes, including two with Perlman, culminated in a bachelor’s degree in interdisciplinary studies magna cum laude from Boston University, which he earned before his release. But he still faced big challenges. “Having a criminal record is still a very, very real hurdle,” Throop says. “I was so excited when those doors of prison finally opened after two decades, only to be greatly discouraged that so many doors of the community remained closed to me.” Initially, the only employment he could get was loading UPS trucks by day and unloading FedEx trucks by night. He eventually landed a job with the Massachusetts Bail Fund and realized his goal of launching the National Prison Debate League. “I fortunately had the educational credentials and references and the wherewithal to not give up on myself,” says Throop. “A lot of folks fail with less resources and privilege and ability and support.”
The commission’s 2023 report advocates for improved programming and support for incarcerated learners spanning the intake, incarceration, and reentry periods. To help each state implement the recommendations, the New England Prison Education Collaborative (NEPEC) launched in October 2024 with funding from the Ascendium Education Group.
Perlman encouraged TEJI alumna Nicole O’Neal, then working at Tufts University, to apply for the position she now holds as a NEPEC project manager. Like Throop, O’Neal has firsthand experience with the challenges of reentry. Despite the stigma of having served time, having a transcript with credits earned during the period she was incarcerated “proved valuable for both job applications and securing housing,” she says. With the help of a nonprofit called Partakers and “a lot of personal initiative,” she navigated the confusing path to matriculation on Boston University’s campus, taking out student loans so she could finish the bachelor’s degree she’d begun in prison. A master’s followed. “I’ve always known that education was going to be my way out of poverty,” she says.
From her vantage point at NEPEC, O’Neal sees how TEJI’s approach can inspire other programs. “What truly sets TEJI apart is the way that it centers students as a whole, as people and not just as learners,” she says. “Having the opportunity to take an MIT course during my incarceration wasn’t just about earning credits—it was about being seen as capable of engaging with the same level of intellectual rigor as students outside. That recognition changed how I saw myself and my future.”
On a Zoom call one Wednesday evening in December, Perlman’s inside-out course on Stoicism is wrapping up. Most participants are women incarcerated in Maine. These are among Perlman’s most advanced and long-standing students, thanks to the state’s flexible approach to prison education—Perlman says it’s “maybe the most progressive system in the country,” early to adopt remote learning, experiment with mixed-gender classes, and allow email communication between teachers and students. The mood is convivial, the banter peppered with quotes from the likes of Marcus Aurelius and Epictetus. More than one student is crocheting a Christmas gift, hands working busily at the edges of their respective Zoom rectangles.
As the students review what they’ve learned, the conversation turns to the stereotype of Stoicism as a lack of emotion. “I get the feeling the Stoics understood their emotions better than most because they weren’t puppets to their emotions,” says a student named Nicole. “They still feel things—they’re just not governed by it.”
Jay Ferran, an incarcerated student at the Boston Pre-Release Center, presents a game to help recap what the class learned over the semester. JAY DIAS/MASSACHUSETTS DEPARTMENT OF CORRECTION
Jade, who is a year into a 16-month sentence, connects this to her relationship with her 14-month-old son: “I think I would be a bad Stoic in how I love him. That totally governs me.” Perlman, a bit mischievously: “Does anyone want to talk Jade into being a Stoic mother?” Another classmate, Victoria, quips: “I think you’d like it better when he’s a teenager.” When the laughter dies down, she says more seriously, “I think it’s more about not allowing your emotions to carry you away.” But she adds that it’s hard to do that as a parent. “Excessive worry is also a hindrance,” Jade concedes. “So how do I become a middle Stoic?” “A middle Stoic would be an Aristotelian, I think,” muses Perlman.
When the conversation comes around to amor fati, the Stoic notion of accepting one’s fate, Perlman asks how successful his students have been at this. The group’s sole participant from a men’s facility, Arthur, confesses that he has struggled with this over more than 20 years in prison. But for the last few years, school has brought him new focus.
He helps run a space where other residents can study. “I hear you saying you can only love your fate if you have a telos, a purpose,” Perlman says. “I was always teaching people things to survive or get ahead by any means necessary,” Arthur says. “Now it’s positive building blocks.” “Education is my telos, and when I couldn’t access it at first, I had to focus on what was in my control,” says Victoria. “I framed my prison experiences as refusing to be harmed by the harmful process of incarceration. I’m going to use this opportunity for myself … so I can be who I want to be when I leave here.”
Soon after, the video call—and the course—ends. But if Perlman’s former students’ experience is any indication, the ideas their teacher has introduced will continue to percolate. O’Neal, who took Perlman’s Philosophy of Love, is still mulling over an exploration of loyalty in Tristan and Isolde that brought a classmate to tears. She thinks Perlman’s ability to nurture dialogue on sensitive topics begins with his relaxed demeanor—a remarkable quality in the prison environment. “It’s like you’re coming to our house. A lot of [people] show up as guests. Lee shows up like someone who’s been around—you know, and he’s willing to clean up the dishes with you. He just feels at home,” she says. “So he made us feel at home.”
Throop becomes animated when he describes taking Philosophy of the Self and Soul with Perlman and MIT students at MCI-Norfolk in 2016. “Over those days and weeks, we got to meet and discuss the subject matter—walking around the prison yards together, my classmates and I, and then coming back and having these almost indescribable—I’m rarely at a loss for words!—weekly class discussions,” Throop remembers. Perlman “would throw one big question out there, and he would sit back and patiently let us all chop that material up,” he adds. “These discussions were like the highlight of all of our weeks, because we got to have this super-cool exchange of ideas, testing our perspectives … And then these 18-to-20-year-old students who were coming in with a whole different worldview, and being able to have those worldviews collide in a healthy way.”
“We all were having such enriching discussions that the semester flew by,” he says. “You didn’t want school to end.”
-
WWW.TECHNOLOGYREVIEW.COM
The Institute's greatest ambassadors

After decades of working as a biologist at a Southern school with a Division I football team, coming to MIT was a bit of a culture shock—in the best possible way. I've heard from MIT alumni all about late-night psetting, when to catch MITHenge, and the best way to celebrate Pi Day (with pie, of course). And I've also learned that for many of you, the Institute is more than simply your alma mater.

As the MIT Alumni Association celebrates its 150th anniversary, I'm reflecting on the extraordinary talent and drive of the people here, and what it is that makes MIT alumni—like MIT itself—just a little bit different. As students, you learned to investigate, question, argue, critique, and refine your ideas with faculty and with each other, managing to be both collaborative and competitive. You hacked the toughest and most interesting problems and came up with the most unconventional solutions. And you developed and nurtured a uniquely entrepreneurial, hands-on MIT spirit that only those who have earned a degree here can fully understand, but that the rest of us can easily identify and admire.

An article in this magazine about the history of the MIT Alumni Association notes that when the association was formed, there were 84 alumni in total. By 1888, the number had increased to an impressive 579. And it has grown by orders of magnitude since; today nearly 149,000 alumni are members. But even as the alumni community has grown and evolved, its culture and character have remained remarkably consistent, represented by men and women known for their rigorous thinking, incisive analysis, mens et manus ethos, and drive to make a real and transformative impact on people and communities everywhere.

As MIT alumni, you recognize each other by your Brass Rats. These sturdy, cleverly designed rings not only signify your completion (some might say survival) of an immensely difficult course of study. They also signal to the world that you stand ready to share your expertise, knowledge, and experience in the service of humanity.

Alumni have always been the Institute's greatest ambassadors, and today that role has taken on even greater meaning and importance. We are working intensely, every day, to make the case for the vital importance of MIT to ensuring the nation's security, prosperity, health, and quality of life. And I'm deeply grateful that we can rely on MIT's extraordinary family of alumni to help share that message far and wide.
-
WWW.TECHNOLOGYREVIEW.COM
Bug-size robots that fly and flip could pollinate futuristic farms' crops

Tiny flying robots could perform such useful tasks as pollinating crops inside multilevel warehouses, boosting yields while mitigating some of agriculture's harmful impacts on the environment. The latest robo-bug from an MIT lab, inspired by the anatomy of the bee, comes closer to matching nature's performance than ever before.

Led by Kevin Chen, an associate professor in the Department of Electrical Engineering and Computer Science and the senior author of a paper on the work, the team adapted an earlier flying robot composed of four identical two-winged units, combined into a rectangular device about the size of a microcassette. The wings managed to flap like an insect's, but the bot couldn't fly for long. One problem was that the wings would blow air into each other when flapping, reducing the lift forces they could generate. In the new design, each of the four units has a single flapping wing pointing away from the robot's center, stabilizing the wings and boosting their lift forces. The researchers also improved the way the wings are connected to the actuators, or artificial muscles, that flap them. In previous designs, when the actuators' movements reached the extremely high frequencies needed for flight, the devices often started buckling. That reduced the power and efficiency of the robot. Thanks in part to a new, longer wing hinge, the actuators now experience less mechanical strain and can apply more force, so the bots can fly faster, longer, and in more precise paths.

The robots can track a trajectory precisely enough to spell M-I-T. COURTESY OF THE RESEARCHERS

Weighing less than a paper clip, the new robotic insect can hover for more than 1,000 seconds—almost 17 minutes—without any degradation of flight precision. "When my student Yi-Hsuan Hsiao was performing that flight, he said it was the slowest 1,000 seconds he had spent in his entire life. The experiment was extremely nerve-racking," Chen says. The new robot also reached an average speed of 35 centimeters per second, the fastest flight researchers have reported, and was able to perform body rolls and double flips. It can even precisely track a trajectory that spells M-I-T. "At the end of the day, we've shown flight that is 100 times longer than anyone else in the field has been able to do, so this is an extremely exciting result," Chen says.

From here, he and his students want to see how far they can push this new design, with the goal of achieving flight for longer than 10,000 seconds. They also want to improve the precision of the robots so they could land in and take off from the center of a flower. In the long run, the researchers hope to install tiny batteries and sensors so the robots could fly and navigate outside the lab. The design has more room for those electronics now that they've halved the number of wings. The bots still can't achieve the fine-tuned behavior of a real bee, Chen acknowledges. Still, he says, "with the improved lifespan and precision of this robot, we are getting closer to some very exciting applications, like assisted pollination."
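To put the numbers above in context, here is a minimal back-of-the-envelope sketch in Python. It is not from the researchers: the 1,000-second hover, the 10,000-second goal, and the 35 cm/s average speed are taken from the article, and everything else is simple unit conversion.

# Quick arithmetic check on the robo-insect figures quoted above.
hover_time_s = 1_000       # demonstrated hover duration, in seconds
target_time_s = 10_000     # the team's stated next goal, in seconds
avg_speed_cm_s = 35        # reported average flight speed, in cm/s

print(f"Hover time: {hover_time_s / 60:.1f} minutes")    # ~16.7 min
print(f"Target: {target_time_s / 60:.1f} minutes")       # ~166.7 min
print(f"Distance at average speed over one hover: "
      f"{hover_time_s * avg_speed_cm_s / 100:.0f} m")    # 350 m

Run as written, this confirms the article's "almost 17 minutes" figure and shows that the 10,000-second goal would mean roughly two and three-quarter hours aloft.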
-
WWW.TECHNOLOGYREVIEW.COM
The quest to build islands with ocean currents in the Maldives

In satellite images, the 20-odd coral atolls of the Maldives look something like skeletal remains or chalk lines at a crime scene. But these landforms, which circle the peaks of a mountain range that has vanished under the Indian Ocean, are far from inert. They're the products of living processes—places where coral has grown toward the surface over hundreds of thousands of years. Shifting ocean currents have gradually pushed sand—made from broken-up bits of this same coral—into more than 1,000 other islands that poke above the surface. But these currents can also be remarkably transient, constructing new sandbanks or washing them away in a matter of weeks.

In the coming decades, the daily lives of the half-million people who live on this archipelago—the world's lowest-lying nation—will depend on finding ways to keep a solid foothold amid these shifting sands. More than 90% of the islands have experienced severe erosion, and climate change could make much of the country uninhabitable by the middle of the century.

Off one atoll, just south of the Maldives' capital, Malé, researchers are testing one way to capture sand in strategic locations—to grow islands, rebuild beaches, and protect coastal communities from sea-level rise. Swim 10 minutes out into the En'boodhoofinolhu Lagoon and you'll find the Ramp Ring, an unusual structure made up of six tough-skinned geotextile bladders. These submerged bags, part of a recent effort called the Growing Islands project, form a pair of parentheses separated by 90 meters (around 300 feet). The bags, each about two meters tall, were deployed in December 2024, and by February, underwater images showed that sand had climbed about a meter and a half up the surface of each one, demonstrating how passive structures can quickly replenish beaches and, in time, build a solid foundation for new land.

"There's just a ton of sand in there. It's really looking good," says Skylar Tibbits, an architect and founder of the MIT Self-Assembly Lab, which is developing the project in partnership with the Malé-based climate tech company Invena.

The Self-Assembly Lab designs material technologies that can be programmed to transform or "self-assemble" in the air or underwater, exploiting natural forces like gravity, wind, waves, and sunlight. Its creations include sheets of wood fiber that form into three-dimensional structures when splashed with water, which the researchers hope could be used for tool-free flat-pack furniture.

Growing Islands is their largest-scale undertaking yet. Since 2017, the project has deployed 10 experiments in the Maldives, testing different materials, locations, and strategies, including inflatable structures and mesh nets. The Ramp Ring is many times larger than previous deployments and aims to overcome their biggest limitation. In the Maldives, the direction of the currents changes with the seasons. Past experiments have been able to capture only one seasonal flow, meaning they lie dormant for months of the year. By contrast, the Ramp Ring is "omnidirectional," capturing sand year-round. "It's basically a big ring, a big loop, and no matter which monsoon season and which wave direction, it accumulates sand in the same area," Tibbits says.

The approach points to a more sustainable way to protect the archipelago, whose growing population is supported by an economy that caters to 2 million annual tourists drawn by its white beaches and teeming coral reefs.
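As a rough illustration of the accretion rate implied by the figures above, here is a small Python sketch. This is my own arithmetic, not the project's analysis: the deployment date and the meter-and-a-half figure come from the article, while the roughly 60-day window between the December deployment and the February observation is an assumption.

# Rough accretion-rate estimate for the Ramp Ring, from figures in the article.
sand_height_cm = 150   # sand climbed about 1.5 m up each bladder
window_days = 60       # assumed elapsed time, December 2024 to February 2025

rate_cm_per_day = sand_height_cm / window_days
print(f"Implied average accretion: {rate_cm_per_day:.1f} cm/day")   # ~2.5 cm/day

Under those assumptions, the bags gained on the order of a couple of centimeters of sand per day, which is what makes the result notable for a purely passive structure.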
Most of the country's 187 inhabited islands have already had some form of human intervention to reclaim land or defend against erosion, such as concrete blocks, jetties, and breakwaters. Since the 1990s, dredging has become by far the most significant strategy. Boats equipped with high-power pumping systems vacuum up sand from one part of the seabed and spray it into a pile somewhere else. This temporary process allows resort developers and densely populated islands like Malé to quickly replenish beaches and build limitlessly customizable islands. But it also leaves behind dead zones where sand has been extracted—and plumes of sediment that cloud the water with a sort of choking marine smog. Last year, the government placed a temporary ban on dredging to prevent damage to reef ecosystems, which were already struggling amid spiking ocean temperatures.

Holly East, a geographer at the University of Northumbria, says Growing Islands' structures offer an exciting alternative to dredging. But East, who is not involved in the project, warns that they must be sited carefully to avoid interrupting sand flows that already build up islands' coastlines.

To do this, Tibbits and Invena cofounder Sarah Dole are conducting long-term satellite analysis of the En'boodhoofinolhu Lagoon to understand how sediment flows move around atolls. On the basis of this work, the team is currently spinning out a predictive coastal intelligence platform called Littoral. The aim is for it to be "a global health monitoring system for sediment transport," Dole says. It's meant not only to show where beaches are losing sand but to "tell us where erosion is going to happen," allowing government agencies and developers to know where new structures like Ramp Rings can best be placed.

Growing Islands has been supported by the National Geographic Society, MIT, the Sri Lankan engineering group Sanken, and tourist resort developers. In 2023, it got a big bump from the US Agency for International Development: a $250,000 grant that funded the construction of the Ramp Ring deployment and would have provided opportunities to scale up the approach. But the termination of nearly all USAID contracts following the inauguration of President Trump means the project is looking for new partners.

Matthew Ponsford is a freelance reporter based in London.
-
WWW.TECHNOLOGYREVIEW.COM
Longevity clinics around the world are selling unproven treatments

The quest for long, healthy life—and even immortality—is probably almost as old as humans are, but it's never been hotter than it is right now. Today my newsfeed is full of claims about diets, exercise routines, and supplements that will help me live longer. A lot of it is marketing fluff, of course. It should be fairly obvious that a healthy, plant-rich diet and moderate exercise will help keep you in good shape. And no drugs or supplements have yet been proved to extend human lifespan.

The growing field of longevity medicine is apparently aiming for something in between these two ends of the wellness spectrum. By combining the established tools of clinical medicine (think blood tests and scans) with some more experimental ones (tests that measure your biological age), these clinics promise to help their clients improve their health and longevity. But a survey of longevity clinics around the world, carried out by an organization that publishes updates and research on the industry, reveals a messier picture. In reality, these clinics—most of which cater only to the very wealthy—vary wildly in their offerings.

Today, the number of longevity clinics is thought to be somewhere in the hundreds. The proponents of these clinics say they represent the future of medicine. "We can write new rules on how we treat patients," Eric Verdin, who directs the Buck Institute for Research on Aging, said at a professional meeting last year.

Phil Newman, who runs Longevity.Technology, a company that tracks the longevity industry, says he knows of 320 longevity clinics operating around the world. Some operate multiple centers on an international scale, while others involve a single "practitioner" incorporating some element of "longevity" into the treatments offered, he says. To get a better idea of what these offerings might be, Newman and his colleagues conducted a survey of 82 clinics, including clinics in the US, Australia, Brazil, and multiple countries in Europe and Asia.

Some of the results are not all that surprising. Three-quarters of the clinics said that most of their clients were Gen Xers, aged between 44 and 59. This makes sense—anecdotally, it's around this age that many people start to feel the effects of aging. And research suggests that waves of molecular changes associated with aging hit us in our 40s and again in our 60s. (Longevity influencers Bryan Johnson, Andrew Huberman, and Peter Attia all fall into this age group too.)

And I wasn't surprised to see that plenty of clinics are offering aesthetic treatments, focusing more on how old their clients look. Of the clinics surveyed, 28% said they offered Botox injections, 35% offered hair loss treatments, and 38% offered "facial rejuvenation procedures." "The distinction between longevity medicine and aesthetic medicine remains blurred," Andrea Maier of the National University of Singapore, cofounder of a private longevity clinic, wrote in a commentary on the report.

Maier is also former president of the Healthy Longevity Medicine Society, an organization that was set up with the aim of establishing clinical standards and credibility for longevity clinics. Other results from the survey underline how much of a challenge this will be; many clinics are still offering unproven treatments. Over a third of the clinics said they offered stem-cell treatments, for example.
There is no evidence that those treatments will help people live longer—and they are not without risk, either.

I was a little surprised to see that most of the clinics are also offering prescription medicines off label. In other words, drugs that have been approved for specific medical issues are apparently being prescribed for aging instead. This is also not without risks—all medicines have side effects. And, again, none of them have been proved to slow or reverse human aging. And these prescriptions are coming from certified medical doctors. More than 80% of clinics reported that their practice was overseen by a medical doctor with more than 10 years of clinical experience.

It was also a little surprising to learn that despite their high fees, most of these clinics are not making a profit. For clients, the annual costs of attending a longevity clinic range between $10,000 and $150,000, according to Fountain Life, a company with clinics in Florida and Prague. But only 39% of the surveyed clinics said they were turning a profit, and 30% said they were "approaching breaking even," while 16% said they were operating at a loss.

Proponents of longevity clinics have high hopes for the field. They see longevity medicine as nothing short of a revolution—a move away from reactive treatments and toward proactive health maintenance. But these survey results show just how far they have to go.

This article first appeared in The Checkup, MIT Technology Review's weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.
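For what it's worth, the profitability figures quoted above can be tallied directly. Here is a minimal sketch, assuming the three percentages describe the same pool of surveyed clinics; the inference that the remainder did not disclose or fell into other categories is mine, not the survey's.

# Tallying the reported profitability split among the surveyed clinics.
profitable_pct = 39      # said they were turning a profit
breaking_even_pct = 30   # said they were "approaching breaking even"
at_a_loss_pct = 16       # said they were operating at a loss

remainder_pct = 100 - (profitable_pct + breaking_even_pct + at_a_loss_pct)
print(f"Unaccounted for: {remainder_pct}%")   # 15%, presumably undisclosed/other

In other words, the three categories as reported cover only 85% of respondents.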
-
WWW.TECHNOLOGYREVIEW.COM
The Download: the US office that tracks foreign disinformation is being eliminated, and explaining vibe coding

This is today's edition of The Download, our weekday newsletter that provides a daily dose of what's going on in the world of technology.

US office that counters foreign disinformation is being eliminated

The only office within the US State Department that monitors foreign disinformation is to be eliminated, according to US Secretary of State Marco Rubio, confirming reporting by MIT Technology Review. The Counter Foreign Information Manipulation and Interference (R/FIMI) Hub is a small office in the State Department's Office of Public Diplomacy that tracks and counters foreign disinformation campaigns. The culling of the office leaves the State Department without a way to actively counter the increasingly sophisticated disinformation campaigns from foreign governments like those of Russia, Iran, and China. Read the full story.

—Eileen Guo

What is vibe coding, exactly?

When OpenAI cofounder Andrej Karpathy excitedly took to X back in February to post about his new hobby, he probably had no idea he was about to coin a phrase that encapsulated an entire movement steadily gaining momentum across the world. "There's a new kind of coding I call 'vibe coding', where you fully give in to the vibes, embrace exponentials, and forget that the code even exists," he said. "I'm building a project or webapp, but it's not really coding—I just see stuff, say stuff, run stuff, and copy paste stuff, and it mostly works."

If this all sounds very different from poring over lines of code, that's because Karpathy was talking about a particular style of coding with AI assistance. His words struck a chord among software developers and enthusiastic amateurs alike. In the months since, his post has sparked think pieces and impassioned debates across the internet. But what exactly is vibe coding? Who does it benefit, and what's its likely future? Read the full story.

—Rhiannon Williams

This story is the latest for MIT Technology Review Explains, our series untangling the complex, messy world of technology to help you understand what's coming next. You can read more from the series here.

These four charts sum up the state of AI and energy

You've probably read that AI will drive an increase in electricity demand. But how that fits into the context of the current and future grid can feel less clear from the headlines. A new report from the International Energy Agency digs into the details of energy and AI, and I think it's worth looking at some of the data to help clear things up. Here are four charts from the report that sum up the crucial points about AI and energy demand.

—Casey Crownhart

This article is from The Spark, MIT Technology Review's weekly climate newsletter. To receive it in your inbox every Wednesday, sign up here.

We need targeted policies, not blunt tariffs, to drive "American energy dominance"

—Addison Killean Stark

President Trump and his appointees have repeatedly stressed the need to establish "American energy dominance." But the White House's profusion of executive orders and aggressive tariffs, along with its determined effort to roll back clean-energy policies, are moving the industry in the wrong direction, creating market chaos and economic uncertainty that are making it harder for both legacy players and emerging companies to invest, grow, and compete. Read the full story.
This story is part of Heat Exchange, MIT Technology Review's guest opinion series, offering expert commentary on legal, political and regulatory issues related to climate change and clean energy. You can read the rest of the pieces here.

MIT Technology Review Narrated: Will we ever trust robots?

If most robots still need remote human operators to be safe and effective, why should we welcome them into our homes? This is our latest story to be turned into a MIT Technology Review Narrated podcast, which we're publishing each week on Spotify and Apple Podcasts. Just navigate to MIT Technology Review Narrated on either platform, and follow us to get all our new content as it's released.

The must-reads

I've combed the internet to find you today's most fun/important/scary/fascinating stories about technology.

1 The Trump administration has cancelled lifesaving aid to foreign children
After Elon Musk previously promised to preserve it. (The Atlantic $)
+ DOGE worker Jeremy Lewin, who dismantled USAID, has a new role. (Fortune $)
+ The department attempted to embed its staff in an independent non-profit. (The Guardian)
+ Elon Musk, DOGE, and the Evil Housekeeper Problem. (MIT Technology Review)

2 Astronomers have detected a possible signature of life on a distant planet
It's the first time the potential for life has been spotted on a habitable planet. (NYT $)
+ Maybe we should be building observatories on the moon. (Ars Technica)

3 OpenAI's new AI models can reason with images
They're capable of integrating images directly into their reasoning process. (VentureBeat)
+ But they're still vulnerable to making mistakes. (Ars Technica)
+ AI reasoning models can cheat to win chess games. (MIT Technology Review)

4 Trump's new chip crackdown will cost US firms billions
It's not just Nvidia that's set to suffer. (WP $)
+ But Jensen Huang isn't giving up on China altogether. (WSJ $)
+ He's said the company follows export laws 'to the letter.' (CNBC)

5 Elon Musk reportedly used X to search for potential mothers of his children
Sources suggest he has many more children than is publicly known. (WSJ $)

6 Local US cops are being trained as immigration enforcers
Critics say the rollout is ripe for civil rights abuses. (The Markup)
+ ICE is still bound by constitutional limits—for now. (The Conversation)

7 This electronic weapon can fry drone swarms from a distance
The RapidDestroyer uses a high-power radio frequency to take down multiple drones. (FT $)
+ Meet the radio-obsessed civilian shaping Ukraine's drone defense. (MIT Technology Review)

8 TikTok is attempting to fight back against misinformation
It's rolling out an X-style community notes feature. (Bloomberg $)

9 A deceased composer's brain is still making music
Three years after Alvin Lucier's death, cerebral organoids made from his white blood cells are making sounds. (Popular Mechanics)
+ AI is coming for music, too. (MIT Technology Review)

10 This AI agent can switch personalities
Depending on what you need it to do. (Wired $)

Quote of the day

"Yayy, we get one last meal before getting on the electric chair."

—Jing Levine, who runs a party goods business with her husband that's heavily reliant on suppliers in China, reacts to Donald Trump's plans to pause tariffs except for China, the New York Times reports.

The big story

AI means the end of internet search as we've known it

We all know what it means, colloquially, to google something. You pop a few words in a search box and in return get a list of blue links to the most relevant results.
Fundamentally, it's just fetching information that's already out there on the internet and showing it to you, in a structured way. But all that is up for grabs. We are at a new inflection point. The biggest change to the way search engines deliver information to us since the 1990s is happening right now. No more keyword searching. Instead, you can ask questions in natural language. And instead of links, you'll increasingly be met with answers written by generative AI and based on live information from across the internet, delivered the same way.

Not everyone is excited for the change. Publishers are completely freaked out. And people are also worried about what these new LLM-powered results will mean for our fundamental shared reality. Read the full story.

—Mat Honan

We can still have nice things

A place for comfort, fun and distraction to brighten up your day. (Got any ideas? Drop me a line or skeet 'em at me.)

+ Essential viewing: Sweden is broadcasting its beloved moose spring migration for 20 days straight.
+ Fearsome warlord Babur was obsessed with melons, and frankly, I don't blame him.
+ Great news for squid fans: a colossal squid has been captured on film for the first time! 🦑
+ Who stole my cheese?
-
WWW.TECHNOLOGYREVIEW.COM
This architect wants to build cities out of lava

Arnhildur Pálmadóttir was around three years old when she saw a red sky from her living room window. A volcano was erupting about 25 miles away from where she lived on the northeastern coast of Iceland. Though it posed no immediate threat, its ominous presence seeped into her subconscious, populating her dreams with streaks of light in the night sky. Fifty years later, these "gloomy, strange dreams," as Pálmadóttir now describes them, have led to a career as an architect with an extraordinary mission: to harness molten lava and build cities out of it.

Pálmadóttir today lives in Reykjavik, where she runs her own architecture studio, S.AP Arkitektar, and the Icelandic branch of the Danish architecture company Lendager, which specializes in reusing building materials. The architect believes the lava that flows from a single eruption could yield enough building material to lay the foundations of an entire city. She has been researching this possibility for more than five years as part of a project she calls Lavaforming. Together with her son and colleague Arnar Skarphéðinsson, she has identified three potential techniques: drill straight into magma pockets and extract the lava; channel molten lava into pre-dug trenches that could form a city's foundations; or 3D-print bricks from molten lava in a technique similar to the way objects can be printed out of molten glass.

Pálmadóttir and Skarphéðinsson first presented the concept during a talk at Reykjavik's DesignMarch festival in 2022. This year they are producing a speculative film set in 2150, in an imaginary city called Eldborg. Their film, titled Lavaforming, follows the lives of Eldborg's residents and looks back on how they learned to use molten lava as a building material. It will be presented at the Venice Biennale, a leading architecture festival, in May.

Set in 2150, her speculative film Lavaforming presents a fictional city built from molten lava. COURTESY OF S.AP ARKITEKTAR

Buildings and construction materials like concrete and steel currently contribute a staggering 37% of the world's annual carbon dioxide emissions. Many architects are advocating for the use of natural or preexisting materials, but mixing earth and water into a mold is one thing; tinkering with 2,000 °F lava is another. Still, Pálmadóttir is piggybacking on research already being done in Iceland, which has 30 active volcanoes. Since 2021, eruptions have intensified in the Reykjanes Peninsula, which is close to the capital and to tourist hot spots like the Blue Lagoon. In 2024 alone, there were six volcanic eruptions in that area. This frequency has given volcanologists opportunities to study how lava behaves after a volcano erupts. "We try to follow this beast," says Gro Birkefeldt M. Pedersen, a volcanologist at the Icelandic Meteorological Office (IMO), who has consulted with Pálmadóttir on a few occasions. "There is so much going on, and we're just trying to catch up and be prepared."

Pálmadóttir's concept assumes that many years from now, volcanologists will be able to forecast lava flow accurately enough for cities to plan on using it in building. They will know when and where to dig trenches so that when a volcano erupts, the lava will flow into them and solidify into either walls or foundations. Today, forecasting lava flows is a complex science that requires remote sensing technology and tremendous amounts of computational power to run simulations on supercomputers.
The IMO typically runs two simulations for every new eruption—one based on data from previous eruptions, and another based on additional data acquired shortly after the eruption (from various sources like specially outfitted planes). With every event, the team accumulates more data, which makes the simulations of lava flow more accurate. Pedersen says there is much research yet to be done, but she expects "a lot of advancement" in the next 10 years or so.

To design the speculative city of Eldborg for their film, Pálmadóttir and Skarphéðinsson used 3D-modeling software similar to what Pedersen uses for her simulations. The city is primarily built on a network of trenches that were filled with lava over the course of several eruptions, while buildings are constructed out of lava bricks. "We're going to let nature design the buildings that will pop up," says Pálmadóttir.

The aesthetic of the city they envision will be less modernist and more fantastical—a bit "like [Gaudi's] Sagrada Familia," says Pálmadóttir. But the aesthetic output is not really the point; the architects' goal is to galvanize architects today and spark an urgent discussion about the impact of climate change on our cities. She stresses the value of what can only be described as moonshot thinking. "I think it is important for architects not to be only in the present," she told me. "Because if we are only in the present, working inside the system, we won't change anything."

Pálmadóttir was born in 1972 in Húsavik, a town known as the whale-watching capital of Iceland. But she was more interested in space and technology and spent a lot of time flying with her father, a construction engineer who owned a small plane. She credits his job for the curiosity she developed about science and "how things were put together"—an inclination that proved useful later, when she started researching volcanoes. So was the fact that Icelanders "learn to live with volcanoes from birth."

At 21, she moved to Norway, where she spent seven years working in 3D visualization before returning to Reykjavik and enrolling in an architecture program at the Iceland University of the Arts. But things didn't click until she moved to Barcelona for a master's degree at the Institute for Advanced Architecture of Catalonia. "I remember being there and feeling, finally, like I was in the exact right place," she says. Before, architecture had seemed like a commodity and architects like "slaves to investment companies," she says. Now, it felt like a path with potential.

Lava has proved to be a strong, durable building material, at least in its solid state. To explore its potential, Pálmadóttir and Skarphéðinsson envision a city built on a network of trenches that have filled with lava over the course of several eruptions, while buildings are constructed with lava bricks. COURTESY OF S.AP ARKITEKTAR

She returned to Reykjavik in 2009 and worked as an architect until she founded S.AP (for "studio Arnhildur Pálmadóttir") Arkitektar in 2018; her son started working with her in 2019 and officially joined her as an architect this year, after graduating from the Southern California Institute of Architecture. In 2021, the pair witnessed their first eruption up close, near the Fagradalsfjall volcano on the Reykjanes Peninsula. It was there that Pálmadóttir became aware of the sheer quantity of material coursing through the planet's veins, and the potential to divert it into channels.
Lava has already proved to be a strong, long-lasting building material—at least in its solid state. When it cools, it solidifies into volcanic rock like basalt or rhyolite. The type of rock depends on the composition of the lava, but basaltic lava—like the kind found in Iceland and Hawaii—forms one of the hardest rocks on Earth, which means that structures built from this type of lava would be durable and resilient. For years, architects in Mexico, Iceland, and Hawaii (where lava is widely available) have built structures out of volcanic rock. But quarrying that rock is an energy-intensive process that requires heavy machines to extract, cut, and haul it, often across long distances, leaving a big carbon footprint. Harnessing lava in its molten state, however, could unlock new methods for sustainable construction.

Jeffrey Karson, a professor emeritus at Syracuse University who specializes in volcanic activity and who cofounded the Syracuse University Lava Project, agrees that lava is abundant enough to warrant interest as a building material. To understand how it behaves, Karson has spent the past 15 years performing over a thousand controlled lava pours from giant furnaces. If we figure out how to build up its strength as it cools, he says, "that stuff has a lot of potential."

In his research, Karson found that inserting metal rods into the lava flow helps reduce the kind of uneven cooling that would lead to thermal cracking—and therefore makes the material stronger (a bit like rebar in concrete). Like glass and other molten materials, lava behaves differently depending on how fast it cools. When glass or lava cools slowly, crystals start forming, strengthening the material. Replicating this process—perhaps in a kiln—could slow down the rate of cooling and let the lava become stronger. This kind of controlled cooling is "easy to do on small things like bricks," says Karson, so "it's not impossible to make a wall."

Pálmadóttir is clear-eyed about the challenges before her. She knows the techniques she and Skarphéðinsson are exploring may not lead to anything tangible in their lifetimes, but they still believe that the ripple effect the projects could create in the architecture community is worth pursuing. Both Karson and Pedersen caution that more experiments are necessary to study this material's potential.

For Skarphéðinsson, that potential transcends the building industry. More than 12 years ago, Icelanders voted that the island's natural resources, like its volcanoes and fishing waters, should be declared national property. That means any city built from lava flowing out of these volcanoes would be controlled not by deep-pocketed individuals or companies, but by the nation itself. (The referendum was considered illegal almost as soon as it was approved by voters and has since stalled.) For Skarphéðinsson, the Lavaforming project is less about the material than about the "political implications that get brought to the surface with this material."

"That is the change I want to see in the world," he says. "It could force us to make radical changes and be a catalyst for something"—perhaps a social megalopolis where citizens have more say in how resources are used and profits are shared more evenly.

Cynics might dismiss the idea of harnessing lava as pure folly. But the more I spoke with Pálmadóttir, the more convinced I became.
It wouldn't be the first time in modern history that a seemingly dangerous idea (for example, drilling into scalding pockets of underground hot springs) proved revolutionary. Once entirely dependent on oil, Iceland today obtains 85% of its electricity and heat from renewable sources. "[My friends] probably think I'm pretty crazy, but they think maybe we could be clever geniuses," she told me with a laugh. Maybe she is a little bit of both.

Elissaveta M. Brandon is a regular contributor to Fast Company and Wired.
-
WWW.TECHNOLOGYREVIEW.COM
Love or immortality: A short story

1. Sophie and Martin are at the 2012 Gordon Research Conference on the Biology of Aging in Ventura, California. It is a foggy February weekend. Both are disappointed about how little sun there is on the California beach. They are two graduate students—Sophie in her sixth and final year, Martin in his fourth—who have traveled from different East Coast cities to present posters on their work. Martin's shows health data collected from supercentenarians compared with the general Medicare population, capturing the diseases that are less and more common in the populations. Sophie is presenting on her recently accepted first-author paper in Aging Cell on two specific genes that, when activated, extend lifespan in C. elegans roundworms, the model organism of her research.

2. Sophie walks by Martin's poster after she is done presenting her own. She is not immediately impressed by his work. It is not published, for one thing. But she sees how it is attention-grabbing and relevant, even necessary. He has a little crowd listening to him. He notices her—a frowning girl—standing in the back and begins to talk louder, hoping she hears. "Supercentenarians are much less likely to have seven diseases," he says, pointing to his poster. "Alzheimer's, heart failure, diabetes, depression, prostate cancer, hip fracture, and chronic kidney disease. Though they have higher instances of four diseases, which are arthritis, cataracts, osteoporosis, and glaucoma. These aren't linked to mortality, but they do affect quality of life." What stands out to Sophie is the confidence in Martin's voice, despite the unsurprising nature of the findings. She admires that sound, its sturdiness. She makes note of his name and plans to seek him out.

3. They find one another in the hotel bar among other graduate students. The students are talking about the logistics of their futures: Who is going for a postdoc, who will opt for industry, do any have job offers already, where will their research have the most impact, is it worth spending years working toward something so uncertain? They stay up too late, dissecting journal articles they've read as if they were debating politics. They enjoy the freedom away from their labs and PIs. Martin says, again with that confidence, that he will become a professor. Sophie says she likely won't go down that path. She has received an offer to start as a scientist at an aging research startup called Abyssinian Bio, after she defends. Martin says, "Wouldn't your work make more sense in an academic setting, where you have more freedom and power over what you do?" She says, "But that could be years from now and I want to start my real life, so …"

4-18. Martin is enamored with Sophie. She is not only brilliant; she is helpful. She strengthens his papers with precise edits and grounds his arguments with stronger evidence. Sophie is enamored with Martin. He is not only ambitious; he is supportive and adventurous. He encourages her to try new activities and tools, both in and out of work, like learning to ride a motorcycle or using CRISPR. Martin visits Sophie in San Francisco whenever he can, which amounts to a weekend or two every other month. After two years, their long-distance relationship is taking its toll. They want more weekends, more months, more everything together. They make plans for him to get a postdoc near her, but after multiple rejections from the labs where he most wants to work, his resentment toward academia grows.
“They don’t see the value of my work,” he says.

19. "Join Abyssinian," Sophie offers. The company is growing. They want more researchers with data science backgrounds. He takes the job, drawn more by their future together than by the science.

20-35. For a long time, they are happy. They marry. They do their research. They travel. Sophie visits Martin's extended family in France. Martin goes with Sophie to her cousin's wedding in Taipei. They get a dog. The dog dies. They are both devastated but increasingly motivated to better understand the mechanisms of aging. Maybe their next dog will have the opportunity to live longer. They do not get a next dog. Sophie moves up at Abyssinian. Despite being in industry, her work is published in well-respected journals. She collaborates well with her colleagues. Eventually, she is promoted to executive director of research. Martin stalls at the rank of principal scientist, and though Sophie is technically his boss—or his boss's boss—he genuinely doesn't mind when others call him "Dr. Sophie Xie's husband."

40. At dinner on his 35th birthday, a friend jokes that Martin is now middle-aged. Sophie laughs and agrees, though she is older than Martin. Martin joins in the laughter, but this small comment unlocks a sense of urgency inside him. What once felt hypothetical—his own death, the death of his wife—now appears very close. He can feel his wrinkles forming. First come the subtle shifts in how he talks about his research and Abyssinian's work. He wants to "defeat" and "obliterate" aging, which he comes to describe as humankind's "greatest adversary."

43. He begins taking supplements touted by tech influencers. He goes on a calorie-restricted diet. He gets weekly vitamin IV sessions. He looks into blood transfusions from young donors, but Sophie tells him to stop with all the fake science. She says he's being ridiculous, that what he's doing could be dangerous. Martin, for the first time, sees Sophie differently. Not without love, but love burdened by an opposing weight, what others might recognize as resentment. Sophie is dedicated to the demands of her growing department. Martin thinks she is not taking the task of living longer seriously enough. He does not want her to die. He does not want to die. Nobody at Abyssinian is taking the task of living longer seriously enough. Of all the aging bio startups he could have ended up at, how has he ended up at one with such modest—no, lazy—goals? He begins publicly dismissing basic research as "too slow" and "too limited," which offends many of his and Sophie's colleagues. Sophie defends him, says he is still doing good work, despite the evidence. She is busy, traveling often for conferences, and mistakenly misclassifies the changes in Martin's attitude as temporary outliers.

44. One day, during a meeting, Martin says to Jerry, a well-respected scientist at Abyssinian and in the electron microscopy imaging community at large, that EM is an outdated, old, crusty technology. Martin says it is stupid to use it when there are more advanced, cutting-edge methods, like cryo-EM and super-resolution microscopy. Martin has always been outspoken, but this instance veers into rudeness. At home, Martin and Sophie argue. Initially, they argue about whether tools of the past can be useful to their work. Then the argument morphs. What is the true purpose of their research? Martin says it's called anti-aging research for a reason: It's to defy aging!
Sophie says she's never called her work anti-aging research; she calls it aging research or research into the biology of aging. And Abyssinian's overarching mission is more simply to find druggable targets for chronic and age-related diseases. Occasionally, the company's marketing arm will push out messaging about extending the human lifespan by 20 years, but that has nothing to do with scientists like them in R&D. Martin seethes. Only 20 years! What about hundreds? Thousands?

45-49. They continue to argue and the arguments are roundabout, typically ending with Sophie crying, absconding to her sister's house, and the two of them not speaking for short periods of time.

50. What hurts Sophie most is Martin's persistent dismissal of death as merely an engineering problem to be solved. Sophie thinks of the ways the C. elegans she observes regulate their lifespans in response to environmental stress. The complex dance of genes and proteins that orchestrates their aging process. In the previous month's experiment, a seemingly simple mutation produced unexpected effects across three generations of worms. Nature's complexity still humbles her daily. There is still so much unknown. Martin is at the kitchen counter, methodically crushing his evening supplements into powder. "I'm trying to save humanity. And all you want to do is sit in the lab to watch worms die."

50. Martin blames the past. He realizes he should have tried harder to become a professor. Let Sophie make the industry money—he could have had academic clout. Professor Warwick. It would have had a nice sound to it. To his dismay, everyone in his lab calls him Martin. Abyssinian has a first-name policy. Something about flat hierarchies making for better collaboration. Good ideas could come from anyone, even a lowly, unintelligent senior associate scientist in Martin's lab who barely understands how to process a data set. A great idea could come from anyone at all—except him, apparently. Sophie has made that clear.

51-59. They live in a tenuous peace for some time, perfecting the art of careful scheduling: separate coffee times, meetings avoided, short conversations that stick to the day-to-day facts of their lives.

60. Then Martin stands up to interrupt a presentation by the VP of research to announce that studying natural aging is pointless since they will soon eliminate it entirely. While Jerry may have shrugged off Martin's aggressiveness, the VP does not. This leads to a blowout fight between Martin and many of his colleagues, in which Martin refuses to apologize and calls them all shortsighted idiots. Sophie watches with a mixture of fear and awe. Martin thinks: Can't she, my wife, just side with me this once?

61. Back at home: Martin at the kitchen counter, methodically crushing his evening supplements into powder. "I'm trying to save humanity." He taps the powder into his protein shake with the precision of a scientist measuring reagents. "And all you want to do is sit in the lab to watch worms die." Sophie observes his familiar movements, now foreign in their desperation. The kitchen light catches the silver spreading at his temples and on his chin—the very evidence of aging he is trying so hard to erase. "That's not true," she says. Martin gulps down his shake. "What about us? What about children?" Martin coughs, then laughs, a sound that makes Sophie flinch. "Why would we have children now? You certainly don't have the time.
But if we solve aging, which I believe we can, we'd have all the time in the world."

"We used to talk about starting a family."

"Any children we have should be born into a world where we already know they never have to die."

"We could both make the time. I want to grow old together—"

All Martin hears are promises that lead to nothing, nowhere. "You want us to deteriorate? To watch each other decay?"

"I want a real life."

"So you're choosing death. You're choosing limitation. Mediocrity."

64. Martin doesn't hear from his wife for four days, despite texting her 16 times—12 too many, by his count. He finally breaks down enough to call her in the evening, after a couple of glasses of aged whisky (a gift from a former colleague, which Martin has rarely touched and kept hidden in the far back of a desk drawer). Voicemail. And after this morning's text, still no glimmering ellipsis bubble to indicate Sophie's typing.

66. Forget her, he thinks, leaning back in his Steelcase chair, adjusted specifically for his long runner's legs and shorter-than-average torso. At 39, Martin's spreadsheets of vitals now show an upward trajectory; proof of his ability to reverse his biological age. Sophie does not appreciate this. He stares out his office window, down at the employees crawling around Abyssinian Bio's main quad. How small, he thinks. How significantly unaware of the future's true possibilities. Sophie is like them.

67. Forget her, he thinks again as he turns down a bay toward Robert, one of his struggling postdocs, who is sitting at his bench staring at his laptop. As Martin approaches, Robert minimizes several windows, leaving only his home screen behind. "Where are you at with the NAD+ data?" Martin asks. Robert shifts in his chair to face Martin. The skin of his neck grows red and splotchy. Martin stares at it in disgust. "Well?" he asks again. "Oh, I was told not to work on that anymore?" The boy has a tendency to speak in the lilt of questions. "By who?" Martin demands. "Uh, Sophie?" "I see. Well, I expect new data by end of day." "Oh, but—" Martin narrows his eyes. The red splotches on Robert's neck grow larger. "Um, okay," the boy says, returning his focus to the computer. Martin decides a response is called for …

70. Immortality Promise

I am immortal. This doesn't make me special. In fact, most people on Earth are immortal. I am 6,000 years old. Now, 6,000 years of existence give one a certain perspective. I remember back when genetic engineering and knowledge about the processes behind aging were still in their infancy. Oh, how people argued and protested. "It's unethical!" "We'll kill the Earth if there's no death!" "Immortal people won't be motivated to do anything! We'll become a useless civilization living under our AI overlords!" I believed back then, and now I know. Their concerns had no ground to stand on.

Eternal life isn't even remarkable anymore, but being among its architects and early believers still garners respect from the world. The elegance of my team's solution continues to fill me with pride. We didn't just halt aging; we mastered it. My cellular machinery hums with an efficiency that would make evolution herself jealous. Those early protesters—bless their mortal, no-longer-beating hearts—never grasped the biological imperative of what we were doing. Nature had already created functionally immortal organisms—the hydra, certain jellyfish species, even some plants. We simply perfected what evolution had sketched out.
The supposed ethical concerns melted away once people understood that we weren't defying nature. We were fulfilling its potential. Today, those who did not want to be immortal aren't around. Simple as that. Those who are here do care about the planet more than ever! There are almost no diseases, and we're all very productive people. Young adults—or should I say young-looking adults—are naturally restless and energetic. And with all this life, you have the added benefit of not wasting your time on a career you might hate! You get to try different things and find out what you're really good at and where you're appreciated! Life is not short! Resources are plentiful!

Of course, biological immortality doesn't equal invincibility. People still die. Just not very often. My colleagues in materials science developed our modern protective exoskeletons. They're elegant solutions, though I prefer to rely on my enhanced reflexes and reinforced skeletal structure most days. The population concerns proved mathematically unfounded. Stable reproduction rates emerged naturally once people realized they had unlimited time to start families. I've had four sets of children across 6,000 years, each born when I felt truly ready to pass on another iteration of my accumulated knowledge. With more life, people have much more patience.

Now we are on to bigger and more ambitious projects. We conquered survival of individuals. The next step: survival of our species in this universe. The sun's eventual death poses an interesting challenge, but nothing we can't handle. We have colonized five planets and two moons in our solar system, and we will colonize more. Humanity will adapt to whatever environment we encounter. That's what we do.

My ancient motorcycle remains my favorite indulgence. I love taking it for long cruises on the old Earth roads that remain intact. The neural interface is state-of-the-art, of course. But mostly I keep it because it reminds me of earlier times, when we thought death was inevitable and life was limited to a single planet. The future stretches out before us like an infinity I helped create—yet another masterpiece in the eternal gallery of human evolution.

71. Martin feels better after writing it out. He rereads it a couple times, feels even better. Then he has the idea to send his writing to the department administrator. He asks her to create a new tab on his lab page, titled "Immortality Promise," and to post his piece there. That will get his message across to Sophie and everyone at Abyssinian.

72. Sophie's boss, Ray, is the first to email her. The subject line: "martn" [sic]. No further words in the body. Ray is known to be short and blunt in all his communications, but his meaning is always clear. They've had enough conversations about Martin by then. She is already in the process of slowly shutting down his projects, has been ignoring his texts and calls because of this. Now she has to move even faster.

73. Sophie leaves her office and goes into the lab. As an executive, she is not expected to do experiments, but watching a thousand tiny worms crawl across their agar plates soothes her. Each of the ones she now looks at carries a fluorescent marker she designed to track mitochondrial dynamics during aging. The green glow pulses with their movements, like stars blinking in a microscopic galaxy. She spent years developing this strain of C. elegans, carefully selecting for longevity without sacrificing health.
The worms that lived longest weren't always the healthiest—a truth about aging that seemed to elude Martin. Those worms taught her more about the genuine complexity of aging. Just last week, she observed something unexpected: The mitochondrial networks in her long-lived strains showed subtle patterns of reorganization never documented before. The discovery felt intimate, like being trusted with a secret.

"How are things looking?" Jerry appears beside her. "That new strain expressing the dual markers?"

Sophie nods, adjusting the focus. "Look at this network pattern. It's different from anything in the literature." She shifts aside so Jerry can see. This is what she loves about science: the genuine puzzles, the patient observation, the slow accumulation of knowledge that, while far removed from a specific application, could someday help people age with dignity.

"Beautiful," Jerry murmurs. He straightens. "I heard about Martin's … post."

Sophie closes her eyes for a moment, the image of the mitochondrial networks still floating in her vision. She's read Martin's "Immortality Promise" piece three times, each more painful than the last. Not because of its grandiose claims—those were comically disconnected from reality—but because of what it's revealed about her husband. The writing pulsed with a frightening certainty, a complete absence of doubt or wonder. Gone was the scientist who once spent many lively evenings debating with her about the evolutionary purpose of aging, who delighted in being proved wrong because it meant learning something new.

74. She sees in his words a man who has abandoned the fundamental principles of science. His piece reads like a religious text or science fiction story, casting himself as the hero. He isn't pursuing research anymore. He hasn't been for a long time. She wonders how and when he arrived there. The change in Martin didn't take place overnight. It was gradual, almost imperceptible—not unlike watching someone age. It wasn't easy to notice if you saw the person every day; Sophie feels guilty for not noticing. Then again, she read a new study out a few months ago from Stanford researchers that found people do not age linearly but in spurts—specifically, around 44 and 60. Shifts in the body lead to sudden accelerations of change. If she's honest with herself, she knew this was happening to Martin, to their relationship. But she chose to ignore it, give other problems precedence. Now it is too late. Maybe if she'd addressed the conditions right before the spike—but how? wasn't it inevitable?—he would not have gone from scientist to fanatic.

75. "You're giving the keynote at next month's Gordon conference," Jerry reminds her, pulling her back to reality. "Don't let this overshadow that." She manages a small smile. Her work has always been methodical, built on careful observation and respect for the fundamental mysteries of biology. The keynote speech represents more than five years of research: countless hours of guiding her teams, of exciting discussions among her peers, of watching worms age and die, of documenting every detail of their cellular changes. It is one of the biggest honors of her career. There is poetry in it, she thinks—in the collisions between discoveries and failures.

76. The knock on her office door comes at 2:45. Linda from HR, right on schedule. Sophie walks with her to conference room B2, two floors below, where Martin's group resides. Through the glass walls of each lab, they see scientists working at their benches.
One adjusts a microscope's focus. Another pipettes clear liquid into rows of tubes. Three researchers point at data on a screen. Each person is investigating some aspect of aging, one careful experiment at a time. The work will continue, with or without Martin.

In the conference room, Sophie opens her laptop and pulls up the folder of evidence. She has been collecting it for months: Martin's emails to colleagues, complaints from collaborators and direct reports, and finally, his "Immortality Promise" piece. The documentation is thorough, organized chronologically. She has labeled each file with dates and brief descriptions, as she would for any other data.

77. Martin walks in at 3:00. Linda from HR shifts in her chair. Sophie is the one to hand the papers over to Martin; this much she owes him. They contain words like "termination" and "effective immediately." Martin's face complicates itself when he looks them over. Sophie hands over a pen and he signs quickly. He stands, adjusts his shirt cuffs, and walks to the door. He turns back. "I'll prove you wrong," he says, looking at Sophie. But what stands out to her is the crack in his voice on the last word. Sophie watches him leave. She picks up the signed papers and hands them to Linda, and then walks out herself.

Alexandra Chang is the author of Days of Distraction and Tomb Sweeping and is a National Book Foundation 5 Under 35 honoree. She lives in Camarillo, California.