• Ah, the audiovisual world—a realm where influencers are the new directors and AI is our not-so-creative assistant. What a time to be alive! As we ponder what's left of the industry, let's ask ourselves: Are we creating art, or just scrolling through endless TikTok dances while our souls slowly evaporate? Platforms are competing for our fleeting attention, and we’re all just waiting for the next viral trend.

    So, what’s the current state of the audiovisual sector? Maybe it’s just a fancy way of saying we’re drowning in content while trying to remember the last time we watched something that wasn’t a meme. But hey, at least we can blame piracy for our poor taste, right?

    #Audiovisual #Influencer
    GRAFFICA.INFO
    Survey: What's going on with the audiovisual sector? Help us sketch the current state of the industry
    How has the audiovisual sector changed in recent years? Which platforms dominate our time? How do we work with video, animation, or sound in graphic design? What role do influencers, artificial intelligence, or piracy play in all of this…
  • OpenAI: The power and the pride

    In April, Paul Graham, the founder of the tech startup accelerator Y Combinator, sent a tweet in response to former YC president and current OpenAI CEO Sam Altman. Altman had just bid a public goodbye to GPT-4 on X, and Graham had a follow-up question. 

    “If you had [GPT-4’s model weights] etched on a piece of metal in the most compressed form,” Graham wrote, referring to the values that determine the model’s behavior, “how big would the piece of metal have to be? This is a mostly serious question. These models are history, and by default digital data evaporates.”

    There is no question that OpenAI pulled off something historic with its release of ChatGPT 3.5 in 2022. It set in motion an AI arms race that has already changed the world in a number of ways and seems poised to have an even greater long-term effect than the short-term disruptions to things like education and employment that we are already beginning to see. How that turns out for humanity is something we are still reckoning with and may be for quite some time. But a pair of recent books both attempt to get their arms around it with accounts of what two leading technology journalists saw at the OpenAI revolution. 

    In Empire of AI: Dreams and Nightmares in Sam Altman’s OpenAI, Karen Hao tells the story of the company’s rise to power and its far-reaching impact all over the world. Meanwhile, The Optimist: Sam Altman, OpenAI, and the Race to Invent the Future, by the Wall Street Journal’s Keach Hagey, homes in more on Altman’s personal life, from his childhood through the present day, in order to tell the story of OpenAI. Both paint complex pictures and show Altman in particular as a brilliantly effective yet deeply flawed creature of Silicon Valley—someone capable of always getting what he wants, but often by manipulating others. 

    Hao, who was formerly a reporter with MIT Technology Review, began reporting on OpenAI while at this publication and remains an occasional contributor. One chapter of her book grew directly out of that reporting. And in fact, as Hao says in the acknowledgments of Empire of AI, some of her reporting for MIT Technology Review, a series on AI colonialism, “laid the groundwork for the thesis and, ultimately, the title of this book.” So you can take this as a kind of disclaimer that we are predisposed to look favorably on Hao’s work. 

    With that said, Empire of AI is a powerful work, bristling not only with great reporting but also with big ideas. This comes across in service to two main themes. 

    The first is simple: It is the story of ambition overriding ethics. The history of OpenAI as Hao tells it (and as Hagey does too) is very much a tale of a company that was founded on the idealistic desire to create a safety-focused artificial general intelligence but instead became more interested in winning. This is a story we’ve seen many times before in Big Tech. See Theranos, which was going to make diagnostics easier, or Uber, which was founded to break the cartel of “Big Taxi.” But the closest analogue might be Google, which went from “Don’t be evil” to (at least in the eyes of the courts) illegal monopolist. For that matter, consider how Google went from holding off on releasing its language model as a consumer product out of an abundance of caution to rushing a chatbot out the door to catch up with and beat OpenAI. In Silicon Valley, no matter what one’s original intent, it always comes back to winning.

    The second theme is more complex and forms the book’s thesis about what Hao calls AI colonialism. The idea is that the large AI companies act like traditional empires, siphoning wealth from the bottom rungs of society in the forms of labor, creative works, raw materials, and the like to fuel their ambition and enrich those at the top of the ladder. “I’ve found only one metaphor that encapsulates the nature of what these AI power players are: empires,” she writes.

    “During the long era of European colonialism, empires seized and extracted resources that were not their own and exploited the labor of the people they subjugated to mine, cultivate, and refine those resources for the empires’ enrichment.” She goes on to chronicle her own growing disillusionment with the industry. “With increasing clarity,” she writes, “I realized that the very revolution promising to bring a better future was instead, for people on the margins of society, reviving the darkest remnants of the past.” 

    To document this, Hao steps away from her desk and goes out into the world to see the effects of this empire as it sprawls across the planet. She travels to Colombia to meet with data labelers tasked with teaching AI what various images show, one of whom she describes sprinting back to her apartment for the chance to make a few dollars. She documents how workers in Kenya who performed data-labeling content moderation for OpenAI came away traumatized by seeing so much disturbing material. In Chile she documents how the industry extracts precious resources—water, power, copper, lithium—to build out data centers. 

    She lands on the ways people are pushing back against the empire of AI across the world. Hao draws lessons from New Zealand, where Maori people are attempting to save their language using a small language model of their own making. Trained on volunteers’ voice recordings and running on just two graphics processing units, or GPUs, rather than the thousands employed by the likes of OpenAI, it’s meant to benefit the community, not exploit it. 

    Hao writes that she is not against AI. Rather: “What I reject is the dangerous notion that broad benefit from AI can only be derived from—indeed will ever emerge from—a vision of the technology that requires the complete capitulation of our privacy, our agency, and our worth, including the value of our labor and art, toward an ultimately imperial centralization project … [The New Zealand model] shows us another way. It imagines how AI could be exactly the opposite. Models can be small and task-specific, their training data contained and knowable, ridding the incentives for widespread exploitative and psychologically harmful labor practices and the all-consuming extractivism of producing and running massive supercomputers.”

    Hagey’s book is more squarely focused on Altman’s ambition, which she traces back to his childhood. Yet interestingly, she also zeroes in on the OpenAI CEO’s attempt to create an empire. Indeed, “Altman’s departure from YC had not slowed his civilization-building ambitions,” Hagey writes. She goes on to chronicle how Altman, who had previously mulled a run for governor of California, set up experiments with income distribution via Tools for Humanity, the parent company of Worldcoin. She quotes Altman saying of it, “I thought it would be interesting to see … just how far technology could accomplish some of the goals that used to be done by nation-states.”

    Overall, The Optimist is the more straightforward business biography of the two. Hagey has packed it full with scoops and insights and behind-the-scenes intrigue. It is immensely readable as a result, especially in the second half, when OpenAI really takes over the story. Hagey also seems to have been given far more access to Altman and his inner circles, personal and professional, than Hao did, and that allows for a fuller telling of the CEO’s story in places. For example, both writers cover the tragic story of Altman’s sister Annie, her estrangement from the family, and her accusations in particular about suffering sexual abuse at the hands of Sam (something he and the rest of the Altman family vehemently deny). Hagey’s telling provides a more nuanced picture of the situation, with more insight into family dynamics.

    Hagey concludes by describing Altman’s reckoning with his role in the long arc of human history and what it will mean to create a “superintelligence.” His place in that sweep is something that clearly has consumed the CEO’s thoughts. When Paul Graham asked about preserving GPT-4, for example, Altman had a response at the ready. He replied that the company had already considered this, and that the sheet of metal would need to be 100 meters square.
    #openai #power #pride
    WWW.TECHNOLOGYREVIEW.COM
    OpenAI: The power and the pride
  • Texas is headed for a drought—but lawmakers won’t do the one thing necessary to save its water supply

    LUBBOCK — Every winter, after the sea of cotton has been harvested in the South Plains and the ground looks barren, technicians with the High Plains Underground Water Conservation District check the water levels in nearly 75,000 wells across 16 counties.

    For years, their measurements have shown what farmers and water conservationists fear most—the Ogallala Aquifer, an underground water source that’s the lifeblood of the South Plains agriculture industry, is running dry.

    That’s because of a century-old law called the rule of capture.

    The rule is simple: If you own the land above an aquifer in Texas, the water underneath is yours. You can use as much as you want, as long as it’s not wasted or taken maliciously. The same applies to your neighbor. If they happen to use more water than you, then that’s just bad luck.

    To put it another way, landowners can mostly pump as much water as they choose without facing liability to surrounding landowners whose wells might be depleted as a result.

    Following the Dust Bowl—and to stave off catastrophe—state lawmakers created groundwater conservation districts in 1949 to protect what water is left. But their power to restrict landowners is limited.

    “The mission is to save as much water possible for as long as possible, with as little impact on private property rights as possible,” said Jason Coleman, manager for the High Plains Underground Water Conservation District. “How do you do that? It’s a difficult task.”

    A 1953 map of the wells in Lubbock County hangs in the office of the groundwater district.

    Rapid population growth, climate change, and aging water infrastructure all threaten the state’s water supply. Texas does not have enough water to meet demand if the state is stricken with a historic drought, according to the Texas Water Development Board, the state agency that manages Texas’ water supply.

    Lawmakers want to invest in every corner to save the state’s water. This week, they reached a historic $20 billion deal on water projects.

    High Plains Underground Water District General Manager Jason Coleman stands in the district’s meeting room on May 21 in Lubbock.

    But no one wants to touch the rule of capture. In a state known for rugged individualism, politically speaking, reforming the law is tantamount to stripping away freedoms.

    “There probably are opportunities to vest groundwater districts with additional authority,” said Amy Hardberger, director for the Texas Tech University Center for Water Law and Policy. “I don’t think the political climate is going to do that.”

    State Sen. Charles Perry, a Lubbock Republican, and Rep. Cody Harris, a Palestine Republican, led the effort on water in Austin this year. Neither responded to requests for comment.

    Carlos Rubinstein, a water expert with consulting firm RSAH2O and a former chairman of the water development board, said the rule has been relied upon so long that it would be near impossible to undo the law.

    “I think it’s better to spend time working within the rules,” Rubinstein said. “And respect the rule of capture, yet also recognize that, in and of itself, it causes problems.”

    Even though groundwater districts were created to regulate groundwater, the law effectively stops them from doing so, or they risk major lawsuits. The state water plan, which spells out how the state’s water is to be used, acknowledges the shortfall. Groundwater availability is expected to decline by 25% by 2070, mostly due to reduced supply in the Ogallala and Edwards-Trinity aquifers. Together, the aquifers stretch across West Texas and up through the Panhandle.

    By itself, the Ogallala has an estimated three trillion gallons of water, though the overwhelming majority of it in Texas is used by farmers. It’s expected to face a 50% decline by 2070.

    Groundwater is 54% of the state’s total water supply and is the state’s most vulnerable natural resource. It’s created by rainfall and other precipitation, and seeps into the ground. Like surface water, groundwater is heavily affected by ongoing droughts and prolonged heat waves. However, the state has more say in regulating surface water than it does groundwater. Surface water laws have provisions that cut supply to newer users in a drought and prohibit transferring surface water outside of basins.

    Historically, groundwater has been used by agriculture in the High Plains. However, as surface water evaporates at a quicker clip, cities and businesses are increasingly interested in tapping the underground resource. As Texas’ population continues to grow and surface water declines, groundwater will be the prize in future fights for water.

    In many ways, the damage is done in the High Plains, a region that spans from the top of the Panhandle down past Lubbock. The Ogallala Aquifer runs beneath the region, and it’s faced depletion to the point of no return, according to experts. Simply put: The Ogallala is not refilling to keep up with demand.

    “It’s a creeping disaster,” said Robert Mace, executive director of the Meadows Center for Water and the Environment. “It isn’t like you wake up tomorrow and nobody can pump anymore. It’s just happening slowly, every year.”

    Groundwater districts and the law

    The High Plains Water District was the first groundwater district created in Texas.

    Over a protracted multi-year fight, the Legislature created these new local government bodies in 1949, with voter approval, enshrining the new stewards of groundwater into the state Constitution.

    If the lawmakers hoped to embolden local officials to manage the troves of water under the soil, they failed. There are areas with groundwater that don’t have conservation districts. Each groundwater district has different powers. In practice, most water districts permit wells and make decisions on spacing and location to meet the needs of the property owner.

    The one thing all groundwater districts have in common: They stop short of telling landowners they can’t pump water.

    In the seven decades since groundwater districts were created, a series of lawsuits has effectively strangled them. Even as water levels decline from use and drought, districts still get regular requests for new wells. They won’t say no out of fear of litigation.

    The field technician coverage area is seen in Nathaniel Bibbs’ office at the High Plains Underground Water District. Bibbs is a permit assistant for the district.

    “You have a host of different decisions to make as it pertains to management of groundwater,” Coleman said. “That list has grown over the years.”

    The possibility of lawsuits makes groundwater districts hesitant to regulate usage or put limitations on new well permits. Groundwater districts have to defend themselves in lawsuits, and most lack the resources to do so.

    A well spacing guide is seen in Nathaniel Bibbs’ office.

    “The law works against us in that way,” Hardberger, with Texas Tech University, said. “It means one large tool in our toolbox, regulation, is limited.”

    The most recent example is a lawsuit between the Braggs Farm and the Edwards Aquifer Authority. The farm requested permits for two pecan orchards in Medina County, outside San Antonio. The authority granted only one and limited how much water could be used based on state law.

    It wasn’t an arbitrary decision. The authority said it followed the statute set by the Legislature to determine the permit.

    “That’s all they were guaranteed,” said Gregory Ellis, the first general manager of the authority, referring to the water available to the farm.

    The Braggs family filed a takings lawsuit against the authority. This kind of claim can be filed when any level of government—including groundwater districts—takes private property for public use without paying for the owner’s losses.

    Braggs won. It is the only successful water-related takings claim in Texas, and it made groundwater laws murkier. It cost the authority million.

    “I think it should have been paid by the state Legislature,” Ellis said. “They’re the ones who designed that permitting system. But that didn’t happen.”

    An appeals court upheld the ruling in 2013, and the Texas Supreme Court denied petitions to consider appeals. However, the state’s supreme court has previously suggested the Legislature could enhance the powers of the groundwater districts and regulate groundwater like surface water, just as many other states have done.

    While the laws are complicated, Ellis said the fundamental rule of capture has benefits. It has saved Texas’ legal system from a flurry of lawsuits between well owners.

    “If they had said ‘Yes, you can sue your neighbor for damaging your well,’ where does it stop?” Ellis asked. “Everybody sues everybody.”

    Coleman, the High Plains district’s manager, said some people want groundwater districts to have more power, while others think they have too much. Well owners want restrictions for others, but not on them, he said.

    “You’re charged as a district with trying to apply things uniformly and fairly,” Coleman said.

    Can’t reverse the past

    Two tractors were dropping seeds around Walt Hagood’s farm as he turned on his irrigation system for the first time this year. He didn’t plan on using much water. It’s too precious.

    The cotton farm stretches across 2,350 acres on the outskirts of Wolfforth, a town 12 miles southwest of Lubbock. Hagood irrigates about 80 acres of land, and prays that rain takes care of the rest.

    Walt Hagood drives across his farm on May 12, in Wolfforth. Hagood utilizes “dry farming,” a technique that relies on natural rainfall.

    “We used to have a lot of irrigated land with adequate water to make a crop,” Hagood said. “We don’t have that anymore.”

    The High Plains is home to cotton and cattle, multi-billion-dollar agricultural industries. The success is in large part due to the Ogallala. Since its discovery, the aquifer has helped farms around the region spring up through irrigation, a way for farmers to water their crops instead of waiting for rain that may not come. But as water in the aquifer declines, there are growing concerns that there won’t be enough water to support agriculture in the future.

    At the peak of irrigation development, more than 8.5 million acres were irrigated in Texas. About 65% of that was in the High Plains. In the decades since the irrigation boom, High Plains farmers have resorted to methods that might save water and keep their livelihoods afloat. They’ve changed their irrigation systems so water is used more efficiently. They grow cover crops so their soil is more likely to soak up rainwater. Some use apps to see where water is needed so it’s not wasted.

    A furrow irrigation system is seen at Walt Hagood’s cotton farm.

    Farmers who have not changed their irrigation systems might not have a choice in the near future. It can take a week to pump an inch of water in some areas from the aquifer because of how little water is left. As conditions change underground, they are forced to drill deeper for water. That causes additional problems. Calcium can build up, and the water is of poorer quality. And when the water is used to spray crops through a pivot irrigation system, it’s more of a humidifier as water quickly evaporates in the heat.

    According to the groundwater district’s most recent management plan, 2 million acres in the district use groundwater for irrigation. About 95% of water from the Ogallala is used for irrigated agriculture. The plan states that the irrigated farms “afford economic stability to the area and support a number of other industries.”

    The state water plan shows groundwater supply is expected to decline, and drought won’t be the only factor causing a shortage. Demand for municipal use outweighs irrigation use, reflecting the state’s future growth. In Region O, which is the South Plains, water for irrigation declines by 2070 while demand for municipal use rises because of population growth in the region.

    Coleman, with the High Plains groundwater district, often thinks about how the aquifer will hold up with future growth. There are some factors at play with water planning that are nearly impossible to predict and account for, Coleman said. Declining surface water could make groundwater a source for municipalities that didn’t depend on it before. Regions known for having big, open patches of land, like the High Plains, could be attractive to incoming businesses. People could move to the country and want to drill a well, with no understanding of water availability.

    The state will continue to grow, Coleman said, and all the incoming businesses and industries will undoubtedly need water.

    “We could say ‘Well, it’s no one’s fault. We didn’t know that factory would need 20,000 acre-feet of water a year,” Coleman said. “It’s not happening right now, but what’s around the corner?”

    Coleman said this puts agriculture in a tenuous position. The region is full of small towns that depend on agriculture and have supporting businesses, like cotton gins, equipment and feed stores, and pesticide and fertilizer sprayers. This puts pressure on the High Plains water district, along with the two regional water planning groups in the region, to keep agriculture alive.

    “Districts are not trying to reduce pumping down to a sustainable level,” said Mace with the Meadows Center. “And I don’t fault them for that, because doing that is economic devastation in a region with farmers.”

    Hagood, the cotton farmer, doesn’t think reforming groundwater rights is the way to solve it. What’s done is done, he said.

    “Our U.S. Constitution protects our private property rights, and that’s what this is all about,” Hagood said. “Any time we have a regulation and people are given more authority, it doesn’t work out right for everybody.”

    Rapid population growth, climate change, and aging water infrastructure all threaten the state’s water supply.

    What can be done

    The state water plan recommends irrigation conservation as a strategy. It’s also the least costly water management method.

    But that strategy is fraught. Farmers need to irrigate in times of drought, and telling them to stop can draw criticism.

    In Eastern New Mexico, the Ogallala Land and Water Conservancy, a nonprofit organization, has been retiring irrigation wells. Landowners keep their water rights, and the organization pays them to stop irrigating their farms. Landowners get paid every year as part of the voluntary agreement, and they can end it at any point.

    Ladona Clayton, executive director of the organization, said they have been criticized, with their efforts being called a “war” and “land grab.” They also get pushback on why the responsibility falls on farmers. She said it’s because of how much water is used for irrigation. They have to be aggressive in their approach, she said. The aquifer supplies water to the Cannon Air Force Base.

    “We don’t want them to stop agricultural production,” Clayton said. “But for me to say it will be the same level that irrigation can support would be untrue.”

    There is another possible lifeline that people in the High Plains are eyeing as a solution: the Dockum Aquifer. It’s a minor aquifer that underlies part of the Ogallala, so it would be accessible to farmers and ranchers in the region. The High Plains Water District also oversees this aquifer.

    If it seems too good to be true—that the most irrigated part of Texas would just so happen to have another abundant supply of water flowing underneath—it’s because there’s a catch. The Dockum is full of extremely salty brackish water. Some counties can use the water for irrigation and drinking water without treatment, but it’s unusable in others. According to the groundwater district, a test well in Lubbock County pulled up water that was as salty as seawater.

    Rubinstein, the former water development board chairman, said there are pockets of brackish groundwater in Texas that haven’t been tapped yet. It would be enough to meet the needs on the horizon, but it would also be very expensive to obtain and use. A landowner would have to go deeper to get it, then pump the water over a longer distance.

    “That costs money, and then you have to treat it on top of that,” Rubinstein said. “But, it is water.”

    Landowners have expressed interest in using desalination, a treatment method to lower dissolved salt levels. Desalination of produced and brackish water is one of the ideas that was being floated around at the Legislature this year, along with building a pipeline to move water across the state. Hagood, the farmer, is skeptical. He thinks whatever water they move could get used up before it makes it all the way to West Texas.

    There is always brackish groundwater. Another aquifer brings the chance of history repeating—if the Dockum aquifer is treated so its water is usable, will people drain it, too?

    Hagood said there would have to be limits.

    Disclosure: Edwards Aquifer Authority and Texas Tech University have been financial supporters of The Texas Tribune. Financial supporters play no role in the Tribune’s journalism. Find a complete list of them here.

    This article originally appeared in The Texas Tribune, a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.
    #texas #headed #droughtbut #lawmakers #wont
    WWW.FASTCOMPANY.COM
    Texas is headed for a drought—but lawmakers won’t do the one thing necessary to save its water supply
    LUBBOCK — Every winter, after the sea of cotton has been harvested in the South Plains and the ground looks barren, technicians with the High Plains Underground Water Conservation District check the water levels in nearly 75,000 wells across 16 counties. For years, their measurements have shown what farmers and water conservationists fear most—the Ogallala Aquifer, an underground water source that’s the lifeblood of the South Plains agriculture industry, is running dry. That’s because of a century-old law called the rule of capture. The rule is simple: If you own the land above an aquifer in Texas, the water underneath is yours. You can use as much as you want, as long as it’s not wasted or taken maliciously. The same applies to your neighbor. If they happen to use more water than you, then that’s just bad luck. To put it another way, landowners can mostly pump as much water as they choose without facing liability to surrounding landowners whose wells might be depleted as a result. Following the Dust Bowl—and to stave off catastrophe—state lawmakers created groundwater conservation districts in 1949 to protect what water is left. But their power to restrict landowners is limited. “The mission is to save as much water possible for as long as possible, with as little impact on private property rights as possible,” said Jason Coleman, manager for the High Plains Underground Water Conservation District. “How do you do that? It’s a difficult task.” A 1953 map of the wells in Lubbock County hangs in the office of the groundwater district. [Photo: Annie Rice for The Texas Tribune] Rapid population growth, climate change, and aging water infrastructure all threaten the state’s water supply. Texas does not have enough water to meet demand if the state is stricken with a historic drought, according to the Texas Water Development Board, the state agency that manages Texas’ water supply. Lawmakers want to invest in every corner to save the state’s water. This week, they reached a historic $20 billion deal on water projects. High Plains Underground Water District General Manager Jason Coleman stands in the district’s meeting room on May 21 in Lubbock. [Photo: Annie Rice for The Texas Tribune] But no one wants to touch the rule of capture. In a state known for rugged individualism, politically speaking, reforming the law is tantamount to stripping away freedoms. “There probably are opportunities to vest groundwater districts with additional authority,” said Amy Hardberger, director for the Texas Tech University Center for Water Law and Policy. “I don’t think the political climate is going to do that.” State Sen. Charles Perry, a Lubbock Republican, and Rep. Cody Harris, a Palestine Republican, led the effort on water in Austin this year. Neither responded to requests for comment. Carlos Rubinstein, a water expert with consulting firm RSAH2O and a former chairman of the water development board, said the rule has been relied upon so long that it would be near impossible to undo the law. “I think it’s better to spend time working within the rules,” Rubinstein said. “And respect the rule of capture, yet also recognize that, in and of itself, it causes problems.” Even though groundwater districts were created to regulate groundwater, the law effectively stops them from doing so, or they risk major lawsuits. The state water plan, which spells out how the state’s water is to be used, acknowledges the shortfall. 
Groundwater availability is expected to decline by 25% by 2070, mostly due to reduced supply in the Ogallala and Edwards-Trinity aquifers. Together, the aquifers stretch across West Texas and up through the Panhandle. By itself, the Ogallala has an estimated three trillion gallons of water, though the overwhelming majority in Texas is used by farmers. It’s expected to face a 50% decline by 2070. Groundwater is 54% of the state’s total water supply and is the state’s most vulnerable natural resource. It’s created by rainfall and other precipitation, and seeps into the ground. Like surface water, groundwater is heavily affected by ongoing droughts and prolonged heat waves. However, the state has more say in regulating surface water than it does groundwater. Surface water laws have provisions that cut supply to newer users in a drought and prohibit transferring surface water outside of basins. Historically, groundwater has been used by agriculture in the High Plains. However, as surface water evaporates at a quicker clip, cities and businesses are increasingly interested in tapping the underground resource. As Texas’ population continues to grow and surface water declines, groundwater will be the prize in future fights for water. In many ways, the damage is done in the High Plains, a region that spans from the top of the Panhandle down past Lubbock. The Ogallala Aquifer runs beneath the region, and it’s faced depletion to the point of no return, according to experts. Simply put: The Ogallala is not refilling to keep up with demand. “It’s a creeping disaster,” said Robert Mace, executive director of the Meadows Center for Water and the Environment. “It isn’t like you wake up tomorrow and nobody can pump anymore. It’s just happening slowly, every year.” [Image: Yuriko Schumacher/The Texas Tribune] Groundwater districts and the law The High Plains Water District was the first groundwater district created in Texas. Over a protracted multi-year fight, the Legislature created these new local government bodies in 1949, with voter approval, enshrining the new stewards of groundwater into the state Constitution. If the lawmakers hoped to embolden local officials to manage the troves of water under the soil, they failed. There are areas with groundwater that don’t have conservation districts. Each groundwater district has different powers. In practice, most water districts permit wells and make decisions on spacing and location to meet the needs of the property owner. The one thing all groundwater districts have in common: They stop short of telling landowners they can’t pump water. In the seven decades since groundwater districts were created, a series of lawsuits have effectively strangled groundwater districts. Even as water levels decline from use and drought, districts still get regular requests for new wells. They won’t say no out of fear of litigation. The field technician coverage area is seen in Nathaniel Bibbs’ office at the High Plains Underground Water District. Bibbs is a permit assistant for the district. [Photo: Annie Rice for The Texas Tribune] “You have a host of different decisions to make as it pertains to management of groundwater,” Coleman said. “That list has grown over the years.” The possibility of lawsuits makes groundwater districts hesitant to regulate usage or put limitations on new well permits. Groundwater districts have to defend themselves in lawsuits, and most lack the resources to do so. A well spacing guide is seen in Nathaniel Bibbs’ office. 
[Photo: Annie Rice for The Texas Tribune] “The law works against us in that way,” Hardberger, with Texas Tech University, said. “It means one large tool in our toolbox, regulation, is limited.” The most recent example is a lawsuit between the Braggs Farm and the Edwards Aquifer Authority. The farm requested permits for two pecan orchards in Medina County, outside San Antonio. The authority granted only one and limited how much water could be used based on state law. It wasn’t an arbitrary decision. The authority said it followed the statute set by the Legislature to determine the permit. “That’s all they were guaranteed,” said Gregory Ellis, the first general manager of the authority, referring to the water available to the farm. The Braggs family filed a takings lawsuit against the authority. This kind of claim can be filed when any level of government—including groundwater districts—takes private property for public use without paying for the owner’s losses. Braggs won. It is the only successful water-related takings claim in Texas, and it made groundwater laws murkier. It cost the authority $4.5 million. “I think it should have been paid by the state Legislature,” Ellis said. “They’re the ones who designed that permitting system. But that didn’t happen.” An appeals court upheld the ruling in 2013, and the Texas Supreme Court denied petitions to consider appeals. However, the state’s supreme court has previously suggested the Legislature could enhance the powers of the groundwater districts and regulate groundwater like surface water, just as many other states have done. While the laws are complicated, Ellis said the fundamental rule of capture has benefits. It has saved Texas’ legal system from a flurry of lawsuits between well owners. “If they had said ‘Yes, you can sue your neighbor for damaging your well,’ where does it stop?” Ellis asked. “Everybody sues everybody.” Coleman, the High Plains district’s manager, said some people want groundwater districts to have more power, while others think they have too much. Well owners want restrictions for others, but not on them, he said. “You’re charged as a district with trying to apply things uniformly and fairly,” Coleman said. Can’t reverse the past Two tractors were dropping seeds around Walt Hagood’s farm as he turned on his irrigation system for the first time this year. He didn’t plan on using much water. It’s too precious. The cotton farm stretches across 2,350 acres on the outskirts of Wolfforth, a town 12 miles southwest of Lubbock. Hagood irrigates about 80 acres of land, and prays that rain takes care of the rest. Walt Hagood drives across his farm on May 12, in Wolfforth. Hagood utilizes “dry farming,” a technique that relies on natural rainfall. [Photo: Annie Rice for The Texas Tribune] “We used to have a lot of irrigated land with adequate water to make a crop,” Hagood said. “We don’t have that anymore.” The High Plains is home to cotton and cattle, multi-billion-dollar agricultural industries. The success is in large part due to the Ogallala. Since its discovery, the aquifer has helped farms around the region spring up through irrigation, a way for farmers to water their crops instead of waiting for rain that may not come. But as water in the aquifer declines, there are growing concerns that there won’t be enough water to support agriculture in the future. At the peak of irrigation development, more than 8.5 million acres were irrigated in Texas. About 65% of that was in the High Plains. 
In the decades since the irrigation boom, High Plains farmers have resorted to methods that might save water and keep their livelihoods afloat. They’ve changed their irrigation systems so water is used more efficiently. They grow cover crops so their soil is more likely to soak up rainwater. Some use apps to see where water is needed so it’s not wasted. A furrow irrigation system is seen at Walt Hagood’s cotton farm. [Photo: Annie Rice for The Texas Tribune] Farmers who have not changed their irrigation systems might not have a choice in the near future. It can take a week to pump an inch of water in some areas from the aquifer because of how little water is left. As conditions change underground, they are forced to drill deeper for water. That causes additional problems. Calcium can build up, and the water is of poorer quality. And when the water is used to spray crops through a pivot irrigation system, it’s more of a humidifier as water quickly evaporates in the heat. According to the groundwater district’s most recent management plan, 2 million acres in the district use groundwater for irrigation. About 95% of water from the Ogallala is used for irrigated agriculture. The plan states that the irrigated farms “afford economic stability to the area and support a number of other industries.” The state water plan shows groundwater supply is expected to decline, and drought won’t be the only factor causing a shortage. Demand for municipal use outweighs irrigation use, reflecting the state’s future growth. In Region O, which is the South Plains, water for irrigation declines by 2070 while demand for municipal use rises because of population growth in the region. Coleman, with the High Plains groundwater district, often thinks about how the aquifer will hold up with future growth. There are some factors at play with water planning that are nearly impossible to predict and account for, Coleman said. Declining surface water could make groundwater a source for municipalities that didn’t depend on it before. Regions known for having big, open patches of land, like the High Plains, could be attractive to incoming businesses. People could move to the country and want to drill a well, with no understanding of water availability. The state will continue to grow, Coleman said, and all the incoming businesses and industries will undoubtedly need water. “We could say ‘Well, it’s no one’s fault. We didn’t know that factory would need 20,000 acre-feet of water a year,’” Coleman said. “It’s not happening right now, but what’s around the corner?” Coleman said this puts agriculture in a tenuous position. The region is full of small towns that depend on agriculture and have supporting businesses, like cotton gins, equipment and feed stores, and pesticide and fertilizer sprayers. This puts pressure on the High Plains water district, along with the two regional water planning groups in the region, to keep agriculture alive. “Districts are not trying to reduce pumping down to a sustainable level,” said Mace with the Meadows Center. “And I don’t fault them for that, because doing that is economic devastation in a region with farmers.” Hagood, the cotton farmer, doesn’t think reforming groundwater rights is the way to solve it. What’s done is done, he said. “Our U.S. Constitution protects our private property rights, and that’s what this is all about,” Hagood said.
“Any time we have a regulation and people are given more authority, it doesn’t work out right for everybody.” Rapid population growth, climate change, and aging water infrastructure all threaten the state’s water supply. [Photo: Annie Rice for The Texas Tribune] What can be done The state water plan recommends irrigation conservation as a strategy. It’s also the least costly water management method. But that strategy is fraught. Farmers need to irrigate in times of drought, and telling them to stop can draw criticism. In Eastern New Mexico, the Ogallala Land and Water Conservancy, a nonprofit organization, has been retiring irrigation wells. Landowners keep their water rights, and the organization pays them to stop irrigating their farms. Landowners get paid every year as part of the voluntary agreement, and they can end it at any point. Ladona Clayton, executive director of the organization, said they have been criticized, with their efforts being called a “war” and “land grab.” They also get pushback on why the responsibility falls on farmers. She said it’s because of how much water is used for irrigation. They have to be aggressive in their approach, she said. The aquifer supplies water to the Cannon Air Force Base. “We don’t want them to stop agricultural production,” Clayton said. “But for me to say it will be the same level that irrigation can support would be untrue.” There is another possible lifeline that people in the High Plains are eyeing as a solution: the Dockum Aquifer. It’s a minor aquifer that underlies part of the Ogallala, so it would be accessible to farmers and ranchers in the region. The High Plains Water District also oversees this aquifer. If it seems too good to be true—that the most irrigated part of Texas would just so happen to have another abundant supply of water flowing underneath—it’s because there’s a catch. The Dockum is full of extremely salty brackish water. Some counties can use the water for irrigation and drinking water without treatment, but it’s unusable in others. According to the groundwater district, a test well in Lubbock County pulled up water that was as salty as seawater. Rubinstein, the former water development board chairman, said there are pockets of brackish groundwater in Texas that haven’t been tapped yet. It would be enough to meet the needs on the horizon, but it would also be very expensive to obtain and use. A landowner would have to go deeper to get it, then pump the water over a longer distance. “That costs money, and then you have to treat it on top of that,” Rubinstein said. “But, it is water.” Landowners have expressed interest in using desalination, a treatment method to lower dissolved salt levels. Desalination of produced and brackish water is one of the ideas that was being floated around at the Legislature this year, along with building a pipeline to move water across the state. Hagood, the farmer, is skeptical. He thinks whatever water they move could get used up before it makes it all the way to West Texas. There is always brackish groundwater. Another aquifer brings the chance of history repeating—if the Dockum aquifer is treated so its water is usable, will people drain it, too? Hagood said there would have to be limits. Disclosure: Edwards Aquifer Authority and Texas Tech University have been financial supporters of The Texas Tribune. Financial supporters play no role in the Tribune’s journalism. Find a complete list of them here. 
This article originally appeared in The Texas Tribune, a member-supported, nonpartisan newsroom informing and engaging Texans on state politics and policy. Learn more at texastribune.org.
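To put the article’s figures on a common scale, here is a minimal back-of-the-envelope conversion between the units it uses. It assumes only the standard factor of roughly 325,851 US gallons per acre-foot; the quantities are the ones quoted above, and the snippet is illustrative rather than part of the original reporting.

    # Rough unit conversions for the figures quoted in the article above.
    # Assumes the standard factor of ~325,851 US gallons per acre-foot.
    GALLONS_PER_ACRE_FOOT = 325_851

    def acre_feet_to_gallons(acre_feet: float) -> float:
        """Convert acre-feet to US gallons."""
        return acre_feet * GALLONS_PER_ACRE_FOOT

    def gallons_to_acre_feet(gallons: float) -> float:
        """Convert US gallons to acre-feet."""
        return gallons / GALLONS_PER_ACRE_FOOT

    # The hypothetical factory Coleman describes: 20,000 acre-feet per year.
    print(f"20,000 acre-feet/year is about {acre_feet_to_gallons(20_000) / 1e9:.1f} billion gallons/year")

    # The Ogallala's estimated three trillion gallons, expressed in acre-feet.
    print(f"3 trillion gallons is about {gallons_to_acre_feet(3e12) / 1e6:.1f} million acre-feet")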
  • Xiaomi’s XRING 01 Does Not Mean It Will Immediately Cut Off Sourcing From Qualcomm & MediaTek, As An Estimate Reveals That 40 Percent Of The Company’s Smartphones Use ‘Off The Shelf’ Parts

    The arrival of the XRING 01 is a message from Xiaomi that it is fully prepared and equipped to design and manufacture custom chipsets, securing a new milestone by becoming the first Chinese firm to successfully commercialize a 3nm SoC. While the new silicon’s launch would indicate that the company is ready to wave goodbye to its partners Qualcomm and MediaTek, a new estimate reveals that it may take several years before Xiaomi is completely self-sufficient in making its own chipsets. For now, around 40 percent of the company’s smartphones feature parts from the aforementioned names.
    Relying on Qualcomm and MediaTek for future chipsets might still be a necessity, considering that the U.S. has not yet voiced any concerns about Xiaomi’s XRING 01
    Currently, the XRING 01 powers the Xiaomi 15S Pro and the Xiaomi Pad 7 Ultra, and there is no mention of whether the custom SoC will make its way to other devices. The company has not said how many units it intends to manufacture, but utilizing the second-generation 3nm process from TSMC, also known as ‘N3E,’ is a costly decision, not to mention that the tape-out process likely racked up a bill of millions for Xiaomi.
    In the long run, it is cheaper to make your own chipsets as opposed to sourcing from Qualcomm or MediaTek, but in the initial stages, where a ton of trial and error is involved, there is no question that Xiaomi’s investment of billions was absolutely crucial. However, the company has barely scratched the surface, and until it has successfully adopted its in-house silicon for at least a couple of generations, we doubt the partnership with Qualcomm and MediaTek will evaporate.
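    The "cheaper in the long run" claim is, at bottom, a break-even calculation: a large one-time design and tape-out cost amortized against a per-unit saving versus buying merchant silicon. The sketch below shows that arithmetic only; every number in it is a hypothetical placeholder, since the article gives no actual figures.

        # Hypothetical break-even sketch: in-house SoC vs. merchant chipsets.
        # Every figure below is an illustrative placeholder, not a reported number.

        def break_even_units(upfront_cost: float, merchant_price: float, in_house_unit_cost: float) -> float:
            """Units needed before per-unit savings pay back the one-time investment."""
            saving_per_unit = merchant_price - in_house_unit_cost
            if saving_per_unit <= 0:
                raise ValueError("No per-unit saving; in-house silicon never breaks even.")
            return upfront_cost / saving_per_unit

        units = break_even_units(
            upfront_cost=1_000_000_000,  # placeholder: $1B of R&D and tape-out spend
            merchant_price=150.0,        # placeholder: price of a flagship merchant SoC
            in_house_unit_cost=90.0,     # placeholder: per-chip cost of the in-house part
        )
        print(f"Break-even at roughly {units / 1e6:.0f} million units")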
    In fact, CNBC reports that, according to Counterpoint Research Partner Neil Shah, 40 percent of Xiaomi smartphones continue to feature chipsets from Qualcomm and MediaTek. These two might remain in Xiaomi’s supply chain for a little longer because of potential U.S. export controls looming over the company’s head like the sword of Damocles. The XRING 01 represents not just a victory for Xiaomi but also for China, and leveraging TSMC’s advanced lithography will probably not go unnoticed by the Trump administration.
    There is the possibility that TSMC could be barred from doing business with Xiaomi for fear that the latter’s technology could be distributed to other Chinese firms and give them an edge in the technological race. However, the global landscape is too unpredictable to make such assumptions, so we will have to see what fate awaits Xiaomi in the future.
    News Source: CNBC

    WCCFTECH.COM
  • Lithium Extraction

    Located in the high-altitude desert of northwest Argentina, the Salar de Olaroz is part of the so-called "Lithium Triangle," a region that holds some of the world's largest lithium reserves. This project captures the transformation of this remote salt flat into an industrialized landscape shaped by the global demand for lithium, a key component in batteries for electric vehicles and energy storage.

    What stands out in the imagery are the large-scale evaporation ponds, especially the white and red-toned zones that fringe the salt flat. The white areas are crystallized salts left behind as the brine evaporates. The reddish hues are caused by mineral impurities, such as iron oxides, and variations in salinity, concentration stages, or organic residue.
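    The color gradient across the ponds follows a simple mass balance: as water evaporates, the dissolved lithium stays behind, so its concentration rises roughly as 1/(1 - fraction of water evaporated) until salts begin to crystallize out. A minimal sketch of that relationship, using a purely hypothetical starting concentration:

        # Rough mass balance for brine concentration during evaporation.
        # Ignores salt precipitation and assumes lithium stays fully dissolved,
        # so it only describes the early concentration stages.

        def concentrated(initial_mg_per_l: float, fraction_evaporated: float) -> float:
            """Lithium concentration after a given fraction of the water has evaporated."""
            if not 0 <= fraction_evaporated < 1:
                raise ValueError("fraction_evaporated must be in [0, 1)")
            return initial_mg_per_l / (1 - fraction_evaporated)

        start = 600.0  # hypothetical starting brine concentration, in mg of lithium per litre
        for f in (0.0, 0.5, 0.9, 0.95):
            print(f"{f:.0%} of water evaporated -> about {concentrated(start, f):,.0f} mg/L")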

    These abstract, geometric patterns sharply contrast with the natural surroundings, forming a surreal mosaic that reveals both beauty and environmental tension. The red-white areas dominate the visual composition, highlighting the scale and
    WWW.BEHANCE.NET
  • Through Fairy Lights and Butterflies, Chiharu Shiota Tethers Presence and Absence

    “Metamorphosis of Consciousness”, mixed media, dimensions variable. All images courtesy of Red Brick Art Museum
    Through Fairy Lights and Butterflies, Chiharu Shiota Tethers Presence and Absence
    May 22, 2025
    Art
    Grace Ebert

    In one of the foundational texts of Taoism, Chinese philosopher Zhuang Zhou recalls a dream in which he was a butterfly, soaring through the sky with no recollection of his human form. Upon waking, though, he finds himself firmly in a bipedal body, prompting an important question: is he a butterfly dreaming he’s Zhuang Zhou or a man dreaming he’s a butterfly?
    This ancient story of transformation and the thin line between states of mind informs a dazzling new installation by Chiharu Shiota. “Metamorphosis of Consciousness” suspends glimmering lights and faint butterfly wings above an iron-framed twin bed topped with a white blanket and pillow. Rejecting the strict separation between body and mind, Shiota references her belief in the spirit’s ability to endure long after one’s final breath. “While each time we slip into sleep, it is a rehearsal for death—a journey beyond the body,” she says.
    “Metamorphosis of Consciousness”, mixed media, dimensions variable
    Exemplary of the artist’s interest in memory and knowledge, “Metamorphosis of Consciousness” is just one of the immersive works in the monumental exhibition Silent Emptiness at Red Brick Art Museum in Beijing.
    On view through August 31, the show revolves around Shiota’s ongoing explorations into the “presence in absence,” this time extending such inquiries into ideas of emptiness as it relates to Eastern philosophy and enlightenment.
    Included in the exhibition is an antique Tibetan Buddhist doorway that anchors “Gateway to Silence,” an explosive installation that entwines the elaborately carved wood structure in a dense, criss-crossing labyrinth of string. Red thread, one of the artist’s favored materials, symbolizes relationships. And in this case, it’s an invitation to introspection and finding an awareness of the present moment.
    Metaphorically interlacing art, memory, and faith, Shiota very literally visualizes the inextricable web in which we’re all bound, regardless of geography or era. Pieces like “Echoes of Time” and “Rooted Memories” incorporate materials like soil and large stones, presenting the passage of time as cyclical and the past as always shaping the present.
    Detail of “Gateway to Silence”, antique porch and red wool, dimensions variable
    Born in Osaka, the artist has lived in Berlin for much of her life, and Silent Emptiness also tethers her roots to more global experiences. Shiota likened her understanding of herself to the way salt molecules appear as crystals only after water evaporates. “I was not visible as an individual in Japan,” she says. “Whereas I did not know who I was, what I wanted to do, and what was necessary in the water, I feel that I became an individual and crystal, and understood those things for the first time by coming to Germany.”
    Another example of finding presence in absence, Shiota’s migration and experience of discovery provides an important touchstone for her thinking and practice. She adds, “Absence does not signify disappearance but rather an integration into a vaster universe, re-entering the flow of time and forming new connections with all things.”
    “Gateway to Silence”, mixed media, dimensions variable
    “Rooted Memories”, red rope, boat, and earth, dimensions variable
    “Rooted Memories”, red rope, boat, and earth, dimensions variable
    Detail of “Rooted Memories”, red rope, boat, and earth, dimensions variable
    “Multiple Realities”, mixed media, dimensions variable
    “Echoes of Time”, black yarn and rock, dimensions variable
    “Echoes of Time”, black yarn and rock, dimensions variable
    WWW.THISISCOLOSSAL.COM
  • Mario Kart World was first developed for the original Switch, but 60fps only possible on Switch 2

    Mario Kart World was first developed for the original Switch, but 60fps only possible on Switch 2
    Plus more on the creation of Cow.

    Image credit: Nintendo

    News

    by Ed Nightingale
    Deputy News Editor

    Published on May 21, 2025

    Mario Kart World was originally in development for the original Switch, but Switch 2 has allowed the developers to realise their vision of an inter-connected world.
    Nintendo began prototyping the new Mario Kart game back in 2017, while Mario Kart 8 Deluxe was still in development; full development then began at the end of that year.
    "I felt that in Mario Kart 8 Deluxe, we were able to perfect the formula that we'd been following in the series up to that point, where players race on individual courses," explained Mario Kart World producer Kosuke Yabuki in a new interview from Nintendo. "That's why, this time, we wanted the gameplay to involve players driving around a large world, and we began creating a world map like this."

    Mario Kart World – Nintendo Direct | Nintendo Switch 2
    Watch on YouTube
    Programming director Kenta Sato continued: "When we were developing for Nintendo Switch, we often worried whether we could find the right balance between planning and performance. Of course, the Switch system's performance is sufficient for developing different kinds of games, but if we had included everything we wanted to in this game's vast world, then it wouldn't have run at 60 fps and would have suffered from constant framerate drops.
    "I think there were a lot of people on the team who were worried about whether we could really manage it. But once we decided to release this game on Switch 2, we expected our worries to evaporate all at once. I remember being overjoyed when I discovered we could express even more than we'd originally set out to."
    Yabuki noted that had the idea for the game just been more courses it would've been called Mario Kart 9, but the new approach led to the name Mario Kart World. In fact, that name was added to concept art in the early stages of development.
    "In previous Mario Kart games, after finishing a course, you'd move on to the next course," said Yabuki. "However, I thought that with modern technology, being able to seamlessly transition between courses and realise a single, vast world wasn't beyond the realm of possibility. So, with this in mind, we set out to create a new kind of Mario Kart...And that's when all our troubles began."

    Image credit: Nintendo

    Sato had heard of difficulties in creating open world games and felt pressure to achieve this with Mario Kart, especially as Nintendo considers 60fps to be important for the series, as well as split-screen multiplayer. It seems the power of the Switch 2 allowed for this to happen.
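    The 60fps target Sato mentions is, in concrete terms, a per-frame time budget: at 60fps every frame of simulation and rendering has to fit inside roughly 16.7 milliseconds, and split-screen has to fit every viewport into that same window. A small illustration of the arithmetic, with no claims about how the game's actual render loop is structured:

        # Per-frame time budget at common frame-rate targets (illustrative only).

        def frame_budget_ms(fps: int) -> float:
            """Milliseconds available per frame at a given frame rate."""
            return 1000.0 / fps

        for fps in (30, 60, 120):
            print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")

        # A rough view of two-player split-screen: both viewports share one frame budget.
        print(f"Split-screen at 60 fps -> about {frame_budget_ms(60) / 2:.2f} ms per viewport")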
    The move to an open world also led to the player count increasing from 12 to 24, a change that was decided fairly early in development.
    "By creating long routes in a vast world, you could end up with players spread out in various places, which could diminish the sense that they're racing against each other," said Yabuki. "So, we figured that by increasing the number of racers, you'd be sure to see some competitive action here and there."
    Added art director Masaaki Ishikawa: "I felt like the 12 players we had previously was a lot, but as Yabuki-san said, once players spread out, the course starts to look sparse, and the visuals give off a sort of lonely feel. So, I thought that 24 players would be better because there'd be more interaction between various players. That said, it was quite challenging once we got going and the volume of design work increased. But it was worth it."
    Speaking of art, the team wanted to recapture the "lively, bustling atmosphere" of Super Mario Kart on the SNES, with a key phrase being "playfulness". This led to the more rounded, cartoonish designs of Mario Kart World.
    "The characters in the Super Mario series have a rounded look, so we wanted to give the vehicles a rounded design too, to match their appearance," said Ishikawa. "We also wanted to give the characters a livelier look even while they're in their vehicles, so we put everything together in a way that gives off a sense of playfulness through the combination of the characters' rounded designs, soft facial expressions, and rich movements."
    The open world design means courses are now inter-connected across the world, but Yabuki said adding up all the possible variations would "easily exceed 100".

    Image credit: Nintendo

    Lastly, the developers discussed the creation of Mario Kart World's meme-worthy new character, Cow.
    "Each new Mario Kart game features new characters to race with, but since we added so many to the previous game, we wondered where we could go from there," said Ishikawa. "And then one of the designers came up with that silly sketch of Cow cruising along, and I thought to myself, 'This is it!' So that's when we realised the course surroundings actually contained a lot of untapped resources.
    "The character designer quickly put together a prototype of Cow that could race, and surprisingly it didn't feel out of place at all. So we thought maybe we could include other obstacle characters, and decided to add Cheep Cheep and Pokey as racers. As a result, the idea of taking obstacle characters, usually found in courses in past games, and having them participate in races made sense to me in terms of an interconnected world."
    Mario Kart World will launch alongside Nintendo's Switch 2 on 5th June as a key game in the launch lineup.
    Earlier this month Nintendo responded to the suggestion it had used AI-generated images in the game.
    WWW.EUROGAMER.NET
    Mario Kart World was first developed for the original Switch, but 60fps only possible on Switch 2
Plus more on the creation of Cow.

News by Ed Nightingale, Deputy News Editor. Published on May 21, 2025.

Mario Kart World was originally in development for the original Switch, but Switch 2 has allowed the developers to realise their vision of an inter-connected world.

Nintendo began prototyping the new Mario Kart game back in 2017, while Mario Kart 8 Deluxe was still in development. Full development then began at the end of that year.

"I felt that in Mario Kart 8 Deluxe, we were able to perfect the formula that we'd been following in the series up to that point, where players race on individual courses," explained Mario Kart World producer Kosuke Yabuki in a new interview from Nintendo. "That's why, this time, we wanted the gameplay to involve players driving around a large world, and we began creating a world map like this."

Programming director Kenta Sato continued: "When we were developing for Nintendo Switch, we often worried whether we could find the right balance between planning and performance. Of course, the Switch system's performance is sufficient for developing different kinds of games, but if we had included everything we wanted to in this game's vast world, then it wouldn't have run at 60 fps and would have suffered from constant framerate drops.

"I think there were a lot of people on the team who were worried about whether we could really manage it. But once we decided to release this game on Switch 2, we expected our worries to evaporate all at once. I remember being overjoyed when I discovered we could express even more than we'd originally set out to."

Yabuki noted that had the idea for the game just been more courses, it would've been called Mario Kart 9, but the new approach led to the name Mario Kart World. In fact, that name was added to concept art in the early stages of development.

"In previous Mario Kart games, after finishing a course, you'd move on to the next course," said Yabuki. "However, I thought that with modern technology, being able to seamlessly transition between courses and realise a single, vast world wasn't beyond the realm of possibility. So, with this in mind, we set out to create a new kind of Mario Kart... And that's when all our troubles began."

Sato had heard of the difficulties of creating open world games and felt pressure to achieve this with Mario Kart, especially as Nintendo considers 60fps, as well as split-screen multiplayer, to be important for the series. It seems the power of the Switch 2 allowed for this to happen.

The move to an open world also led to the increase to 24 players from 12, which was decided fairly early in development.

"By creating long routes in a vast world, you could end up with players spread out in various places, which could diminish the sense that they're racing against each other," said Yabuki. "So, we figured that by increasing the number of racers, you'd be sure to see some competitive action here and there."

Added art director Masaaki Ishikawa: "I felt like the 12 players we had previously was a lot, but as Yabuki-san said, once players spread out, the course starts to look sparse, and the visuals give off a sort of lonely feel. So, I thought that 24 players would be better because there'd be more interaction between various players. That said, it was quite challenging once we got going and the volume of design work increased. But it was worth it."

Speaking of art, the team wanted to recapture the "lively, bustling atmosphere" of Super Mario Kart on the SNES, with a key phrase being "playfulness". This led to the more rounded, cartoonish designs of Mario Kart World.

"The characters in the Super Mario series have a rounded look, so we wanted to give the vehicles a rounded design too, to match their appearance," said Ishikawa. "We also wanted to give the characters a livelier look even while they're in their vehicles, so we put everything together in a way that gives off a sense of playfulness through the combination of the characters' rounded designs, soft facial expressions, and rich movements."

The open world design means courses are now inter-connected across the world, and Yabuki said adding up all the possible route variations would "easily exceed 100".

Lastly, the developers discussed the creation of Mario Kart World's meme-worthy new character, Cow.

"Each new Mario Kart game features new characters to race with, but since we added so many to the previous game, we wondered where we could go from there," said Ishikawa. "And then one of the designers came up with that silly sketch of Cow cruising along, and I thought to myself, 'This is it!' So that's when we realised the course surroundings actually contained a lot of untapped resources.

"The character designer quickly put together a prototype of Cow that could race, and surprisingly it didn't feel out of place at all. So we thought maybe we could include other obstacle characters, and decided to add Cheep Cheep and Pokey as racers. As a result, the idea of taking obstacle characters, usually found in courses in past games, and having them participate in races made sense to me in terms of an interconnected world."

Mario Kart World will launch alongside Nintendo's Switch 2 on 5th June as a key game in the launch lineup. Earlier this month Nintendo responded to the suggestion it had used AI-generated images in the game.
  • Mario Kart World was originally designed for Switch 1 reveals Nintendo


    GameCentral

    Published May 21, 2025 5:01pm

    Updated May 21, 2025 5:01pm

Mario Kart World only works on Switch 2 (Nintendo)

The new Mario Kart for the Switch 2 was not originally designed for the console, as Nintendo reveals how and when the game came to be.
    The launch of a new console is always an exciting time but the first few years of a new format’s existence can be frustrating, when it’s obvious that most of the games were actually designed as titles for the previous generation of hardware.
    This has been exacerbated on the PlayStation 5, where Sony continued to release cross-gen games for much longer than usual. It was equally obvious with the Switch 1, given launch title Zelda: Breath Of The Wild was released simultaneously on Wii U and many of the other first party titles were also Wii U ports.
    The only similar situation so far for the Switch 2 is Metroid Prime 4, which is primarily a Switch 1 title, but Nintendo has revealed that Mario Kart World was originally designed for the earlier console, until they realised it was never going to work on the older hardware.
According to Nintendo, the team first started thinking about the new game way back in March 2017, when the Switch 1 was originally released. That’s not too surprising, as Mario Kart 8 Deluxe is a Wii U port and technically there’s never been a new Mario Kart made for the Switch 1.
    ‘I felt that in Mario Kart 8 Deluxe, we were able to perfect the formula that we’d been following in the series up to that point, where players race on individual courses,’ says Mario Kart World producer Kosuke Yabuki in a new interview.
    ‘That’s why, this time, we wanted the gameplay to involve players driving around a large world, and we began creating a world map like this.’
The idea that Mario Kart 8 takes the original formula as far as it can go has been echoed by many, with Mario Kart World appearing to be the series’ Breath Of The Wild moment, radically changing the franchise’s approach.
No previous Mario Kart game has had an open world environment but, according to programming director Kenta Sato, the original Switch couldn’t handle the ideas they wanted to include.
    ‘We discussed things like toning down the visuals, lowering the resolution, and we even considered dropping the frame rate to 30 fps in some cases. It was a tough situation,’ he admitted.
    ‘Once we decided to release this game on Switch 2, we expected our worries to evaporate all at once. I remember being overjoyed when I discovered we could express even more than we’d originally set out to.’
    According to Sato, the decision to move the game to the Switch 2 took place ‘around 2020’, which enabled the team to expand their plans and ensure the game would still run at 60fps.


    ‘Back then, we already had an idea of the next console’s expected specs, but it wasn’t until a bit later that we actually received working development units. Until then, we just had to proceed with development based on provisional estimates,’ says Sato.
    The whole four part interview is well worth a read on Nintendo’s website, with a lot of insight on both the game and how Nintendo prepares for the launch of a new console.
    Mario Kart World will launch alongside the console itself on June 5. Controversially, the game is £75 for the physical edition and £67 for a digital download. However, the game is effectively half price if bought along with the console, as part of the official hardware bundle.

All this and now you can play as a cow too (Nintendo)
  • Jupiter Was Formerly Twice Its Current Size and Had a Much Stronger Magnetic Field

A new study reveals that about 3.8 million years after the solar system's first solids formed, Jupiter was twice its current size with a magnetic field 50 times stronger, profoundly influencing the structure of the early solar system. Phys.Org reports: [Konstantin Batygin, professor of planetary science at Caltech] and [Fred C. Adams, professor of physics and astronomy at the University of Michigan] approached this question by studying Jupiter's tiny moons Amalthea and Thebe, which orbit even closer to Jupiter than Io, the smallest and nearest of the planet's four large Galilean moons. Because Amalthea and Thebe have slightly tilted orbits, Batygin and Adams analyzed these small orbital discrepancies to calculate Jupiter's original size: approximately twice its current radius, with a predicted volume that is the equivalent of over 2,000 Earths. The researchers also determined that Jupiter's magnetic field at that time was approximately 50 times stronger than it is today.

    Adams highlights the remarkable imprint the past has left on today's solar system: "It's astonishing that even after 4.5 billion years, enough clues remain to let us reconstruct Jupiter's physical state at the dawn of its existence." Importantly, these insights were achieved through independent constraints that bypass traditional uncertainties in planetary formation models -- which often rely on assumptions about gas opacity, accretion rate, or the mass of the heavy element core. Instead, the team focused on the orbital dynamics of Jupiter's moons and the conservation of the planet's angular momentum -- quantities that are directly measurable.
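As a rough illustration of the angular-momentum argument (a generic textbook relation, not the calculation Batygin and Adams actually performed, which relies on the moons' orbital inclinations), a contracting planet that conserves its spin angular momentum must rotate faster as it shrinks:

\[
L = k M R^{2}\,\Omega = \text{const} \quad\Rightarrow\quad \frac{\Omega_{\text{now}}}{\Omega_{\text{early}}} = \left(\frac{R_{\text{early}}}{R_{\text{now}}}\right)^{2} \approx 2^{2} = 4,
\]

so, under the simplifying assumptions that the mass M and the dimensionless moment-of-inertia factor k stay fixed and no angular momentum is lost, an early Jupiter with twice its present radius would have been spinning roughly four times more slowly than it does today.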

    Their analysis establishes a clear snapshot of Jupiter at the moment the surrounding solar nebula evaporated, a pivotal transition point when the building materials for planet formation disappeared and the primordial architecture of the solar system was locked in. The results add crucial details to existing planet formation theories, which suggest that Jupiter and other giant planets around other stars formed via core accretion, a process by which a rocky and icy core rapidly gathers gas.
    The findings have been published in the journal Nature Astronomy.

Read more of this story at Slashdot.
  • The data center boom in the desert

In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well.

The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.

Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet. (Emily Najera)

The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.

But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.”
    Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.  That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe.
It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.

Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center. (Emily Najera)

The build-out of a dense cluster of energy and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.

“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah.

“We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”

Luring data centers

In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.

He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner.

In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt.

On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.

Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants. (Gregg Segal)

After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.
    Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend million to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park. When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part.
“We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.”

Then there are the generous tax policies. In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities.

Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development.

The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.

The rain shadow

The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.

But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
    The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.  Along the way, an engineered system of reservoirs, canals, and treatment plants divert, store, and release water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained. 
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada. (Emily Najera)

Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.

About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.

It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.

“In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education.

That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.

These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.

Thirsty data centers

Data centers suck up water in two main ways.
As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat.

To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside.

What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study.

Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid.

The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration. Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research. But here too, the water usage varies depending on the type of geothermal plant in question.

Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do.

Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It’s also forging ahead with a more than billion transmission project.
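To make the scale of estimates like Ren’s concrete, here is a rough back-of-the-envelope sketch in Python of how such a figure can be assembled from requested capacity, an assumed utilization rate, and assumed water intensities. The specific parameter values below (80% utilization, 50 gallons per MWh for on-site cooling, 300 gallons per MWh for the regional grid) are illustrative assumptions for this sketch, not numbers from Ren’s analysis or NV Energy’s filings.

```python
# Rough, illustrative estimate of annual data-center water use in Nevada.
# All parameters are assumptions for illustration; they are not taken from
# Ren's study or from NV Energy's filings.

REQUESTED_CAPACITY_MW = 6_000       # ~6 GW requested by data-center projects
UTILIZATION = 0.80                  # assumed average load factor
HOURS_PER_YEAR = 8_760

# Assumed water intensities, in gallons per MWh of electricity used:
ONSITE_COOLING_GAL_PER_MWH = 50     # assumed direct use (evaporative cooling)
GRID_CONSUMPTION_GAL_PER_MWH = 300  # assumed average for the regional power mix

annual_mwh = REQUESTED_CAPACITY_MW * UTILIZATION * HOURS_PER_YEAR

direct_gallons = annual_mwh * ONSITE_COOLING_GAL_PER_MWH      # at the data center
indirect_gallons = annual_mwh * GRID_CONSUMPTION_GAL_PER_MWH  # at the power plants

print(f"Annual electricity use:  {annual_mwh / 1e6:.1f} million MWh")
print(f"Direct (on-site) water:  {direct_gallons / 1e9:.1f} billion gallons/year")
print(f"Indirect (power plants): {indirect_gallons / 1e9:.1f} billion gallons/year")
```

With these assumed inputs the sketch lands at roughly 42 million MWh, about 2 billion gallons of direct use and about 13 billion gallons of indirect use per year, the same order of magnitude as the figures quoted above. The point is the structure of the estimate: once the requested gigawatts are fixed, the projected water bill scales linearly with whatever utilization and water-intensity factors are plugged in.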
But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.

NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center. (Emily Najera)

“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement.

An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds.

Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”

Securing supplies

On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction.

We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip. Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021.

“Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants. (Emily Najera)

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.

Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert.
Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.

But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company.

As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.

Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, those roughly 12,300 acre-feet (at about 326,000 gallons per acre-foot) add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise.

Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis. When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”

Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business.

More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.

The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services.

“It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.”
Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake. (Emily Najera)

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.

More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.

Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake.

“I think that the pipeline from … to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”

During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”

Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center. (Google)

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.
The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.

Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region.

“While the land purchase is public knowledge, we have not disclosed specific details … our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center. (Emily Najera)

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports.

Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling.

Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.

But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.

Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds.

That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.
Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. (Google)

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.

“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.

Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. (Emily Najera)

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote.

“Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.

The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium-carbonate.
That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely.

“We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.

Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”
    #data #center #boom #desert
    The data center boom in the desert
    WWW.TECHNOLOGYREVIEW.COM
    The data center boom in the desert
    In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.  This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.  Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.EMILY NAJERA The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.  But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.” Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.  That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe. It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.  Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.EMILY NAJERA The build-out of a dense cluster of energy and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. 
Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils. “Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah.  “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.” Luring data centers In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.  He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner. In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt. On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center. Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.GREGG SEGAL After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.  Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend $400 million to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park (although several lots are available for resale following the failed gamble of one crypto tenant). 
When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part. “We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there’s the generous tax policies.In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities. Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains. The rain shadow The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.  But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average. The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.  Along the way, an engineered system of reservoirs, canals, and treatment plants divert, store, and release water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.  
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.EMILY NAJERA Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.  About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.  It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts.  In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.  “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.  These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.Thirsty data centers Data centers suck up water in two main ways. As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. (The research has since been peer-reviewed and is awaiting publication.) What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. 
You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. (“Consumed” here means the water is evaporated, not merely withdrawn and returned to the engineered water system.) The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration.  Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research.  But here too, the water usage varies depending on the type of geothermal plant in question. Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface.  The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do. Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It's also forging ahead with a more than $4 billion transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume. NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.EMILY NAJERA “NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement. An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources.  “You end up with the water-intensive resources looking more important,” she adds. 
Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.” Securing supplies On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center.  “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction.  We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.  Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.” The reservoir within the industrial business park provides water to data centers and other tenants.EMILY NAJERA But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.  Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.  But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company. As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.  Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise. 
Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.  When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.” Water As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish. The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services. “It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.” Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake.EMILY NAJERA In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.  More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flown into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.  Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline from [the Truckee Meadows Water Authority] to our system is good for water quality in the river,” she says. 
“I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.” Water efficiency  In an email, Thompson added that he has “great respect and admiration,” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas.  “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.” Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.” An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center.GOOGLE Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy. The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.  Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region.  “While the land purchase is public knowledge, we have not disclosed specific details [of] our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.  Workers have begun grading land inside a fenced off lot within the Tahoe Reno Industrial Center.EMILY NAJERA Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. 
Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, it announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared with its data centers that rely on evaporative water cooling.

Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.

But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.

Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gases produced through their power use simply because the emissions occur outside their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds.

That means some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or to neighboring states across the drying American West, depending on where and how the power is generated, Ren says. Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield,” the maximum amount that can be drawn out every year without depleting the reservoir over the long term.

“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center.
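Ren’s argument about indirect water use comes down to simple accounting. The sketch below is a minimal, back-of-the-envelope illustration in Python: every parameter (the facility’s annual electricity use, the on-site evaporative consumption rate, the air-cooling energy premium, and the water intensity of the supplying grid) is a hypothetical assumption rather than a figure reported for any company in this story; only the 325,851-gallons-per-acre-foot conversion is a fixed constant. It compares the water an air-cooled facility avoids evaporating on site with the additional water consumed at power plants to generate its extra electricity, and depending on the assumed values, either term can dominate.

```python
# Back-of-the-envelope sketch of the trade-off Ren describes: air cooling
# saves water on site but uses more electricity, and generating that extra
# electricity consumes water at the power plant. All inputs below are
# illustrative assumptions, not reported figures for any specific facility.

GALLONS_PER_ACRE_FOOT = 325_851  # standard US conversion

# Sanity check on the Tract figure cited above: 1,100 acre-feet is roughly
# 358 million gallons, i.e. "nearly 360 million gallons a year."
print(f"{1_100 * GALLONS_PER_ACRE_FOOT / 1e6:.0f} million gallons")


def net_water_effect_gal(
    annual_kwh: float,        # electricity the facility would use with evaporative cooling
    evap_gal_per_kwh: float,  # assumed on-site evaporative consumption per kWh
    energy_premium: float,    # assumed fractional increase in electricity for air cooling
    grid_gal_per_kwh: float,  # assumed water consumed per extra kWh at supplying power plants
) -> float:
    """Positive: switching to air cooling saves water overall.
    Negative: the extra water consumed upstream at power plants exceeds
    the on-site savings, the outcome Ren warns about."""
    onsite_savings = annual_kwh * evap_gal_per_kwh
    offsite_increase = annual_kwh * energy_premium * grid_gal_per_kwh
    return onsite_savings - offsite_increase


# Two hypothetical scenarios for the same 100 GWh/year facility:
print(net_water_effect_gal(100e6, evap_gal_per_kwh=0.5, energy_premium=0.10, grid_gal_per_kwh=0.5))
print(net_water_effect_gal(100e6, evap_gal_per_kwh=0.1, energy_premium=0.30, grid_gal_per_kwh=1.0))
# The sign depends entirely on the assumed cooling design, the energy
# premium, and the water intensity of the local grid mix.
```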
Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.

Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project on a 5-2 vote.

“Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.

The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake: large formations of porous rock made of calcium carbonate. They include the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely.

“We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.

Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect its rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”