• Venice Biennale 2025 round-up: what else to see?

    This edition of the Venice Biennale includes 65 national pavilions, 11 collateral events, and over 750 participants in the international exhibition curated by Italian architect and engineer Carlo Ratti.
    Entitled Intelligens: Natural Artificial Collective, its stated aim is to make Venice a ‘living laboratory’. But Ratti’s exhibition in the Arsenale has met with mixed reviews. The AJ’s Rob Wilson described it as ‘a bit of a confusing mess’, while other media outlets have called the robot-heavy exhibit of future-facing, building-focused solutions to the climate crisis a ‘tech-bro fever dream’ and a ‘mind-boggling rollercoaster’, to mention a few.
    It is a distinct shift away from the biennale of two years ago, when Ghanaian-Scottish architect Lesley Lokko curated the main exhibitions, including 89 participants – of which more than half were from Africa or the African diaspora – in a convincing reset of the architectural conversation.

    This year’s National Pavilions and collateral exhibits, by contrast, have tackled the biggest themes in architecture and the world right now in a less constrained way than the main exhibition. The exhibits are radical and work as a useful gauge for understanding what’s important in each country: decarbonisation, climate resilience, the reconstruction of Gaza and, an issue more prevalent in politics closer to home, gender wars.
    What not to miss in the Giardini
    British Pavilion
    The British Pavilion this year, which won a special mention from the Venetian jury, houses a show by a British-Kenyan collaboration titled GBR – Geology of Britannic Repair. In it, the curators explore the links between colonialism, the built environment and geological extraction.
    Focusing on the Rift Valley, which runs from east Africa to the Middle East, including Palestine, the exhibition was curated by the Nairobi-based studio cave_bureau, UK-based curator, writer and Farrell Centre director Owen Hopkins and Queen Mary University professor Kathryn Yusoff.
    The pavilion’s façade is cloaked by a beaded veil of agricultural waste briquettes and clay and glass beads, produced in Kenya and India, echoing both Maasai practices and beads once made on Venice’s Murano, as currency for the exchange of metals, minerals and slaves.
    The pavilion’s six gallery spaces include multisensory installations such as the Earth Compass, a series of celestial maps connecting London and Nairobi; the Rift Room, tracing one of humans’ earliest migration routes; and the Shimoni Slave Cave, featuring a large-scale bronze cast of a valley cave historically used as a holding pen for enslaved people.

    The show also includes Objects of Repair, a project by design-led research group Palestine Regeneration Team, looking at how salvaged materials could help rebuild war-torn Gaza. It is the only exhibit anywhere in the biennale that tackles the reconstruction of Gaza head-on – and it does so impressively, both politically and sensitively.
    Danish Pavilion
    A firm favourite this year, the Danish exhibition Build of Site, curated by Søren Pihlmann of Pihlmann Architects, transforms the pavilion, which requires renovation anyway, into both a renovation site and an archive of materials.
    Clever, simple and very methodical, the exhibition renews the building while at the same time showcasing innovative methods of reusing the surplus materials uncovered during the construction process – an alternative to using new resources to build a temporary exhibition.
    The renovation of the 1950s Peter Koch-designed section of the pavilion began in December 2024 and will be completed following the biennale, having been suspended for its duration. On display are archetypal elements including podiums, ramps, benches and tables – all constructed from the surplus materials unearthed during the renovation, such as wood, limestone, concrete, stone, sand, silt and clay.
    Belgian Pavilion
    If you need a relaxing break from the intensity of the biennale, then the oldest national pavilion in the Giardini is the one for you. Belgium’s Building Biospheres: A New Alliance between Nature and Architecture brings ‘plant intelligence’ to the fore.
    Commissioned by the Flanders Architecture Institute and curated by landscape architect Bas Smets and neurobiologist Stefano Mancuso, the exhibit investigates how the natural ‘intelligence’ of plants can be used to produce an indoor climate – elevating the role of landscape design and calling for it to no longer serve as a backdrop for architecture.
    Inside, more than 200 plants occupy the central area beneath the skylight, becoming the pavilion’s centrepiece, with the rear space visualising ‘real-time’ data on the prototype’s climate control performance.
    Spanish Pavilion
    One for the pure architecture lovers out there: models, installations, photographs and timber structures fill the Spanish Pavilion in abundance. Neatly curated by architects Roi Salgueiro Barrio and Manuel Bouzas Barcala, Internalities shows a series of existing and research projects that have contributed to decarbonising construction in Spain.
    The outcome? An extensive collection of work exploring the use of very local and very specific regenerative and low-carbon construction and materials – including stone, wood and soil. The joy of this pavilion comes from the 16 beautiful timber frames constructed from wood from communal forests in Galicia.
    Polish Pavilion
    Poland’s pavilion was like Marmite this year. Some loved its playful approach while others found it silly. Lares and Penates, taking its name from ancient Roman deities of protection, has been curated by Aleksandra Kędziorek and looks at what it means and takes to have a sense of security in architecture.
    Speaking to many different anxieties, it refers to the unspoken assumption of treating architecture as a safe haven against the elements, catastrophes and wars – showcasing and elevating the mundane solutions and signage derived from building, fire and health regulations. The highlight? An ornate niche decorated with tiles and stones just for … a fire extinguisher.
    Dutch Pavilion
    Punchy and straight to the point, SIDELINED: A Space to Rethink Togetherness takes sports as a lens for looking at how spatial design can both reveal and disrupt the often-exclusionary dynamics of everyday environments. Within the pavilion, the exhibit looks beyond the large-scale arena of the stadium and gymnasium to investigate the more localised and intimate context of the sports bar, as well as three alternative sports – a site of both social production and identity formation – as a metaphor for uniting diverse communities.
    The pavilion-turned-sports bar, designed by Koos Breen and Jeannette Slütter and inspired by Asger Jorn’s three-sided sports field, is a space for fluidity and experimentation where binary oppositions, social hierarchies and cultural values are contested and reshaped – complete with jerseys and football scarves worn by players in the alternative Anonymous Allyship lining the walls. Read Derin Fadina’s review for the AJ here.
    Nordic Countries Pavilion
    Probably the most impactful national pavilion this year, the Nordic Countries have presented an installation with performance work. Curated by Kaisa Karvinen, Industry Muscle: Five Scores for Architecture continues Finnish artist Teo Ala-Ruona’s work on trans embodiment and ecology by considering the trans body as a lens through which to examine modern architecture and the built environment.
    The three-day exhibition opening featured a two-hour performance each day with Ala-Ruona and his troupe crawling, climbing and writhing around the space, creating a bodily dialogue with the installations and pavilion building itself, which was designed by celebrated Modernist architect Sverre Fehn.
    The American pavilion next door loudly turns its back on what’s going on in its own country by simply celebrating the apolitical porch, making the Nordic Countries seem even more relevant at this crucial time. Read Derin Fadina’s review for the AJ here.
    German Pavilion
    An exhibit that certainly grabs the issue of climate change by the scruff of the neck is the German contribution, Stresstest. Curated by Nicola Borgmann, Elisabeth Endres, Gabriele G Kiefer and Daniele Santucci, the pavilion has turned climate change into a literal physical and psychological experience for visitors by creating contrasting ‘stress’ and ‘de-stress’ rooms.
    In the dark stress room, a large metal sculpture creates a cramped and hot space using heating mats hung from the ceiling and powered by PVs. Opposite is a calmer space demonstrating strategies that could be used to reduce the heat of cities, and between the two spaces is a film focusing on the impacts of cities becoming hotter. If this doesn’t highlight the urgency of the situation, I’m not sure what will.
    Best bits of the Arsenale outside the main exhibitions
    Bahrain Pavilion
    Overall winner of this year’s Golden Lion for best national participation, Bahrain’s pavilion in the historic Artiglierie of the Arsenale is a proposal for living and working through heat conditions. Heatwave, curated by architect Andrea Faraguna, reimagines public space design by exploring passive cooling strategies rooted in the Arab country’s climate, as well as cultural context.
    A geothermal well and solar chimney are connected through a thermo-hygrometric axis that links underground conditions with the air outside. The inhabitable space that hosts visitors is thus compressed and defined by its earth-covered floor and suspended ceiling, and is surrounded by memorable sandbags, highlighting its scalability for particularly hot construction sites in the Gulf where a huge amount of construction is taking place.
    In the Arsenale’s exhibition space, where excavation wasn’t feasible, this system has been adapted into mechanical ventilation, bringing in air from the canal side and channelling it through ductwork to create a microclimate.
    Slovenian Pavilion
    The top pavilion tip this year from the AJ’s Rob Wilson, Slovenia’s exhibit provides an enjoyable take on the theme of the main exhibition, highlighting how the tacit knowledge, on-site techniques and skills of construction workers and craftspeople remain the key constituent of architectural production, despite all the heat and light around robotics, prefabrication, artificial intelligence and 3D printing.
    Master Builders, curated by Ana Kosi and Ognen Arsov and organised by the Museum of Architecture and Design in Ljubljana, presents a series of ‘totems’ – accumulative, sculpture-like structures formed of conglomerations of differently worked materials, finishes and building elements. These are stacked up into crazy tower forms, which showcase various on-site construction skills and techniques, their construction documented in accompanying films.
    Uzbekistan Pavilion
    Uzbekistan’s contribution explores the Soviet-era solar furnace and its Modernist legacy. Architecture studio GRACE, led by curators Ekaterina Golovatyuk and Giacomo Cantoni, has curated A Matter of Radiance. The focus is the Sun Institute of Material Science – originally known as the Sun Heliocomplex – an incredible large-scale scientific structure built in 1987 on a natural, seismic-free foundation near Tashkent, and one of only two that study material behaviour under extreme temperatures. The exhibition examines the site’s historical and contemporary significance while reflecting on its scientific legacy and influence beyond national borders.
    V&A Applied Arts Pavilion
    Diller Scofidio + Renfro is having a moment. The US-based practice, in collaboration with V&A chief curator Brendan Cormier, has curated On Storage, which aptly explores global storage architectures in a pavilion that strongly links to the V&A’s recent opening of Storehouse, its new collections archive in east London.
    Featured is a six-channel film entitled Boxed: The Mild Boredom of Order, directed by the practice itself and following a toothbrush, as a metaphor for an everyday consumer product, on its journey through different forms of storage across the globe – from warehouse to distribution centre to baggage handlers, down to the compact space of a suitcase.
    Also on display are large-format photographs of V&A East Storehouse, DS+R’s original architectural model and sketchbook and behind-the-scenes photography of Storehouse at work, taken by emerging east London-based photographers.
    Canal Café
    The Golden Lion for the best participation in the main exhibition went to Canal Café, an intervention designed by V&A East Storehouse’s architect DS+R with Natural Systems Utilities, SODAI, Aaron Betsky and Davide Oldani.
    Serving up canal-water espresso, the installation is a demonstration of how Venice itself can be a laboratory to understand how to live on the water in a time of water scarcity. The structure, located on the edge of the Arsenale’s building complex, draws water from its lagoon before filtering it onsite via a hybrid of natural and artificial methods, including a mini wetland with grasses.
    The project was recognised for its persistence, having started almost 20 years ago – showing how water scarcity, contamination and flooding remain major concerns both globally and, more locally, in the tourist-heavy city of Venice.
    And what else?
    Holy See Pavilion
    Much like the Danish Pavilion, the Pavilion of the Holy See is also taking an approach of renewal this year. Over the next six months, Opera Aperta will breathe new life into the Santa Maria Ausiliatrice Complex in the Castello district of Venice. Founded as a hospice for pilgrims in 1171, the building later became the oldest hospital and was converted into a school in the 18th century. In 2001, the City of Venice allocated it for cultural use, and for the next four years it will be managed by the Dicastery for Culture and Education of the Holy See to oversee its restoration.
    Curated by architect, curator and researcher Marina Otero Verzier and artistic director of Fondaco Italia, Giovanna Zabotti, the complex has been turned into a constant ‘living laboratory’ of collective repair – and received a special mention in the biennale awards.
    The restoration works, open from Tuesday to Friday, are being carried out by local artisans and specialised restorers with expertise in recovering stone, marble, terracotta, mural and canvas painting, stucco, wood and metal artworks.
    The beauty, however, lies in the photogenic fabrics, lit by a warm yellow glow, hanging from the walls within, gently wrapping the building’s surfaces, leaving openings that allow movement and offer glimpses of the ongoing restoration. Mobile scaffolding, used to support the works, also doubles up as furniture, providing space for equipment and subdividing the interior.
    Togo Pavilion
    The Republic of Togo has presented its first-ever pavilion at the biennale this year with the project Considering Togo’s Architectural Heritage, which sits intriguingly at the back of a second-hand furniture shop. The inaugural pavilion is curated by Lomé and Berlin-based Studio NEiDA and is in Venice’s Squero Castello.
    Exploring Togo’s architectural narratives from the early 20th century, and key ongoing restoration efforts, it documents key examples of the west African country’s heritage, highlighting both traditional and more modern building techniques – from Nôk cave dwellings to Afro-Brazilian architecture developed by freed slaves to post-independence Modernist buildings. Some of the buildings showcased, including the Hotel de la Paix and the Bourse du Travail, are in disrepair, even though most of the modern structures remain in use today – suggestive of a future of repair and celebration.
    Estonian Pavilion
    Another firm favourite this year is the Estonian exhibition on Riva dei Sette Martiri, on the waterfront between Corso Garibaldi and the Giardini. The Guardian’s Olly Wainwright said that, outside the Giardini, it packed ‘the most powerful punch of all’.
    Simple and effective, Let Me Warm You, curated by a trio of architects – Keiti Lige, Elina Liiva and Helena Männa – asks whether current insulation-driven renovations are merely a ‘checkbox’ to meet European energy targets or ‘a real chance’ to enhance the spatial and social quality of mass housing.
    The façade of the historic Venetian palazzetto in which it is housed is clad with fibre-cement insulation panels, using the same process applied in Estonia itself for its mass housing – a powerful visual statement showcasing a problematic disregard for the character and potential of typical habitable spaces. Inside, the ground floor is wrapped in plastic and exhibits how the dynamics between different stakeholders influence spatial solutions, with named stickers to encourage discussion among visitors.
    SMAC
    Timed to open to the public at the same time as the biennale, SMAC is a new permanent arts institution in Piazza San Marco, on the second floor of the Procuratie, which is owned by Generali. The exhibition space, open to the public for the first time in 500 years, comprises 16 galleries arranged along a continuous corridor stretching over 80m, recently restored by David Chipperfield Architects.
    Visitors can expect access through a private courtyard leading on to a monumental staircase and experience a typically sensitive Chipperfield restoration, which has revived the building’s original details: walls covered in a light grey Venetian marmorino made from crushed marble and floors of white terrazzo.
    During the summer, its inaugural programme features two solo exhibitions dedicated to Australian modern architect Harry Seidler and Korean landscape designer Jung Youngsun.
    Holcim x Elemental
    Concrete manufacturer Holcim makes an appearance for a third time at Venice, this time partnering with Elemental – the practice of Chilean Pritzker Prize-winner Alejandro Aravena, curator of the 2016 biennale – to launch a resilient housing prototype that follows on from the Norman Foster-designed Essential Homes Project.
    The ‘carbon-neutral’ structure incorporates ECOPact, Holcim’s range of low-carbon concrete, and is on display as part of the Time Space Existence exhibition organised by the European Cultural Centre in its gardens.
    It also applies Holcim’s ‘biochar’ technology – a concrete mix with 100 per cent recycled aggregates – for the first time, in a full-scale Basic Services Unit. This follows an incremental design approach, which could enable fast and efficient construction through the provision of only essential housing components, including via self-build.
    The Next Earth
    At Palazzo Diedo’s incredible dedicated Berggruen Arts and Culture space, MIT’s department of architecture and think tank Antikythera have come together to create the exhibition The Next Earth: Computation, Crisis, Cosmology, which questions how philosophy and architecture must and can respond to various planet-wide crises.
    Antikythera’s The Noocene: Computation and Cosmology from Antikythera to AI looks at the evolution of ‘planetary computation’ as an ‘accidental’ megastructure through which systems, from the molecular to the atmospheric scale, become both comprehensible and composable. What is actually on display is an architectural-scale video monolith and short films on AI, astronomy and artificial life, as well as selected artefacts. MIT’s Climate Work: Un/Worlding the Planet features 37 works-in-progress, each looking at material supply chains, energy expenditure, modes of practice and deep-time perspectives. Take from it what you will.
    The 19th International Venice Architecture Biennale remains open until Sunday, 23 November 2025.
Inside, the ground floor is wrapped in plastic and exhibits how the dynamics between different stakeholders influence spatial solutions, including named stickers to encourage discussion among your peers. Venice ProcuratieSMACTimed to open to the public at the same time as the biennale, SMAC is a new permanent arts institution in Piazza San Marco, on the second floor of the Procuratie, which is owned by Generali. The exhibition space, open to the public for the first time in 500 years, comprises 16 galleries arranged along a continuous corridor stretching over 80m, recently restored by David Chipperfield Architects. Visitors can expect access through a private courtyard leading on to a monumental staircase and experience a typically sensitive Chipperfield restoration, which has revived the building’s original details: walls covered in a light grey Venetian marmorino made from crushed marble and floors of white terrazzo. During the summer, its inaugural programme features two solo exhibitions dedicated to Australian modern architect Harry Seidler and Korean landscape designer Jung Youngsun. Holcim's installationHolcim x Elemental Concrete manufacturer Holcim makes an appearance for a third time at Venice, this time partnering with Chilean Pritzker Prize-winning Alejandro Aravena’s practice Elemental – curator of the 2016 biennale – to launch a resilient housing prototype that follows on from the Norman Foster-designed Essential Homes Project. The ‘carbon-neutral’ structure incorporates Holcim’s range of low-carbon concrete ECOPact and is on display as part of the Time Space Existence exhibition organised by the European Cultural Centre in their gardens. It also applies Holcim’s ‘biochar’ technology for the first time, a concrete mix with 100 per cent recycled aggregates, in a full-scale Basic Services Unit. 
This follows an incremental design approach, which could entail fast and efficient construction via the provision of only essential housing components, and via self-build. The Next Earth at Palazzo DiedoThe Next Earth At Palazzo Diedo’s incredible dedicated Berggruen Arts and Culture space, MIT’s department of architecture and think tank Antikytherahave come together to create the exhibition The Next Earth: Computation, Crisis, Cosmology, which questions how philosophy and architecture must and can respond to various planet-wide crises. Antikythera’s The Noocene: Computation and Cosmology from Antikythera to AI looks at the evolution of ‘planetary computation’ as an ‘accidental’ megastructure through which systems, from the molecular to atmospheric scales, become both comprehensible and composable. What is actually on display is an architectural scale video monolith and short films on AI, astronomy and artificial life, as well as selected artefacts. MIT’s Climate Work: Un/Worlding the Planet features 37 works-in-progress, each looking at material supply chains, energy expenditure, modes of practice and deep-time perspectives. Take from it what you will. The 19th International Venice Architecture Biennale remains open until Sunday, 23 November 2025. #venice #biennale #roundup #what #else
    WWW.ARCHITECTSJOURNAL.CO.UK
The pavilion’s façade is cloaked in a beaded veil of agricultural-waste briquettes and clay and glass beads, produced in Kenya and India, echoing both Maasai practices and the beads once made on Venice’s Murano as currency for the exchange of metals, minerals and slaves. The pavilion’s six gallery spaces include multisensory installations such as the Earth Compass, a series of celestial maps connecting London and Nairobi; the Rift Room, tracing one of humanity’s earliest migration routes; and the Shimoni Slave Cave, featuring a large-scale bronze cast of a valley cave historically used as a holding pen for enslaved people.

The show also includes Objects of Repair, a project by the design-led research group Palestine Regeneration Team (PART), looking at how salvaged materials could help rebuild war-torn Gaza. The only exhibit anywhere in the biennale to tackle the reconstruction of Gaza head-on, it does so impressively, both politically and sensitively. Read more here.

Denmark Pavilion (photography: Hampus Berndtson)

A firm favourite this year, the Danish exhibition Build of Site, curated by Søren Pihlmann of Pihlmann Architects, transforms the pavilion – which required renovation anyway – into both a renovation site and an archive of materials. Clever, simple and methodical: the building is being renewed while at the same time showcasing innovative methods for reusing the surplus materials uncovered during the construction process, as an alternative to consuming new resources to build a temporary exhibition.
The renovation of the 1950s Peter Koch-designed section of the pavilion began in December 2024 and will be completed after the biennale, having been suspended for its duration. On display are archetypal elements including podiums, ramps, benches and tables – all constructed from the surplus materials unearthed during the renovation, such as wood, limestone, concrete, stone, sand, silt and clay.

Belgium Pavilion (photography: Michiel De Cleene)

If you need a relaxing break from the intensity of the biennale, the oldest national pavilion in the Giardini is the one for you. Belgium’s Building Biospheres: A New Alliance between Nature and Architecture brings ‘plant intelligence’ to the fore. Commissioned by the Flanders Architecture Institute and curated by landscape architect Bas Smets and neurobiologist Stefano Mancuso, the exhibit investigates how the natural ‘intelligence’ of plants can be harnessed to produce an indoor climate – elevating the role of landscape design and calling for it no longer to serve as a mere backdrop to architecture. Inside, more than 200 plants occupy the central area beneath the skylight, becoming the pavilion’s centrepiece, while the rear space visualises ‘real-time’ data on the prototype’s climate-control performance.

Spain Pavilion (photography: Luca Capuano)

One for the pure architecture lovers out there: models (32!), installations, photographs and timber structures fill the Spanish Pavilion in abundance. Neatly curated by architects Roi Salgueiro Barrio and Manuel Bouzas Barcala, Internalities shows a series of existing and research projects that have contributed to decarbonising construction in Spain. The outcome? An extensive collection of work exploring very local, very specific regenerative and low-carbon construction techniques and materials – including stone, wood and soil.
The joy of this pavilion comes from the 16 beautiful timber frames constructed from wood from communal forests in Galicia.

Poland Pavilion (photography: Luca Capuano)

Poland’s pavilion was like Marmite this year: some loved its playful approach, while others found it silly. Lares and Penates, taking its name from the ancient Roman deities of protection, has been curated by Aleksandra Kędziorek and looks at what it means – and what it takes – to have a sense of security in architecture. Speaking to many different anxieties, it probes the unspoken assumption that architecture is a safe haven against the elements, catastrophes and wars – showcasing and elevating the mundane solutions and signage derived from building, fire and health regulations. The highlight? An ornate niche decorated with tiles and stones just for … a fire extinguisher.

Netherlands Pavilion (photography: Cristiano Corte)

Punchy and straight to the point, SIDELINED: A Space to Rethink Togetherness takes sport as a lens for examining how spatial design can both reveal and disrupt the often-exclusionary dynamics of everyday environments. The exhibit looks beyond the large-scale arena of the stadium and gymnasium to investigate the more localised and intimate context of the sports bar – a site of both social production and identity formation – as a metaphor for uniting diverse communities, alongside three alternative sports. The pavilion-turned-sports bar, designed by Koos Breen and Jeannette Slütter and inspired by Asger Jorn’s three-sided sports field, is a space for fluidity and experimentation where binary oppositions, social hierarchies and cultural values are contested and reshaped – complete with jerseys and football scarves (currently a must-have fashion item) worn by players in the alternative Anonymous Allyship, lining the walls. Read Derin Fadina’s review for the AJ here.
Performance inside the Nordic Countries Pavilion (photography: Venla Helenius)

Probably the most impactful national pavilion this year (and with the best tote bag by far), the Nordic Countries pavilion presents an installation with performance work. Curated by Kaisa Karvinen, Industry Muscle: Five Scores for Architecture continues Finnish artist Teo Ala-Ruona’s work on trans embodiment and ecology, considering the trans body as a lens through which to examine modern architecture and the built environment. The three-day exhibition opening featured a two-hour performance each day, with Ala-Ruona and his troupe crawling, climbing and writhing around the space, creating a bodily dialogue with the installations and with the pavilion building itself, designed by the celebrated Modernist architect Sverre Fehn. Next door, the American pavilion loudly (country music!) turns its back on what is going on in its own country by simply celebrating the apolitical porch, making the Nordic Countries seem all the more relevant at this crucial time. Read Derin Fadina’s review for the AJ here.

Germany Pavilion (photography: Luca Capuano)

An exhibit that grabs the issue of climate change by the neck is the German contribution, Stresstest. Curated by Nicola Borgmann, Elisabeth Endres, Gabriele G Kiefer and Daniele Santucci, the pavilion turns climate change into a literal physical and psychological experience for visitors by setting contrasting ‘stress’ and ‘de-stress’ rooms against each other. In the dark stress room, a large metal sculpture creates a cramped, hot space using heating mats hung from the ceiling and powered by photovoltaics. Opposite is a calmer space demonstrating strategies that could reduce the heat of cities, and between the two is a film on the impacts of ever-hotter cities. If this doesn’t highlight the urgency of the situation, I’m not sure what will.
Best bits of the Arsenale outside the main exhibitions

Bahrain Pavilion (photography: Andrea Avezzù)

Overall winner of this year’s Golden Lion for best national participation, Bahrain’s pavilion in the historic Artiglierie of the Arsenale is a proposal for living and working through conditions of extreme heat. Heatwave, curated by architect Andrea Faraguna, reimagines public space design by exploring passive cooling strategies rooted in the Arab country’s climate and cultural context. A geothermal well and a solar chimney are connected through a thermo-hygrometric axis that links underground conditions with the air outside. The inhabitable space that hosts visitors is thus compressed, defined by its earth-covered floor and suspended ceiling and surrounded by memorable sandbags – highlighting the design’s scalability for particularly hot construction sites in the Gulf, where a huge amount of construction is taking place. In the Arsenale’s exhibition space, where excavation wasn’t feasible, the system has been adapted to mechanical ventilation, drawing air in from the canal side and channelling it through ductwork to create a microclimate.

Slovenia Pavilion (photography: Andrea Avezzù)

The AJ’s Rob Wilson’s top pavilion tip this year offers an enjoyable take on the theme of the main exhibition, highlighting how the tacit knowledge and on-site techniques and skills of construction workers and craftspeople remain the key constituent of architectural production, despite all the heat and light around robotics, prefabrication, artificial intelligence and 3D printing. Master Builders, curated by Ana Kosi and Ognen Arsov and organised by the Museum of Architecture and Design (MAO) in Ljubljana, presents a series of ‘totems’ – accumulative, sculpture-like structures formed of conglomerations of differently worked materials, finishes and building elements.
These are stacked up into improbable tower forms that showcase various on-site construction skills and techniques, their construction documented in accompanying films.

Uzbekistan Pavilion (photography: Luca Capuano)

Uzbekistan’s contribution explores the country’s Soviet-era solar furnace and Modernist legacy. Architecture studio GRACE, led by curators Ekaterina Golovatyuk and Giacomo Cantoni, has curated A Matter of Radiance. The focus is the Sun Institute of Material Science – originally known as the Sun Heliocomplex – an incredible large-scale scientific structure built in 1987 on naturally seismically stable ground near Tashkent, and one of only two facilities in the world studying material behaviour under extreme temperatures. The exhibition examines the solar oven site’s historical and contemporary significance while reflecting on its scientific legacy and its influence beyond national borders.

V&A Applied Arts Pavilion (photography: Andrea Avezzù)

Diller Scofidio + Renfro (DS+R) is having a moment. The US-based practice, in collaboration with V&A chief curator Brendan Cormier, has curated On Storage, which aptly explores global storage architectures in a pavilion strongly linked to the V&A’s recent opening of Storehouse, its new (and free) collections archive in east London. Featured is a six-channel, six-screen film entitled Boxed: The Mild Boredom of Order, directed by the practice itself, which follows a toothbrush – a stand-in for the everyday consumer product – on its journey through different forms of storage across the globe, from warehouse to distribution centre to baggage handling down to the compact space of a suitcase. Also on display are large-format photographs of V&A East Storehouse, DS+R’s original architectural model and sketchbook, and behind-the-scenes photography of Storehouse at work, taken by emerging east London-based photographers.
Canal Café (photography: Marco Zorzanello)

The Golden Lion for the best participation in the main exhibition went to Canal Café, an intervention designed by V&A East Storehouse’s architect DS+R with Natural Systems Utilities, SODAI, Aaron Betsky and Davide Oldani. Serving up canal-water espresso, the installation demonstrates how Venice itself can be a laboratory for understanding how to live on the water in a time of water scarcity. The structure, located on the edge of the Arsenale’s building complex, draws water from the lagoon and filters it on site via a hybrid of natural and artificial methods, including a mini wetland planted with grasses. The project was recognised for its persistence – it began almost 20 years ago – showing how water scarcity, contamination and flooding remain major concerns both globally and, more locally, in the tourist-heavy city of Venice.

And what else?

The Holy See (photography: Andrea Avezzù)

Much like the Danish Pavilion, the Pavilion of the Holy See takes an approach of renewal this year. Over the next six months, Opera Aperta will breathe new life into the Santa Maria Ausiliatrice Complex in the Castello district of Venice. Founded as a hospice for pilgrims in 1171, the building later became the city’s oldest hospital and was converted into a school in the 18th century. In 2001 the City of Venice allocated it for cultural use, and for the next four years it will be managed by the Dicastery for Culture and Education of the Holy See, which is overseeing its restoration. Curated by architect, curator and researcher Marina Otero Verzier and Giovanna Zabotti, artistic director of Fondaco Italia, the complex has been turned into a constant ‘living laboratory’ of collective repair – and received a special mention in the biennale awards.
The restoration works, open to visitors from Tuesday to Friday, are being carried out by local artisans and specialised restorers with expertise in recovering stone, marble, terracotta, mural and canvas painting, stucco, wood and metal artworks. The beauty, however, lies in the photogenic fabrics hanging within, lit by a warm yellow glow, gently wrapping the building’s surfaces while leaving openings that allow movement and offer glimpses of the ongoing restoration. Mobile scaffolding, used to support the works, also doubles up as furniture, providing space for equipment and subdividing the interior.

Togo Pavilion (photography: Andrea Avezzù)

The Republic of Togo has presented its first ever pavilion at the biennale this year with Considering Togo’s Architectural Heritage, which sits intriguingly at the back of a second-hand furniture shop in Venice’s Squero Castello. Curated by the Lomé- and Berlin-based Studio NEiDA, the inaugural pavilion explores Togo’s architectural narratives from the early 20th century onwards, along with key ongoing restoration efforts. It documents key examples of the west African country’s heritage, highlighting both traditional and more modern building techniques – from Nôk cave dwellings, to the Afro-Brazilian architecture developed by freed slaves, to post-independence Modernist buildings. Some of the buildings showcased, including the Hotel de la Paix and the Bourse du Travail, are in disrepair, even though most of the modern structures remain in use today – suggestive of a future of repair and celebration.

Estonia Pavilion (photography: Joosep Kivimäe)

Another firm favourite this year is the Estonian exhibition on the Riva dei Sette Martiri, on the waterfront between Corso Garibaldi and the Giardini.
The Guardian’s Oliver Wainwright said that, outside the Giardini, it packed ‘the most powerful punch of all’. Simple and effective, Let Me Warm You, curated by a trio of architects – Keiti Lige, Elina Liiva and Helena Männa – asks whether current insulation-driven renovations are merely a ‘checkbox’ to meet European energy targets or ‘a real chance’ to enhance the spatial and social quality of mass housing. The façade of the historic Venetian palazzetto in which it is housed has been clad with fibre-cement insulation panels, using the same process applied to mass housing in Estonia itself – a powerful visual statement of a problematic disregard for the character and potential of typical habitable spaces. Inside, the ground floor is wrapped in plastic and shows how the dynamics between different stakeholders influence spatial solutions, with name stickers to encourage discussion among visitors.

Venice Procuratie (photography: Mike Merkenschlager)

SMAC (San Marco Art Centre)

Timed to open to the public at the same time as the biennale, SMAC is a new permanent arts institution in Piazza San Marco, on the second floor of the Procuratie, which is owned by Generali. The exhibition space, open to the public for the first time in 500 years and recently restored by David Chipperfield Architects, comprises 16 galleries arranged along a continuous corridor stretching over 80m. Visitors enter through a private courtyard leading to a monumental staircase, and can expect a typically sensitive Chipperfield restoration, which has revived the building’s original details: walls covered in a light grey Venetian marmorino made from crushed marble, and floors of white terrazzo. During the summer, its inaugural programme features two solo exhibitions dedicated to the Australian modern architect Harry Seidler and the Korean landscape designer Jung Youngsun.
Holcim’s installation (photography: Celestia Studio)

Holcim x Elemental

Concrete manufacturer Holcim makes its third appearance at Venice, this time partnering with Elemental – the practice of Chilean Pritzker Prize winner Alejandro Aravena, curator of the 2016 biennale – to launch a resilient housing prototype that follows on from the Norman Foster-designed Essential Homes Project. The ‘carbon-neutral’ structure incorporates ECOPact, Holcim’s range of low-carbon concrete, and is on display as part of the Time Space Existence exhibition organised by the European Cultural Centre in its gardens. It also applies Holcim’s ‘biochar’ technology – a concrete mix with 100 per cent recycled aggregates – for the first time, in a full-scale Basic Services Unit. This follows an incremental design approach, which could enable fast and efficient construction through the provision of only essential housing components, including via self-build.

The Next Earth at Palazzo Diedo (photography: Joan Porcel)

At Palazzo Diedo’s impressive dedicated Berggruen Arts and Culture space, MIT’s department of architecture and the think tank Antikythera (apparently taking its name from the first known computer) have come together to create The Next Earth: Computation, Crisis, Cosmology, an exhibition that asks how philosophy and architecture can, and must, respond to various planet-wide crises. Antikythera’s The Noocene: Computation and Cosmology from Antikythera to AI looks at the evolution of ‘planetary computation’ as an ‘accidental’ megastructure through which systems, from the molecular to the atmospheric scale, become both comprehensible and composable. What is actually on display: an architectural-scale video monolith and short films on AI, astronomy and artificial life, as well as selected artefacts. MIT’s Climate Work: Un/Worlding the Planet features 37 works-in-progress, each looking at material supply chains, energy expenditure, modes of practice and deep-time perspectives.
Take from it what you will. The 19th International Venice Architecture Biennale remains open until Sunday, 23 November 2025.
  • Dell, Nvidia, and Department of Energy join forces on "Doudna" supercomputer for science and AI

    What just happened? The Department of Energy has announced plans for a new supercomputer designed to significantly accelerate research across a wide range of scientific fields. The initiative highlights the growing convergence between commercial AI development and the computational demands of cutting-edge scientific discovery.
    The advanced system, to be housed at Lawrence Berkeley National Laboratory and scheduled to become operational in 2026, will be named "Doudna" in honor of Nobel laureate Jennifer Doudna, whose groundbreaking work on CRISPR gene editing has revolutionized molecular biology.
    Dell Technologies has been selected to deliver the Doudna supercomputer, marking a significant shift in the landscape of government-funded high-performance computing.
    While companies like Hewlett Packard Enterprise have traditionally dominated this space, Dell's successful bid signals a new chapter. "A big win for Dell," said Addison Snell, CEO of Intersect360 Research, in an interview with The New York Times, noting the company's historically limited presence in this domain.
    Dell executives explained that the Doudna project enabled them to move beyond the longstanding practice of building custom systems for individual laboratories. Instead, they focused on developing a flexible platform capable of serving a broad array of users. "This market had shifted into some form of autopilot. What we did was disengage the autopilot," said Paul Perez, senior vice president and technology fellow at Dell.

    The Perlmutter supercomputer at the National Energy Research Scientific Computing Center at Lawrence Berkeley National Laboratory.
    A defining feature of Doudna will be its use of Nvidia's Vera Rubin platform, engineered to combine the strengths of traditional scientific simulations with the power of modern AI. Unlike previous Department of Energy supercomputers, which relied on processors from Intel or AMD, Doudna will incorporate a general-purpose Arm-based CPU from Nvidia, paired with the company's Rubin AI chips designed specifically for artificial intelligence and simulation workloads.

    The architecture aims to meet the needs of the laboratory's 11,000 users, who increasingly depend on both high-precision modeling and rapid AI-driven data analysis.
    Jensen Huang, founder and CEO of Nvidia, described the new system with enthusiasm. "Doudna is a time machine for science – compressing years of discovery into days," he said, adding that it will let "scientists delve deeper and think bigger to seek the fundamental truths of the universe."
    In terms of performance, Doudna is expected to be over 10 times faster than the lab's current flagship system, making it the Department of Energy's most powerful resource for training AI models and conducting advanced simulations. Jonathan Carter, associate lab director for computing sciences at Berkeley Lab, said the system's architecture was shaped by the evolving needs of researchers – many of whom are now using AI to augment simulations in areas like geothermal energy and quantum computing.
    Doudna's design reflects a broader shift in supercomputing. Traditional systems have prioritized 64-bit calculations for maximum numerical accuracy, but modern AI workloads often benefit from lower-precision operations that enable faster processing speeds. Dion Harris, Nvidia's head of data center product marketing, noted that the flexibility to combine different levels of precision opens new frontiers for scientific research.
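    The accuracy-versus-speed trade-off behind that design choice can be seen in a few lines of NumPy. This is an illustrative sketch only, not Doudna's actual software stack: it shows what each floating-point format can resolve, how small contributions vanish at lower precision, and the memory saving that lower-precision hardware exploits.

    ```python
    import numpy as np

    # Machine epsilon: the smallest relative step each format can resolve.
    for dt in (np.float64, np.float32, np.float16):
        info = np.finfo(dt)
        print(f"{dt.__name__}: {info.bits} bits, eps = {info.eps:.2e}")

    # Consequence: a small contribution can be absorbed entirely at low precision.
    print(np.float32(1e8) + np.float32(1.0) == np.float32(1e8))  # True: the 1.0 is lost
    print(np.float64(1e8) + np.float64(1.0) == np.float64(1e8))  # False: still resolved

    # The payoff: half the bytes per value, so less memory traffic per operation,
    # which is what lets AI-oriented hardware run low-precision math faster.
    x = np.ones(1_000_000, dtype=np.float64)
    print(x.nbytes, x.astype(np.float32).nbytes)
    ```

    A 16-bit value needs a quarter of the bandwidth of a 64-bit one, which is why mixed-precision pipelines reserve 64-bit arithmetic for the steps where accuracy genuinely matters.
    
    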
    The supercomputer will also be tightly integrated with the Energy Sciences Network, allowing researchers nationwide to stream data directly into Doudna for real-time analysis. Sudip Dosanjh, director of the National Energy Research Scientific Computing Center, described the new system as "designed to accelerate a broad set of scientific workflows."
    WWW.TECHSPOT.COM
    Dell, Nvidia, and Department of Energy join forces on "Doudna" supercomputer for science and AI
  • Three circular volumes create Villa Noon in Sotogrande designed by Fran Silvestre Arquitectos

    Submitted by WA Contents

    Spain Architecture News - May 30, 2025 - 12:29  

    Valencia-based architecture practice Fran Silvestre Arquitectos has revealed the design for a house composed of three circular volumes in Sotogrande, Spain.

    Named Villa Noon, the house is thoughtfully incorporated into a topographically defined setting, utilizing the slope of the ground to open each volume onto a distinct horizon. To achieve visual harmony with the natural environment without leaving an unwelcome footprint on the landscape, the house's layout, divided into five circular sections, was designed to blend seamlessly with the surroundings. Each of these pieces has a rear patio that offers protection from the strongest winds and a front terrace that shields from the sun. In addition to optimizing orientation, this circular geometry reduces the volumetric impact, enabling the building to interact with the landscape in a controlled and deliberate manner.

    The exterior materiality gradually integrates into its surroundings because it is built from indigenous Sierra Elvira stone, with its white veins and grey tones. This organic texture will eventually blend in as though it were a natural feature of the relief of the ground.

    The entryway is a spacious, open vestibule, surrounded by a curved wall, that serves as a threshold between the outside and the inside of the villa. This transitional area introduces visitors to a series of chambers that adjust to the various terrain levels. While the day area unfolds on a lower platform, open to the landscape and directly connected to the outdoors, the night area sits on a higher level, apart from the other functions. The well-being areas, such as the gym, are located on the same floor and provide both practical and visual connections to the separate visitor area. This tiered arrangement protects each zone's privacy while keeping the composition's overall spatial continuity flexible.

    By combining geothermal and aerothermal technologies, the house produces an excess of electricity and becomes energy self-sufficient. It also includes a mechanism for atmospheric water condensation, which draws moisture from the air for household use.

    Water usage in the garden is cut down through techniques including choosing native plant species based on their water requirements, using natural mulch to prevent evaporation, and installing a drip irrigation system that only turns on when required. Green roofs enhance insulation and collect rainfall, while infiltration trenches, also known as swales, filter and direct it. Permeable surfaces and cisterns complete this system, enabling the collected water to be stored and reused. By taking these steps, the house also becomes self-sufficient in water, a very reasonable objective in this region of Spain, which is the wettest in the nation thanks to the Sierra de Grazalema.

    "We have always been fascinated by how the Namib Desert beetle collects water: in an extremely arid environment, this insect tilts its body into the wind to condense fog on its shell, whose surface combines areas that attract water and others that repel it, allowing the droplets to slide directly into its mouth," said Fran Silvestre Arquitectos. "A natural lesson in efficiency that inspires and reinforces the logic of this system," the firm added.

    The idea is reminiscent of architectural works such as Kazuyo Sejima's Villa in the Forest and Arne Jacobsen's Leo Henriksen House, whose circular shapes and attention to the environment served as inspiration. In contrast to radiocentric solutions, this proposal chooses what the firm calls "the squaring of the circle": service areas are incorporated into irregularly shaped zones, while residential spaces are resolved through an orthogonal floor plan. In the end, the architects anticipate that this architecture will blend in with its surroundings over time, appearing to be part of a karstic relief.

    [Drawings: sketch, roof level plan, first floor plan, ground floor plan, basement floor plan, section]

    Recently, Fran Silvestre Arquitectos unveiled the design for a winery with a curvaceous form addressing the winemaking process in Zayas de Báscones, Soria, Spain. The firm also completed a house featuring irregularly shifted volumes on an irregularly shaped plot within Altos de Valderrama, in Sotogrande, Spain.

    Project facts
    Project name: Villa Noon
    Architects: Fran Silvestre Arquitectos
    Location: Sotogrande, Spain
    Developer: Cork Oak Mansion

    All renderings & drawings courtesy of Fran Silvestre Arquitectos.

    > via Fran Silvestre Arquitectos
    WORLDARCHITECTURE.ORG
  • Opinion: Europe must warm up to geothermal before it’s too late

    While Europe races to phase out fossil fuels and electrify everything from cars to heating systems, it’s turning a blind eye to a reliable and proven source of clean energy lying right beneath our feet. 
    Geothermal energy offers exactly what the continent needs most: clean, local, always-on power. Yet, it only accounted for 0.2% of power generation on the continent in 2024. Something needs to change.
    The recent blackout in Spain, triggered by a failure in the high-voltage grid, serves as a warning shot. While solar and wind are vital pillars of decarbonisation, they’re variable by nature. Without steady, around-the-clock sources of electricity, Europe risks swapping one form of energy insecurity for another.
    A much bigger wake-up call came in 2022 when Russia launched a full-scale invasion of Ukraine. For years, European governments had built an energy system dependent on imports of natural gas. When that house of cards collapsed, it triggered an energy crisis that exposed the vulnerable underbelly of Europe’s power system.
    The answer to these problems lies, in part, a few kilometres underground. According to the International Energy Agency, geothermal energy has the potential to power the planet 150 times over. But it’s not just about electricity — geothermal can also deliver clean, reliable heat. That makes it especially valuable in Europe, where millions of homes already rely on radiators and district heating systems, many of them still powered by natural gas.
    Geothermal plants also come with a smaller footprint. They require far less land than an equivalent solar farm or wind park. What’s more, the materials and infrastructure needed to build them — like drilling rigs and turbines — can be largely sourced locally. That’s a sharp contrast to solar panels and batteries, most of which are imported from China.  
    Geothermal energy is not theoretical. It doesn’t require scientific breakthroughs. We’ve been drilling wells and extracting energy from the Earth for centuries. The know-how exists, and so does the workforce.
    Decades of oil and gas exploration have built a deep bench of geologists, drillers, reservoir engineers, and project managers. Instead of letting this expertise fade, we can redeploy it to build geothermal plants. The infrastructure, such as drilling rigs, can also be repurposed for a cleaner cause. Geothermal could be the ultimate redemption arc for oil and gas.
    Sure, drilling deep isn’t cheap — yet. But a new crop of startups is rewriting the playbook. Armed with everything from plasma pulse drills to giant radiators, these companies could finally crack the cost barrier — and make geothermal available pretty much anywhere. Just as SpaceX disrupted a sclerotic rocket industry with its cheap launches, these startups are poised to succeed where the geothermal industry has failed. 
    All that’s missing is investment. While billions are being funnelled into high-risk technologies like fusion or nuclear fission reactors, funding for geothermal tech is minuscule in comparison, especially in Europe. Yet, unlike those technologies, geothermal is ready right now.  
    If Europe wants to achieve climate neutrality and energy sovereignty, it must stop ignoring geothermal. We need bold investment, regulatory reform, and a clear signal to industry: don’t let geothermal become a forgotten renewable.
    Grid failures, missed climate targets, deeper energy dependence — these are the risks Europe faces. It’s time to start drilling, before it’s too late. 

    Story by

    Siôn Geschwindt

    Siôn is a freelance science and technology reporter, specialising in climate and energy. From nuclear fusion breakthroughs to electric vehicles, he's happiest sourcing a scoop, investigating the impact of emerging technologies, and even putting them to the test. He has five years of journalism experience and holds a dual degree in media and environmental science from the University of Cape Town, South Africa. When he's not writing, you can probably find Siôn out hiking, surfing, playing the drums or catering to his moderate caffeine addiction. You can contact him at: sion.geschwindt@protonmail.com

    Get the TNW newsletter
    Get the most important tech news in your inbox each week.

    Also tagged with
    #opinion #europe #must #warm #geothermal
    Opinion: Europe must warm up to geothermal before it’s too late
    While Europe races to phase out fossil fuels and electrify everything from cars to heating systems, it’s turning a blind eye to a reliable and proven source of clean energy lying right beneath our feet.  Geothermal energy offers exactly what the continent needs most: clean, local, always-on power. Yet, it only accounted for 0.2% of power generation on the continent in 2024. Something needs to change. The recent blackout in Spain, triggered by a failure in the high-voltage grid, serves as a warning shot. While solar and wind are vital pillars of decarbonisation, they’re variable by nature. Without steady, around-the-clock sources of electricity, Europe risks swapping one form of energy insecurity for another. A much bigger wake-up call came in 2022 when Russia launched a full-scale invasion of Ukraine. For years, European governments had built an energy system dependent on imports of natural gas. When that stack of cards shattered, it triggered an energy crisis that exposed the vulnerable underbelly of Europe’s power system.  The answer to these problems lies, in part, a few kilometres underground. According to the International Energy Agency, geothermal energy has the potential to power the planet 150 times over. But it’s not just about electricity — geothermal can also deliver clean, reliable heat. That makes it especially valuable in Europe, where millions of homes already rely on radiators and district heating systems, many of them still powered by natural gas. Geothermal plants also come with a smaller footprint. They require far less land than an equivalent solar farm or wind park. What’s more, the materials and infrastructure needed to build them — like drilling rigs and turbines — can be largely sourced locally. That’s a sharp contrast to solar panels and batteries, most of which are imported from China.   Geothermal energy is not theoretical. It doesn’t require scientific breakthroughs. 
We’ve been drilling wells and extracting energy from the Earth for centuries. The know-how exists, and so does the workforce. Decades of oil and gas exploration have built a deep bench of geologists, drillers, reservoir engineers, and project managers. Instead of letting this expertise fade, we can redeploy it to build geothermal plants. The infrastructure, such as drilling rigs, can also be repurposed for a cleaner cause. Geothermal could be the ultimate redemption arc for oil and gas. Sure, drilling deep isn’t cheap — yet. But a new crop of startups is rewriting the playbook. Armed with everything from plasma pulse drills to giant radiators, these companies could finally crack the cost barrier — and make geothermal available pretty much anywhere. Just as SpaceX disrupted a sclerotic rocket industry with its cheap launches, these startups are poised to succeed where the geothermal industry has failed.  All that’s missing is investment. While billions are being funnelled into high-risk technologies like fusion or nuclear fission reactors, funding for geothermal tech is minuscule in comparison, especially in Europe. Yet, unlike those technologies, geothermal is ready right now.   If Europe wants to achieve climate neutrality and energy sovereignty, it must stop ignoring geothermal. We need bold investment, regulatory reform, and a clear signal to industry: don’t let geothermal become a forgotten renewable. Grid failures, missed climate targets, deeper energy dependence — these are the risks Europe faces. It’s time to start drilling, before it’s too late.  Want to discover the next big thing in tech? Then take a trip to TNW Conference, where thousands of founders, investors, and corporate innovators will share their ideas. The event takes place on June 19–20 in Amsterdam and tickets are on sale now. Use the code TNWXMEDIA2025 at the checkout to get 30% off. 
Story by Siôn Geschwindt Siôn is a freelance science and technology reporter, specialising in climate and energy. From nuclear fusion breakthroughs to electric vehicSiôn is a freelance science and technology reporter, specialising in climate and energy. From nuclear fusion breakthroughs to electric vehicles, he's happiest sourcing a scoop, investigating the impact of emerging technologies, and even putting them to the test. He has five years of journalism experience and holds a dual degree in media and environmental science from the University of Cape Town, South Africa. When he's not writing, you can probably find Siôn out hiking, surfing, playing the drums or catering to his moderate caffeine addiction. You can contact him at: sion.geschwindtprotonmailcom Get the TNW newsletter Get the most important tech news in your inbox each week. Also tagged with #opinion #europe #must #warm #geothermal
    THENEXTWEB.COM
    Opinion: Europe must warm up to geothermal before it’s too late
    While Europe races to phase out fossil fuels and electrify everything from cars to heating systems, it’s turning a blind eye to a reliable and proven source of clean energy lying right beneath our feet. Geothermal energy offers exactly what the continent needs most: clean, local, always-on power. Yet it accounted for only 0.2% of power generation on the continent in 2024. Something needs to change. The recent blackout in Spain, triggered by a failure in the high-voltage grid, serves as a warning shot. While solar and wind are vital pillars of decarbonisation, they’re variable by nature. Without steady, around-the-clock sources of electricity, Europe risks swapping one form of energy insecurity for another. A much bigger wake-up call came in 2022, when Russia launched a full-scale invasion of Ukraine. For years, European governments had built an energy system dependent on imports of natural gas. When that house of cards collapsed, it triggered an energy crisis that exposed the vulnerable underbelly of Europe’s power system. The answer to these problems lies, in part, a few kilometres underground. According to the International Energy Agency, geothermal energy has the potential to power the planet 150 times over. But it’s not just about electricity — geothermal can also deliver clean, reliable heat. That makes it especially valuable in Europe, where millions of homes already rely on radiators and district heating systems, many of them still powered by natural gas. Geothermal plants also come with a smaller footprint. They require far less land than an equivalent solar farm or wind park. What’s more, the materials and infrastructure needed to build them — like drilling rigs and turbines — can be largely sourced locally. That’s a sharp contrast to solar panels and batteries, most of which are imported from China. Geothermal energy is not theoretical. It doesn’t require scientific breakthroughs. 
We’ve been drilling wells and extracting energy from the Earth for centuries. The know-how exists, and so does the workforce. Decades of oil and gas exploration have built a deep bench of geologists, drillers, reservoir engineers, and project managers. Instead of letting this expertise fade, we can redeploy it to build geothermal plants. The infrastructure, such as drilling rigs, can also be repurposed for a cleaner cause. Geothermal could be the ultimate redemption arc for oil and gas. Sure, drilling deep isn’t cheap — yet. But a new crop of startups is rewriting the playbook. Armed with everything from plasma pulse drills to giant radiators, these companies could finally crack the cost barrier — and make geothermal available pretty much anywhere. Just as SpaceX disrupted a sclerotic rocket industry with its cheap launches, these startups are poised to succeed where the geothermal industry has failed. All that’s missing is investment. While billions are being funnelled into high-risk technologies like fusion or nuclear fission reactors, funding for geothermal tech is minuscule in comparison, especially in Europe. Yet, unlike those technologies, geothermal is ready right now. If Europe wants to achieve climate neutrality and energy sovereignty, it must stop ignoring geothermal. We need bold investment, regulatory reform, and a clear signal to industry: don’t let geothermal become a forgotten renewable. Grid failures, missed climate targets, deeper energy dependence — these are the risks Europe faces. It’s time to start drilling, before it’s too late.
Story by Siôn Geschwindt. Siôn is a freelance science and technology reporter, specialising in climate and energy. From nuclear fusion breakthroughs to electric vehicles, he's happiest sourcing a scoop, investigating the impact of emerging technologies, and even putting them to the test. He has five years of journalism experience and holds a dual degree in media and environmental science from the University of Cape Town, South Africa. When he's not writing, you can probably find Siôn out hiking, surfing, playing the drums or catering to his moderate caffeine addiction. You can contact him at: sion.geschwindt [at] protonmail [dot] com
  • AI could keep us dependent on natural gas for decades to come

    The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers and after that to other industries that might want to occupy more than a thousand acres just off the interstate. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive multibillion-dollar data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution! The AI data center also promises to transform the state’s energy future. Stretching for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone. When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity. To power the data center, Entergy aims to spend billions of dollars to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. 
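As a rough sanity check on the “decent-size city” comparison, the 2 GW figure from the article can be converted into annual consumption and a household equivalent. The household figure below (roughly 10,500 kWh per year, a typical US average) is an outside assumption, not a number from the article:

```python
# Back-of-the-envelope check on the "decent-size city" claim.
# The 2 GW continuous draw is from the article; the household
# consumption figure is an assumed typical US average.
compute_load_gw = 2.0
hours_per_year = 8760

annual_twh = compute_load_gw * hours_per_year / 1000   # GWh -> TWh

kwh_per_household = 10_500   # assumption: approximate US average
households = compute_load_gw * 1e6 * hours_per_year / kwh_per_household

print(f"annual consumption: {annual_twh:.1f} TWh")
print(f"equivalent households: ~{households / 1e6:.1f} million")
```

Under these assumptions, a constant 2 GW draw works out to about 17.5 TWh a year, on the order of 1.7 million average homes — city-scale, as the article says.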
In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center.
    Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company's plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.”
    The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon. The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost, and you know how to scale it and get it approved,” says Victor. “Even for companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.” The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65 to 85% of projected load growth. “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. 
If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.” But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands. The reign of natural gas Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, prices began to fall from their highs; last year, natural gas averaged the lowest annual price ever reported, according to the US Energy Information Administration.

    Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit. Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.   But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify. Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. 
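The “roughly half as much carbon dioxide” comparison can be sanity-checked with typical per-kWh emission factors. The factors below are illustrative outside assumptions (common published approximations for US plants), not figures from the article:

```python
# Sanity check on "natural gas burns much cleaner than coal,
# producing roughly half as much carbon dioxide".
# Emission factors are assumed typical values, not from the article.
COAL_KG_CO2_PER_KWH = 1.0    # conventional coal steam plant, approx.
GAS_KG_CO2_PER_KWH = 0.45    # combined-cycle gas plant, approx.

ratio = GAS_KG_CO2_PER_KWH / COAL_KG_CO2_PER_KWH
print(f"gas emits roughly {ratio:.0%} of coal's CO2 per kWh")
```

Note that this counts only combustion CO2 at the plant; the methane leakage from drilling and pipelines that the passage goes on to describe is excluded, and narrows the real climate gap between the two fuels.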
Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035, relative to a future in which the use of fossil fuel gradually winds down. Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology. But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up? Times of stress AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. 
There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US. 
    Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.
    The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke's Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models, can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. 
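The headroom argument can be sketched with a toy load-duration curve. All of the numbers below are invented for illustration (a 100 GW system limit and an existing load that spikes only during 100 stressed hours); this is a sketch of the logic, not the Duke team's model:

```python
# Toy illustration of the "headroom" argument: how much constant new
# load fits under a fixed capacity limit, with and without the new
# load curtailing to 50% during its 100 most-stressed hours.
# All numbers are invented for illustration.
import numpy as np

capacity = 100.0                           # GW, assumed system limit
base = np.full(8760, 80.0)                 # hourly existing load, GW
base[:100] = np.linspace(98.0, 80.0, 100)  # the stressed hours

def max_new_load(base, capacity, curtail_to=1.0, curtail_hours=0):
    """Largest constant new load L keeping base + L <= capacity, when
    L may drop to curtail_to * L during the curtail_hours tightest hours."""
    headroom = np.sort(capacity - base)          # ascending: tightest first
    limit_rest = headroom[curtail_hours:].min()  # full power must fit here
    if curtail_hours == 0:
        return limit_rest
    limit_worst = headroom[:curtail_hours].min() / curtail_to
    return min(limit_rest, limit_worst)

inflexible = max_new_load(base, capacity)
flexible = max_new_load(base, capacity, curtail_to=0.5, curtail_hours=100)
print(f"inflexible: {inflexible:.1f} GW, flexible: {flexible:.1f} GW")
```

In this toy grid, an always-on load is sized to the single worst hour (2 GW), while a load that can shed half its draw for 100 hours a year fits twice as much (4 GW) — the same logic behind the Duke estimate that modest curtailment could unlock tens of gigawatts nationally.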
It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI, a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.
    Footing the bill Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas.  In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.
    Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Meta’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.” In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants, but didn't respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. 
It also says it is investing in new infrastructure to support the Richland Parish data centers, including roads and water systems. Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayers and small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tell […] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. 
But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come.
Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models, can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI, a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. 
“The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says. Footing the bill Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas.  
In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much. Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is sayingneeds around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to takeword for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”       In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but "as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. 
Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants but didn't respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over million to support the Richland Parish data centers with new infrastructure, including roads and water systems.  Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayerssmall businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. 
In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tellwhat they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come. #could #keep #dependent #natural #gas
  • AI could keep us dependent on natural gas for decades to come

    WWW.TECHNOLOGYREVIEW.COM
    The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution! The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity. To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. 
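Some rough arithmetic puts that two-gigawatt computation figure in perspective. The sketch below is a back-of-envelope comparison; the ~10,500 kWh per year for an average US household is an assumed ballpark, not a number from the article:

```python
# Back-of-envelope: what a constant 2 GW computation load means in
# household terms. The household figure (~10,500 kWh/year) is an assumed
# ballpark, not a number from the article.
HOURS_PER_YEAR = 8760

data_center_gw = 2.0                                 # Meta's stated computation load
annual_twh = data_center_gw * HOURS_PER_YEAR / 1000  # GW * h -> GWh; /1000 -> TWh

household_kwh_per_year = 10_500                      # assumption
equivalent_homes = annual_twh * 1e9 / household_kwh_per_year

print(f"{annual_twh:.2f} TWh/year, roughly {equivalent_homes/1e6:.1f} million homes")
```

That is city-scale consumption, consistent with the article's "decent-size city" comparison.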
In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24/7 electricity demand from the huge data center. Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.” The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon. The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. 
“The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.” The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65 to 85% of projected load growth. “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.” But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. 
The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.
The reign of natural gas
Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for $13 per million Btu (a measure of thermal energy); last year, it averaged just $2.21, the lowest annual price (adjusting for inflation) ever reported, according to the US Energy Information Administration (EIA). Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit. Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal. But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. 
Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify. Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of fossil fuel gradually winds down. Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.   But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up? 
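For a sense of what locking in those plants means for emissions, here is a rough sketch for Entergy's proposed 2.3 gigawatts. The capacity factor and the ~0.4 tonnes of CO2 per MWh for combined-cycle gas are assumed round numbers, not figures from the article:

```python
# Rough annual CO2 from 2.3 GW of new gas capacity. The 2.3 GW is from the
# article; the 60% capacity factor and 0.4 tCO2/MWh combined-cycle emission
# rate are assumptions for illustration.
capacity_mw = 2_300
hours_per_year = 8760
capacity_factor = 0.60      # assumed: high utilization for an always-on customer
tco2_per_mwh = 0.4          # assumed combined-cycle emission rate

annual_mwh = capacity_mw * hours_per_year * capacity_factor
annual_mt_co2 = annual_mwh * tco2_per_mwh / 1e6    # tonnes -> million tonnes

print(f"~{annual_mt_co2:.1f} million tonnes of CO2 per year")
```

On those assumptions, the three plants alone would add several million tonnes a year, which is why the CCS upgrades discussed above matter.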
Times of stress
AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US. Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave. The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. 
But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. 
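The headroom argument is easy to demonstrate with a toy model. In the sketch below, every number (a 100 GW-scale grid, four synthetic weather spells, a 6 GW block of new data centers, a 100-hour curtailment budget) is invented for illustration and is not from the Duke study; only the mechanism mirrors the finding:

```python
import numpy as np

# Toy "curtailment-enabled headroom" model. All numbers are invented for
# illustration; only the idea (halve the new load in the worst ~100 hours)
# mirrors the Duke finding described in the article.
hours = np.arange(8760)
base = 55.0 + 8 * np.sin(2 * np.pi * hours / 24)   # GW, daily demand cycle
base[hours % 2190 < 20] += 25                      # four 20-hour weather spells

new_load = 6.0                                     # GW of new data centers
inflexible_peak = (base + new_load).max()          # data centers run flat-out

# Flexible: the new load halves itself during the 100 highest-demand hours.
peak_hours = np.zeros(hours.size, dtype=bool)
peak_hours[np.argsort(base)[-100:]] = True
flexible_peak = (base + np.where(peak_hours, new_load / 2, new_load)).max()

print(f"peak with inflexible data centers: {inflexible_peak:.0f} GW")
print(f"peak with flexible data centers:   {flexible_peak:.0f} GW")
```

The flexible peak comes out 3 GW lower, exactly the half of the new load that was shed in the worst hours; on a real grid, that difference is capacity no one has to build.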
Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies, including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.
Footing the bill
Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. 
The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas. In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much. Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.” In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. 
Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants but didn’t respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems. Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. 
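The stranded-cost worry can be made concrete with a deliberately simplified calculation. Straight-line depreciation is an assumption here; real utility ratemaking adds financing costs, taxes, and a regulated return, but the shape of the problem is the same:

```python
# Simplified view of Entergy's $3.2B investment against a 15-year contract.
# Straight-line depreciation over the ~30-year plant life cited in the
# article is an assumption; actual utility cost recovery is more complex.
investment_m = 3_200        # $ millions, from the article
plant_life_years = 30       # typical lifetime cited in the article
contract_years = 15         # length of Meta's supply contract

annual_depreciation = investment_m / plant_life_years
recovered = annual_depreciation * contract_years
remaining_balance = investment_m - recovered

print(f"recovered during the contract:   ${recovered:,.0f}M")
print(f"undepreciated balance after 15y: ${remaining_balance:,.0f}M")
```

On these assumptions, roughly half the book value, about $1.6 billion, would still be unrecovered when the contract ends; who pays for that is precisely the alliance's question.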
In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come.
  • The data center boom in the desert

In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.

Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet. EMILY NAJERA

The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects. But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.”
    Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.  That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe.
It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.

Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center. EMILY NAJERA

The build-out of a dense cluster of energy- and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.

“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah. “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”

Luring data centers

In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode.
But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills. 
He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner.

In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped clinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt.

On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.

Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants. GREGG SEGAL

After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.
Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced a multimillion-dollar investment to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park.

When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part.
“We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there are the generous tax policies. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities.

In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park.

Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.

The rain shadow

The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes. But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region farther east while allowing the rest to continue north toward Pyramid Lake. Along the way, an engineered system of reservoirs, canals, and treatment plants diverts, stores, and releases water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada. EMILY NAJERA

Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought. About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.

It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades. “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education.

That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California. These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds.
“We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.

Thirsty data centers

Data centers suck up water in two main ways.
As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat.

To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside.

What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study.

Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities.
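The scale of that indirect estimate is easy to sanity-check with back-of-envelope arithmetic. In the sketch below, the six-gigawatt figure comes from the capacity requested in NV Energy’s filings; the round-the-clock operation and the roughly 295 gallons-per-megawatt-hour grid intensity are illustrative assumptions (back-calculated for this sketch, not taken from the study):

```python
# Rough sanity check on the indirect water estimate for the requested
# data-center capacity. The water-intensity figure is an assumption.
capacity_gw = 6.0                 # capacity requested in NV Energy filings
hours_per_year = 8760             # assumes data centers run around the clock
annual_mwh = capacity_gw * 1_000 * hours_per_year   # 52,560,000 MWh

gal_per_mwh = 295                 # assumed grid-average water consumption
indirect_gallons = annual_mwh * gal_per_mwh

print(f"{indirect_gallons / 1e9:.1f} billion gallons per year")  # 15.5
```

Swapping in a different intensity shows how strongly the estimate depends on the mix of power plants supplying the facilities.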
Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration. Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research. But here too, the water usage varies depending on the type of geothermal plant in question.

Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do.

Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It’s also forging ahead with a more than billion-dollar transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.
NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center. EMILY NAJERA

“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement.

An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds. Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”

Securing supplies

On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction. We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip. Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks.
That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants. EMILY NAJERA

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.

Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system. But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company.
As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline. Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise.

Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis. When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”

Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.
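The 4-billion-gallon total cited above pencils out directly from the service plan’s acre-foot figures. A quick conversion sketch, using the standard 325,851 gallons per acre-foot (the category labels are paraphrases of the plan, not official designations):

```python
# Convert the general improvement district's permitted supplies from
# acre-feet per year to gallons per year. Labels are paraphrased.
GALLONS_PER_ACRE_FOOT = 325_851  # standard US conversion factor

supplies_acre_feet = {
    "groundwater permits": 5_300,
    "on-site treatment facility": 2_000,
    "Truckee River": 1_000,
    "effluent pipeline": 4_000,
}

total_af = sum(supplies_acre_feet.values())      # 12,300 acre-feet
total_gallons = total_af * GALLONS_PER_ACRE_FOOT

print(f"{total_gallons / 1e9:.2f} billion gallons per year")  # 4.01
```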
The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services.

“It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.”

Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake. EMILY NAJERA

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated. More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.

Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline to our system is good for water quality in the river,” she says.
“I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.” During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”

Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center. GOOGLE

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system.
Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy. The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.

Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region. “While the land purchase is public knowledge, we have not disclosed specific details about our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center. EMILY NAJERA

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities.
Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling. Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.

But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year. Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gases produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.
Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.

“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.
Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters. The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely.
“We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.” “We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”
    #data #center #boom #desert
    The data center boom in the desert
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.

This story is part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.

Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet. EMILY NAJERA

The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects. But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year.
“The last number I heard was 13 million square feet under construction right now, which is massive.” Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade. That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe. It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.

Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center. EMILY NAJERA

The build-out of a dense cluster of energy- and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.
“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah. “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”

Luring data centers

In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills. He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner. In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped clinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt. On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.
Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants. GREGG SEGAL

After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space. Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend millions to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park. When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part. “We are high in seismic activity,” he says. “But everything else is good.
We’re not going to have a tornado or flood or a devastating wildfire.” Then there are the generous tax policies. In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities. Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.

The rain shadow

The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes. But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake. Along the way, an engineered system of reservoirs, canals, and treatment plants diverts, stores, and releases water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.

The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada. EMILY NAJERA

Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought. About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground. It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.
“In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California. These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.

Thirsty data centers

Data centers suck up water in two main ways. As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates.
Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration. Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research. But here too, the water usage varies depending on the type of geothermal plant in question.
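The scale of these indirect estimates can be sanity-checked with simple arithmetic. A minimal sketch, under our own simplifying assumption that the full requested capacity runs around the clock (the study’s actual methodology is more involved):

```python
# Rough scale check on the indirect-water estimate (illustrative only:
# assumes, hypothetically, that the "nearly six gigawatts" of requested
# capacity runs continuously for a year).
GW_REQUESTED = 6
HOURS_PER_YEAR = 8760

mwh_per_year = GW_REQUESTED * 1_000 * HOURS_PER_YEAR  # 52,560,000 MWh

# The 15.5-billion-gallon figure then implies a grid-average water
# intensity of roughly 295 gallons per megawatt-hour -- well below the
# 2,803 gal/MWh withdrawal average the EIA reports for gas plants
# (withdrawal and consumption are different measures).
implied_gal_per_mwh = 15.5e9 / mwh_per_year
print(mwh_per_year, round(implied_gal_per_mwh))
```

The point of the exercise is only that the headline numbers are mutually consistent; the true figures depend on utilization, cooling design, and the actual generation mix.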
Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do. Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power, and installing another gigawatt of battery storage. It’s also forging ahead with a multibillion-dollar transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, whether the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.

NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center. EMILY NAJERA

“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement. An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds.
Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”

Securing supplies

On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction. We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip. Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants. EMILY NAJERA

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.
Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.  But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company. As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.  Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise. Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.  
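The “more than 4 billion gallons” total follows directly from the district’s listed acre-feet figures; a quick conversion (one acre-foot is about 325,851 gallons) reproduces it:

```python
# Tallying the general improvement district's listed water supplies
# (acre-feet per year, from its 2020 service plan) and converting to
# gallons, using the standard 1 acre-foot ~= 325,851 gallons.
GALLONS_PER_ACRE_FOOT = 325_851

supplies_acre_feet = {
    "groundwater permits": 5_300,
    "on-site treatment facility": 2_000,
    "Truckee River": 1_000,
    "effluent pipeline": 4_000,
}

total_acre_feet = sum(supplies_acre_feet.values())       # 12,300 acre-feet
total_gallons = total_acre_feet * GALLONS_PER_ACRE_FOOT  # just over 4 billion
print(total_acre_feet, total_gallons)
```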
When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”

Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish. The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services. “It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are.
It’s part of our spiritual well-being.”

Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake. EMILY NAJERA

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre-feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated. More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre-feet of water to the river. Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there.
He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.” During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.” Gilman also stressed that data-center companies have gotten more water-efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center. GOOGLE

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but consumed only 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy. The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030.
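The distinction between withdrawal and consumption is worth making concrete. With the figures cited from the report, roughly nine-tenths of what the facility withdraws cycles back:

```python
# Withdrawal vs. consumption at Google's air-cooled Storey County
# facility, per the 2023 figures cited from its environmental report.
withdrawn_gallons = 1_900_000
consumed_gallons = 200_000  # evaporated or otherwise not returned

returned_gallons = withdrawn_gallons - consumed_gallons
consumed_share = consumed_gallons / withdrawn_gallons  # about 10.5%
print(returned_gallons, round(consumed_share * 100, 1))
```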
Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks. Microsoft strongly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region. “While the land purchase is public knowledge, we have not disclosed specific details about our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center. EMILY NAJERA

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared with its data centers that rely on evaporative water cooling. Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling. But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline.
In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year. Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gases produced through their power use simply because the emissions occur outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says. Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.
“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?” The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center. Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.” During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers.
At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.” Where the river ends In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.  The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium-carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name. A lone angler stands along the shores of Pyramid Lake. In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely. “We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. 
He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.” “We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.” #data #center #boom #desert
    WWW.TECHNOLOGYREVIEW.COM
    The data center boom in the desert
In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.

This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.

Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.

Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet. EMILY NAJERA

The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.

But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year.
“The last number I heard was 13 million square feet under construction right now, which is massive.” Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.

That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe. It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.

Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center. EMILY NAJERA

The build-out of a dense cluster of energy- and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils.
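The scale comparisons above are easy to sanity-check with back-of-envelope arithmetic. A minimal sketch, assuming roughly 2.77 million square feet of floor space for the Empire State Building (a commonly cited figure, not stated in the article):

```python
# Rough sanity checks on the article's scale comparisons.
ESB_FLOOR_AREA_SQFT = 2.77e6      # assumed Empire State Building floor area
under_construction_sqft = 13e6    # "13 million square feet under construction"

print(under_construction_sqft / ESB_FLOOR_AREA_SQFT)  # ~4.7, i.e. "almost five"

requested_gw = 6.0                # capacity requested by the dozen projects
# If 6 GW amounts to a ~40% expansion, the implied existing power sector is:
print(requested_gw / 0.40)        # 15.0 GW
```

Both checks land close to the article's round numbers, which is about all a comparison like "almost five Empire State Buildings" is meant to convey.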
“Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah. “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.”

Luring data centers

In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.

He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner.

In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt.

On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center.
Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants. GREGG SEGAL

After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.

Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend $400 million to expand the data center campus along with another one in Las Vegas.

Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park (although several lots are available for resale following the failed gamble of one crypto tenant).

When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part. “We are high in seismic activity,” he says. “But everything else is good.
We’re not going to have a tornado or flood or a devastating wildfire.” Then there are the generous tax policies.

In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park.

Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities.

Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development.

The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains.

The rain shadow

The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.

But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.

Along the way, an engineered system of reservoirs, canals, and treatment plants diverts, stores, and releases water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.

The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada. EMILY NAJERA

Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.

About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.

It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts. In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.
“In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.

These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.

Thirsty data centers

Data centers suck up water in two main ways. As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat.

To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. (The research has since been peer-reviewed and is awaiting publication.)
What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. (“Consumed” here means the water is evaporated, not merely withdrawn and returned to the engineered water system.) The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration.  Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research.  But here too, the water usage varies depending on the type of geothermal plant in question. 
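The indirect estimate follows from a simple chain of multiplications: annual energy drawn, times the grid's average water consumption per unit of electricity. A sketch of that arithmetic, with an assumed grid-average intensity chosen for illustration (the study's exact inputs aren't given in the article):

```python
# Sketch of the indirect-water estimate: a year's energy use times the
# grid's average water consumption per MWh. The intensity value below is
# an assumption for illustration, not a figure from the study.
requested_mw = 6_000          # ~6 GW requested in NV Energy's filings
hours_per_year = 8_760
utilization = 1.0             # upper bound: facilities running flat out
gal_per_mwh = 295             # assumed grid-average water consumption

energy_mwh = requested_mw * hours_per_year * utilization
indirect_gallons = energy_mwh * gal_per_mwh
print(f"{indirect_gallons / 1e9:.1f} billion gallons/year")  # 15.5 at these inputs
```

At full utilization and roughly 300 gallons consumed per megawatt-hour, the result lands in the ballpark of the 15.5-billion-gallon upper figure cited above; lower utilization or a less water-intensive power mix shrinks it proportionally.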
Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface. The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do.

Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It’s also forging ahead with a more than $4 billion transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume.

NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center. EMILY NAJERA

“NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement.

An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources. “You end up with the water-intensive resources looking more important,” she adds.
Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.”

Securing supplies

On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center. “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction. We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.

Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021.

“Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.”

The reservoir within the industrial business park provides water to data centers and other tenants. EMILY NAJERA

But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.
Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.  But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company. As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.  Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise. Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.  
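Those permitted supplies convert cleanly from acre-feet to gallons, which is where the "more than 4 billion gallons" total comes from. A quick check (source labels paraphrased from the service plan figures above):

```python
# Converting the general improvement district's permitted supplies from
# acre-feet to gallons (1 acre-foot = 325,851 gallons).
GALLONS_PER_ACRE_FOOT = 325_851

sources_acre_feet = {
    "groundwater well fields": 5_300,
    "on-site treatment facility": 2_000,
    "Truckee River": 1_000,
    "effluent pipeline": 4_000,
}
total_af = sum(sources_acre_feet.values())           # 12,300 acre-feet
total_gallons = total_af * GALLONS_PER_ACRE_FOOT
print(f"{total_gallons / 1e9:.2f} billion gallons")  # 4.01 — "more than 4 billion"

# The same conversion checks Tract's separately reported 1,100 acre-feet:
print(f"{1_100 * GALLONS_PER_ACRE_FOOT / 1e6:.0f} million gallons")  # ~358
```

Both results match the article's rounded figures, including Tract's "nearly 360 million gallons a year" mentioned later.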
When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.”

Water

As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business.

More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish.

The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services.

“It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are.
It’s part of our spiritual well-being.”

Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake. EMILY NAJERA

In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.

More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.

Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline from [the Truckee Meadows Water Authority] to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.”

Water efficiency

In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there.
He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”

During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas. “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.”

Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.”

An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center. GOOGLE

Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.

The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030.
Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.

Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region. “While the land purchase is public knowledge, we have not disclosed specific details [of] our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.

Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center. EMILY NAJERA

Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling.

Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling. But some of the companies seem to have taken steps to ensure access to significant amounts of water.
Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year. Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds.

That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says. Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift.
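Ren's point about air cooling can be framed as a simple inequality: per unit of computing load, air cooling avoids some on-site evaporation but adds indirect water use at the power plant. A hedged sketch, with all numbers hypothetical and chosen only to show how the sign can flip with the grid mix and cooling design:

```python
# Net water change from switching to air cooling, per MWh of baseline load:
# extra indirect water at the power plant minus on-site evaporation avoided.
def net_water_change(extra_energy_frac, grid_gal_per_mwh, onsite_gal_per_mwh):
    return extra_energy_frac * grid_gal_per_mwh - onsite_gal_per_mwh

# ~10% more electricity on a water-intensive grid, modest evaporative baseline:
print(net_water_change(0.10, 2_000, 150))  # positive: air cooling costs more water

# Same overhead on a low-water grid, heavy evaporative baseline:
print(net_water_change(0.10, 300, 500))    # negative: air cooling saves water
```

Either outcome is possible, which is why Ren argues the comparison has to include where and how the extra electricity is generated, not just the meter at the data center.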
Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term.

“If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.

Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says.
“The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project on a 5-2 vote.

“Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.

The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake: large formations of porous rock made of calcium carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely.

“We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers.
But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert.

Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect its rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”