• AI could keep us dependent on natural gas for decades to come

    The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate.
    This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution.
    So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution!
    The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity.
    To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24/7 electricity demand from the huge data center.
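    For a rough sense of scale, the sketch below converts that steady two-gigawatt computational load into annual energy and a household equivalent. It is an illustrative back-of-the-envelope calculation, not a figure from Entergy’s filing or Meta’s announcement, and the average-household consumption it assumes (about 10,500 kWh per year, roughly in line with recent US averages) is an outside estimate.

```python
# Back-of-the-envelope: what a steady 2 GW computational load implies.
# Assumptions (not from Entergy's filing): the load runs flat all year, and an
# average US household uses roughly 10,500 kWh of electricity per year.
LOAD_GW = 2.0
HOURS_PER_YEAR = 8760
AVG_HOUSEHOLD_KWH_PER_YEAR = 10_500  # assumed, for illustration only

annual_twh = LOAD_GW * HOURS_PER_YEAR / 1_000              # GW x hours -> GWh -> TWh
households = annual_twh * 1e9 / AVG_HOUSEHOLD_KWH_PER_YEAR  # 1 TWh = 1e9 kWh

print(f"Annual energy: {annual_twh:.1f} TWh")                       # ~17.5 TWh
print(f"Household equivalent: ~{households / 1e6:.1f} million homes")  # ~1.7 million
```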
    Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments would be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works, issued a letter to Meta that called out the company’s plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions “by funding carbon capture and a solar project are vague and offer little reassurance.”
    The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon.
    The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.”
    The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina, and Georgia—is coming from data centers; in those three states, data centers account for around 65 to 85% of projected load growth.
    “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.”
    But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands.
    The reign of natural gas
    Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for $13 per million Btu (a measure of thermal energy); last year, it averaged just $2.21, the lowest annual price (adjusting for inflation) ever reported, according to the US Energy Information Administration (EIA).

    Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit.
    Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.
    But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify.
    Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of the fossil fuel gradually winds down.
    Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.
    But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up?
    Times of stress
    AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible.
Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US. 
    Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave.
    The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke’s Nicholas School of the Environment, and his colleagues call “headroom” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity.
    Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it would allow utilities to handle some 76 additional gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029.
    “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress.
    The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.”
    AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies, including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.”
    Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says.
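    The arithmetic behind such headroom estimates is, at bottom, peak shaving: if combined demand exceeds existing capacity only during a handful of hours each year, curtailing a flexible data center in exactly those hours can keep the system within its limits. The sketch below illustrates that logic with invented numbers; it is not the Duke team’s data or methodology.

```python
# Toy illustration of curtailment-enabled "headroom" -- invented numbers,
# not the Duke study's data or methodology.
import numpy as np

rng = np.random.default_rng(0)
HOURS = 8760
CAPACITY_GW = 100.0    # assumed existing generation capacity
DATACENTER_GW = 8.0    # hypothetical new AI data-center load, nominally constant

# Synthetic baseline demand: seasonal/daily swings between roughly 60 and 80 GW,
# plus ~80 "stress" hours (heat waves, winter storms) that spike close to capacity.
baseline = 70 + 10 * np.sin(np.linspace(0, 20 * np.pi, HOURS))
baseline[rng.choice(HOURS, size=80, replace=False)] = 95.0

combined = baseline + DATACENTER_GW   # data center runs flat out, all year
stress = combined > CAPACITY_GW       # hours the grid could not cover

# Flexible case: the data center halves its draw during exactly those hours.
flexible = np.where(stress, baseline + 0.5 * DATACENTER_GW, combined)

print(f"Stress hours per year: {int(stress.sum())}")                           # 80
print(f"Peak load, inflexible data center: {combined.max():.0f} GW")           # 103 GW
print(f"Peak load, 50% curtailment in stress hours: {flexible.max():.0f} GW")  # 99 GW
```

Under these made-up numbers, a load that would otherwise push the peak past the assumed 100-gigawatt capacity fits within it once the data center sheds half its draw for the 80 stress hours.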
    Footing the bill
    Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants.
    “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation.
    In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and its need for full 24/7 power leave no alternative to natural gas. In the application it filed last fall with the LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much.
    Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline.
    “Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”
    In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate.
    The February motion also raised concern over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants, but it didn’t respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems.
    Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity.
    In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.”
    The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies.
    Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana’s residents will have to live with—and possibly pay for—the changes in the decades to come.
    #could #keep #dependent #natural #gas
    AI could keep us dependent on natural gas for decades to come
    The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturersand after that to other industries that might want to occupy more than a thousand acres just off the interstate. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution! The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone. When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity. To power the data center, Entergy aims to spend billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center. Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works issued a letter to Meta that called out the company's plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions "by funding carbon capture and a solar project are vague and offer little reassurance.” The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. 
While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon. The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost, and you know how to scale it and get it approved,” says Victor. “Even forcompanies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.” The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina and Georgia—is coming from data centers; in those 3 states data centers account for around 65 to 85% of projected load growth. “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.” But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands. The reign of natural gas Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for per million Btu; last year, it averaged just the lowest annual priceever reported, according to the US Energy Information Administration. Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit. 
Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.   But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify. Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035, relative to a future in which the use of fossil fuel gradually winds down. Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestrationat power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.   But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up? Times of stress AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US.  
Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave. The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke's Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models, can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI, a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. “The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. 
But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says. Footing the bill Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas.  In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much. Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is sayingneeds around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to takeword for it. Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.”       In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but "as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. 
Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants but didn't respond to requests by MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over million to support the Richland Parish data centers with new infrastructure, including roads and water systems.  Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of country’s most unreliable grids. Of special concern is what happens after the 15 years. “Our biggest long-term concern is that in 15 years, residential ratepayerssmall businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reasons to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tellwhat they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers that provide funding that utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. Huge new AI data centers like the one in Richland Parish could in fact be a huge economic boon by providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come. #could #keep #dependent #natural #gas
    WWW.TECHNOLOGYREVIEW.COM
    AI could keep us dependent on natural gas for decades to come
    The thousands of sprawling acres in rural northeast Louisiana had gone unwanted for nearly two decades. Louisiana authorities bought the land in Richland Parish in 2006 to promote economic development in one of the poorest regions in the state. For years, they marketed the former agricultural fields as the Franklin Farm mega site, first to auto manufacturers (no takers) and after that to other industries that might want to occupy more than a thousand acres just off the interstate. This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. So it’s no wonder that state and local politicians were exuberant when Meta showed up. In December, the company announced plans to build a massive $10 billion data center for training its artificial-intelligence models at the site, with operations to begin in 2028. “A game changer,” declared Governor Jeff Landry, citing 5,000 construction jobs and 500 jobs at the data center that are expected to be created and calling it the largest private capital investment in the state’s history. From a rural backwater to the heart of the booming AI revolution! The AI data center also promises to transform the state’s energy future. Stretching in length for more than a mile, it will be Meta’s largest in the world, and it will have an enormous appetite for electricity, requiring two gigawatts for computation alone (the electricity for cooling and other building needs will add to that). When it’s up and running, it will be the equivalent of suddenly adding a decent-size city to the region’s grid—one that never sleeps and needs a steady, uninterrupted flow of electricity. To power the data center, Entergy aims to spend $3.2 billion to build three large natural-gas power plants with a total capacity of 2.3 gigawatts and upgrade the grid to accommodate the huge jump in anticipated demand. In its filing to the state’s power regulatory agency, Entergy acknowledged that natural-gas plants “emit significant amounts of CO2” but said the energy source was the only affordable choice given the need to quickly meet the 24-7 electricity demand from the huge data center. Meta said it will work with Entergy to eventually bring online at least 1.5 gigawatts of new renewables, including solar, but that it had not yet decided which specific projects to fund or when those investments will be made. Meanwhile, the new natural-gas plants, which are scheduled to be up and running starting in 2028 and will have a typical lifetime of around 30 years, will further lock in the state’s commitment to the fossil fuel. The development has sparked interest from the US Congress; last week, Sheldon Whitehouse, the ranking member of the Senate Committee on Environment and Public Works issued a letter to Meta that called out the company's plan to power its data center with “new and unabated natural gas generation” and said its promises to offset the resulting emissions "by funding carbon capture and a solar project are vague and offer little reassurance.” The choice of natural gas as the go-to solution to meet the growing demand for power from AI is not unique to Louisiana. The fossil fuel is already the country’s chief source of electricity generation, and large natural-gas plants are being built around the country to feed electricity to new and planned AI data centers. 
While some climate advocates have hoped that cleaner renewable power would soon overtake it, the booming power demand from data centers is all but wiping out any prospect that the US will wean itself off natural gas anytime soon. The reality on the ground is that natural gas is “the default” to meet the exploding power demand from AI data centers, says David Victor, a political scientist at the University of California, San Diego, and co-director of its Deep Decarbonization Project. “The natural-gas plant is the thing that you know how to build, you know what it’s going to cost (more or less), and you know how to scale it and get it approved,” says Victor. “Even for [AI] companies that want to have low emissions profiles and who are big pushers of low or zero carbon, they won’t have a choice but to use gas.” The preference for natural gas is particularly pronounced in the American South, where plans for multiple large gas-fired plants are in the works in states such as Virginia, North Carolina, South Carolina, and Georgia. Utilities in those states alone are planning some 20 gigawatts of new natural-gas power plants over the next 15 years, according to a recent report. And much of the new demand—particularly in Virginia, South Carolina and Georgia—is coming from data centers; in those 3 states data centers account for around 65 to 85% of projected load growth. “It’s a long-term commitment in absolutely the wrong direction,” says Greg Buppert, a senior attorney at the Southern Environmental Law Center in Charlottesville, Virginia. If all the proposed gas plants get built in the South over the next 15 years, he says, “we’ll just have to accept that we won’t meet emissions reduction goals.” But even as it looks more and more likely that natural gas will remain a sizable part of our energy future, questions abound over just what its continued dominance will look like. For one thing, no one is sure exactly how much electricity AI data centers will need in the future and how large an appetite companies will have for natural gas. Demand for AI could fizzle. Or AI companies could make a concerted effort to shift to renewable energy or nuclear power. Such possibilities mean that the US could be on a path to overbuild natural-gas capacity, which would leave regions saddled with unneeded and polluting fossil-fuel dinosaurs—and residents footing soaring electricity bills to pay off today’s investments. The good news is that such risks could likely be managed over the next few years, if—and it’s a big if—AI companies are more transparent about how flexible they can be in their seemingly insatiable energy demands. The reign of natural gas Natural gas in the US is cheap and abundant these days. Two decades ago, huge reserves were found in shale deposits scattered across the country. In 2008, as fracking started to make it possible to extract large quantities of the gas from shale, natural gas was selling for $13 per million Btu (a measure of thermal energy); last year, it averaged just $2.21, the lowest annual price (adjusting for inflation) ever reported, according to the US Energy Information Administration (EIA). Around 2016, natural gas overtook coal as the main fuel for electricity generation in the US. And today—despite the rapid rise of solar and wind power, and well-deserved enthusiasm for the falling price of such renewables—natural gas is still king, accounting for around 40% of electricity generated in the US. 
In Louisiana, which is also a big producer, that share is some 72%, according to a recent audit. Natural gas burns much cleaner than coal, producing roughly half as much carbon dioxide. In the early days of the gas revolution, many environmental activists and progressive politicians touted it as a valuable “bridge” to renewables and other sources of clean energy. And by some calculations, natural gas has fulfilled that promise. The power sector has been one of the few success stories in lowering US emissions, thanks to its use of natural gas as a replacement for coal.   But natural gas still produces a lot of carbon dioxide when it is burned in conventionally equipped power plants. And fracking causes local air and water pollution. Perhaps most worrisome, drilling and pipelines are releasing substantial amounts of methane, the main ingredient in natural gas, both accidentally and by intentional venting. Methane is a far more potent greenhouse gas than carbon dioxide, and the emissions are a growing concern to climate scientists, albeit one that’s difficult to quantify. Still, carbon emissions from the power sector will likely continue to drop as coal is further squeezed out and more renewables get built, according to the Rhodium Group, a research consultancy. But Rhodium also projects that if electricity demand from data centers remains high and natural-gas prices low, the fossil fuel will remain the dominant source of power generation at least through 2035 and the transition to cleaner electricity will be much delayed. Rhodium estimates that the continued reign of natural gas will lead to an additional 278 million metric tons of annual US carbon emissions by 2035 (roughly equivalent to the emissions from a large US state such as Florida), relative to a future in which the use of fossil fuel gradually winds down. Our addiction to natural gas, however, doesn’t have to be a total climate disaster, at least over the longer term. Large AI companies could use their vast leverage to insist that utilities install carbon capture and sequestration (CCS) at power plants and use natural gas sourced with limited methane emissions. Entergy, for one, says its new gas turbines will be able to incorporate CCS through future upgrades. And Meta says it will help to fund the installation of CCS equipment at one of Entergy’s existing natural-gas power plants in southern Louisiana to help prove out the technology.   But the transition to clean natural gas is a hope that will take decades to realize. Meanwhile, utilities across the country are facing a more imminent and practical challenge: how to meet the sudden demand for gigawatts more power in the next few years without inadvertently building far too much capacity. For many, adding more natural-gas power plants might seem like the safe bet. But what if the explosion in AI demand doesn’t show up? Times of stress AI companies tout the need for massive, power-hungry data centers. But estimates for just how much energy it will actually take to train and run AI models vary wildly. And the technology keeps changing, sometimes seemingly overnight. DeepSeek, the new Chinese model that debuted in January, may or may not signal a future of new energy-efficient AI, but it certainly raises the possibility that such advances are possible. Maybe we will find ways to use far more energy-efficient hardware. Or maybe the AI revolution will peter out and many of the massive data centers that companies think they’ll need will never get built. 
There are already signs that too many have been constructed in China and clues that it might be beginning to happen in the US.  Despite the uncertainty, power providers have the task of drawing up long-term plans for investments to accommodate projected demand. Too little capacity and their customers face blackouts; too much and those customers face outsize electricity bills to fund investments in unneeded power. There could be a way to lessen the risk of overbuilding natural-gas power, however. Plenty of power is available on average around the country and on most regional grids. Most utilities typically use only about 53% of their available capacity on average during the year, according to a Duke study. The problem is that utilities must be prepared for the few hours when demand spikes—say, because of severe winter weather or a summer heat wave. The soaring demand from AI data centers is prompting many power providers to plan new capacity to make sure they have plenty of what Tyler Norris, a fellow at Duke's Nicholas School of the Environment, and his colleagues call “headroom,” to meet any spikes in demand. But after analyzing data from power systems across the country, Norris and his coauthors found that if large AI facilities cut back their electricity use during hours of peak demand, many regional power grids could accommodate those AI customers without adding new generation capacity. Even a moderate level of flexibility would make a huge difference. The Duke researchers estimate that if data centers cut their electricity use by roughly half for just a few hours during the year, it will allow utilities to handle some additional 76 gigawatts of new demand. That means power providers could effectively absorb the 65 or so additional gigawatts that, according to some predictions, data centers will likely need by 2029. “The prevailing assumption is that data centers are 100% inflexible,” says Norris. That is, that they need to run at full power all the time. But Norris says AI data centers, particularly ones that are training large foundation models (such as Meta’s facility in Richland Parish), can avoid running at full capacity or shift their computation loads to other data centers around the country—or even ramp up their own backup power—during times when a grid is under stress. The increased flexibility could allow companies to get AI data centers up and running faster, without waiting for new power plants and upgrades to transmission lines—which can take years to get approved and built. It could also, Norris noted in testimony to the US Congress in early March, provide at least a short-term reprieve on the rush to build more natural-gas power, buying time for utilities to develop and plan for cleaner technologies such as advanced nuclear and enhanced geothermal. It could, he testified, prevent “a hasty overbuild of natural-gas infrastructure.” AI companies have expressed some interest in their ability to shift around demand for power. But there are still plenty of technology questions around how to make it happen. Late last year, EPRI (the Electric Power Research Institute), a nonprofit R&D group, started a three-year collaboration with power providers, grid operators, and AI companies including Meta and Google, to figure it out. 
“The potential is very large,” says David Porter, the EPRI vice president who runs the project, but we must show it works “beyond just something on a piece of paper or a computer screen.” Porter estimates that there are typically 80 to 90 hours a year when a local grid is under stress and it would help for a data center to reduce its energy use. But, he says, AI data centers still need to figure out how to throttle back at those times, and grid operators need to learn how to suddenly subtract and then add back hundreds of megawatts of electricity without disrupting their systems. “There’s still a lot of work to be done so that it’s seamless for the continuous operation of the data centers and seamless for the continuous operation of the grid,” he says. Footing the bill Ultimately, getting AI data centers to be more flexible in their power demands will require more than a technological fix. It will require a shift in how AI companies work with utilities and local communities, providing them with more information and insights into actual electricity needs. And it will take aggressive regulators to make sure utilities are rigorously evaluating the power requirements of data centers rather than just reflexively building more natural-gas plants. “The most important climate policymakers in the country right now are not in Washington. They’re in state capitals, and these are public utility commissioners,” says Costa Samaras, the director of Carnegie Mellon University’s Scott Institute for Energy Innovation. In Louisiana, those policymakers are the elected officials at the Louisiana Public Service Commission, who are expected to rule later this year on Entergy’s proposed new gas plants and grid upgrades. The LPSC commissioners will decide whether Entergy’s arguments about the huge energy requirements of Meta’s data center and need for full 24/7 power leave no alternative to natural gas.  In the application it filed last fall with LPSC, Entergy said natural-gas power was essential for it to meet demand “throughout the day and night.” Teaming up solar power with battery storage could work “in theory” but would be “prohibitively costly.” Entergy also ruled out nuclear, saying it would take too long and cost too much. Others are not satisfied with the utility’s judgment. In February, the New Orleans–based Alliance for Affordable Energy and the Union of Concerned Scientists filed a motion with the Louisiana regulators arguing that Entergy did not do a rigorous market evaluation of its options, as required by the commission’s rules. Part of the problem, the groups said, is that Entergy relied on “unsubstantiated assertions” from Meta on its load needs and timeline. “Entergy is saying [Meta] needs around-the-clock power,” says Paul Arbaje, an analyst for the climate and energy program at the Union of Concerned Scientists. “But we’re just being asked to take [Entergy’s] word for it. 
Regulators need to be asking tough questions and not just assume that these data centers need to be operated at essentially full capacity all the time.” And, he suggests, if the utility had “started to poke holes at the assumptions that are sometimes taken as a given,” it “would have found other cleaner options.” In an email response to MIT Technology Review, Entergy said that it has discussed the operational aspects of the facility with Meta, but “as with all customers, Entergy Louisiana will not discuss sensitive matters on behalf of their customers.” In a letter filed with the state’s regulators in early April, Meta said Entergy’s understanding of its energy needs is, in fact, accurate. The February motion also raised concern over who will end up paying for the new gas plants. Entergy says Meta has signed a 15-year supply contract for the electricity that is meant to help cover the costs of building and running the power plants. The utility didn’t respond to requests from MIT Technology Review for further details of the deal, including what happens if Meta wants to terminate the contract early. Meta referred MIT Technology Review’s questions about the contract to Entergy but says its policy is to cover the full cost that utilities incur to serve its data centers, including grid upgrades. It also says it is spending over $200 million to support the Richland Parish data centers with new infrastructure, including roads and water systems. Not everyone is convinced. The Alliance for Affordable Energy, which works on behalf of Louisiana residents, says that the large investments in new gas turbines could mean future rate hikes, in a state where residents already have high electricity bills and suffer from one of the country’s most unreliable grids. Of special concern is what happens after the 15 years are up. “Our biggest long-term concern is that in 15 years, residential ratepayers [and] small businesses in Louisiana will be left holding the bag for three large gas generators,” says Logan Burke, the alliance’s executive director. Indeed, consumers across the country have good reason to fear that their electricity bills will go up as utilities look to meet the increased demand from AI data centers by building new generation capacity. In a paper posted in March, researchers at Harvard Law School argued that utilities “are now forcing the public to pay for infrastructure designed to supply a handful of exceedingly wealthy corporations.” The Harvard authors write, “Utilities tell [public utility commissions] what they want to hear: that the deals for Big Tech isolate data center energy costs from other ratepayers’ bills and won’t increase consumers’ power prices.” But the complexity of the utilities’ payment data and the lack of transparency in the accounting, they say, make verifying this claim “all but impossible.” The boom in AI data centers is making Big Tech a player in our energy infrastructure and electricity future in a way unimaginable just a few years ago. At their best, AI companies could greatly facilitate the move to cleaner energy by acting as reliable and well-paying customers whose funding utilities can use to invest in a more robust and flexible electricity grid. This change can happen without burdening other electricity customers with additional risks and costs. But it will take AI companies committed to that vision. And it will take state regulators who ask tough questions and don’t get carried away by the potential investments being dangled by AI companies. 
Huge new AI data centers like the one in Richland Parish could in fact be a significant economic boon, providing new jobs, but residents deserve transparency and input into the negotiations. This is, after all, public infrastructure. Meta may come and go, but Louisiana's residents will have to live with—and possibly pay for—the changes in the decades to come.
  • How To Build Stylized Water Shader: Design & Implementation For Nimue

    Image: Nimue
    Introduction
    For three semesters, our student team has been hard at work on the prototype for Nimue, a 3D platformer in which you play an enchanted princess who lost her memories. She needs to find her way through the castle ruins on a misty lake to uncover her past. Water is a visual core element of this game prototype, so we took extra care in its development. In this article, we will take an in-depth look at the design and technical implementation of a lake water material. The first prototype of Nimue will soon be playable on itch.io. A link to our shader for use in your own projects can be found at the end of this article.
    Taxonomy of Water
    Before we dive into the design decisions and technical implementation, we present a simplified taxonomy of visual water components to better understand the requirements of its representation:
    Image: RiME
    Wind Waves
    Waves generated by wind, which form on an open water surface, can be divided into capillary waves and gravity waves. Capillary waves, or ripples, are small, short-wavelength waves caused by weak winds affecting surface tension in calm water. They can overlap longer and larger gravity waves. How these physically complex wave types are represented in stylized video games varies depending on the respective style. Both types are usually heavily simplified in form and motion, and capillary waves are sometimes omitted entirely to reduce detail.
    Image: Sea of Thieves
    Foam Patterns
    Foam patterns refer to white foam crests that form on a water surface without breaking against an obstacle or shoreline. In reality, this effect occurs when different water layers collide, and waves become steeper until their peaks collapse, and the resulting bubbles and drops scatter the sunlight. Stylized foam patterns can be found in many video game water representations and can easily be abstracted into patterns. Such patterns contribute to a cartoon look and can sometimes even replace waveforms entirely.
    Image: The Legend of Zelda: The Wind Waker
    Foam Lines
    Foam lines are a very common water element in video games, represented as white graphical lines surrounding shorelines and obstacles like rocks. They typically reference two different water phenomena: foam forming around obstacles due to wave breaking, and foam along shorelines, resulting from wave breaking and the mixing of algae with organic and artificial substances. Foam lines can have different visual appearances depending on the surface angle: the shallower the angle, the wider the foam effect. Due to their weaker waves, distinctive foam lines are rarely observed on natural lakes, but they can be included in a stylization for aesthetic purposes.
    Image: Animal Crossing: New Horizons
    Reflections
    When light hits a water surface, it can either be reflected (specular reflection) or transmitted into the water, where it may be absorbed, scattered, or reflected back through the surface. The Fresnel effect describes the perceived balance between reflection and transmission: at steep angles, more transmitted light reaches the eye, making the water appear more translucent, while at shallow angles, increased reflection makes it appear more opaque.
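    A quick way to put numbers on that balance is Schlick's approximation of the Fresnel term. The snippet below is a minimal sketch of that standard formula, not code from the Nimue material; the F0 value is the usual textbook figure for an air-water interface.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>

    // Schlick's approximation of Fresnel reflectance for an air-water interface.
    // F0 is the reflectance at normal incidence (about 0.02 for water); cosTheta is
    // the cosine of the angle between the view direction and the surface normal.
    double SchlickFresnel(double cosTheta, double F0 = 0.02) {
        cosTheta = std::clamp(cosTheta, 0.0, 1.0);
        return F0 + (1.0 - F0) * std::pow(1.0 - cosTheta, 5.0);
    }

    int main() {
        // Looking straight down (steep angle): mostly transmission, water reads as translucent.
        std::printf("top-down reflectance: %.3f\n", SchlickFresnel(1.0));
        // Grazing view (shallow angle): reflectance climbs toward 1, water reads as opaque.
        std::printf("grazing reflectance:  %.3f\n", SchlickFresnel(0.05));
    }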
    In stylized video games, implementations of water reflections vary: RiME, for example, imitates the Fresnel effect but does not reflect the environment at all, only a simple, otherwise invisible cube map. Wind Waker, on the other hand, completely foregoes reflection calculations and renders a flat-shaded water surface.
    Image: RiME
    Translucency
    As an inhomogeneous medium, water scatters some of the transmitted light before it can be reflected back to the surface. This is why water is described as translucent rather than transparent. Some scattered light is not reflected back but absorbed, reducing intensity and shifting color toward the least absorbed wavelengths, typically blue, blue-green, or turquoise. Increased distance amplifies scattering and absorption, altering color perception. Modern real-time engines simulate these effects, including absorption-based color variation with depth. However, stylized games often simplify or omit transmission entirely, rendering water as an opaque surface.
    Image: RiME
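    The depth-dependent color shift described above follows a simple exponential falloff per channel (Beer-Lambert absorption). The sketch below uses made-up absorption coefficients purely for illustration; it is not the model or the values used in Nimue.

    #include <cmath>
    #include <cstdio>

    struct Color { float r, g, b; };

    // Beer-Lambert attenuation: each channel decays exponentially with the distance
    // light travels through the water. Red is absorbed fastest, which is why deeper
    // water drifts toward blue-green. The coefficients are illustrative, not measured.
    Color Absorb(Color light, float depthMeters) {
        const float kR = 0.45f, kG = 0.09f, kB = 0.04f; // absorption per meter (assumed)
        return { light.r * std::exp(-kR * depthMeters),
                 light.g * std::exp(-kG * depthMeters),
                 light.b * std::exp(-kB * depthMeters) };
    }

    int main() {
        const Color white = { 1.0f, 1.0f, 1.0f };
        const float depths[] = { 0.5f, 2.0f, 10.0f };
        for (float d : depths) {
            Color c = Absorb(white, d);
            std::printf("depth %4.1f m -> (%.2f, %.2f, %.2f)\n", d, c.r, c.g, c.b);
        }
    }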
    Refraction
    An additional aspect of water transmission is refraction, the bending of light as it transitions between air and water due to their differing densities. This causes light to bend toward the normal upon entering the water, creating the apparent distortion of submerged objects. Refraction effects also commonly appear in stylized water rendering. Kirby and the Forgotten Land, for example, showcases two key visual characteristics of refraction: distortion increases with steeper viewing angles and is amplified by ripples on the water's surface.
    Image: Kirby and the Forgotten Land
    Caustics
    Caustic patterns form when light rays are focused by a curved water surface (caused by waves and ripples), projecting bundled light patterns onto underwater surfaces or even back to surfaces above water. These patterns are influenced by the clarity of the water, the depth of the water, and the strength of the light source. They contribute greatly to the atmosphere of virtual worlds and are often found in stylized games, although only as simplistic representations.
    Image: The Legend of Zelda: Ocarina of Time 3D
    Design Decisions
    Because the setting of Nimue is a lake with a calm overall atmosphere, we decided to use very reduced gravity waves, as a calm water surface underlines this atmosphere. Capillary waves have too high a level of detail for the stylistic requirements of Nimue and were, therefore, not implemented.
    Image: Nimue
    Shapes
    The mood in Nimue can be summarized as calm and mystical. The design language of Nimue is graphic, rounded, and elegant. Shapes are vertically elongated and highly abstracted. Convex corners are always rounded or have a strong bevel, while concave corners are pointed to prevent the overall mass of objects from becoming too rounded.
    Colors
    Nimue uses mostly broken colors and pastels to create a serene, reflective mood and highlight the player's character with her saturated blue tones. Platforms and obstacles are depicted with a lower tonal value (darker) to increase their visibility. Overall, the game world is kept in very unsaturated shades of blue, with the atmospheric depth, i.e., the sky and objects in the distance, falling into the complementary orange range. Shades of green and yellow are either completely avoided or extremely desaturated. The resulting reduced color palette additionally supports the atmosphere and makes it appear more harmonious.
    Image: Color gamut & value/tone tests
    Hue, Tone & Saturation
    Since the color of the water, with its hue, tone, and saturation, is technically achieved by several components, a 2D mockup was first designed to more easily compare different colors in the environment. Here it could be observed that both the low and the high tonal value formed too great a contrast to the rest of the environment and thus placed an undesirable focus on the water. Therefore, the medium tone value was chosen. The hue and saturation were tested in relation to the sky, the player character, and the background. Here, too, the color variant that harmonizes the most with the rest of the environment and contrasts the least was chosen.
    Foam Lines
    For the design of the foam lines, we proceeded in the same way as for the color selection: in this case, a screenshot of the test scene was used as the basis for three overpaints to try out different foam lines on the stones in the scene. Version 3 offers the greatest scope in terms of movement within the patterns. Due to this, and because of the greater visual interest, we opted for variant 3. Following the mockup, the texture was prepared so that it could be technically implemented.
    Reflection
    The reflection of the water surface contributes to how realistic the water looks, as one would always expect a reflection with natural water, depending on the angle. However, a reflection could also contribute to the overall appearance of the water becoming less calm. The romantic character created by the reflection of diffuse light on water is more present in version 1. In addition, the soft, wafting shapes created by the reflection fit in well with the art style. A reflection is desirable, but the reflections must not take up too much focus. Ideally, the water should be lighter in tone, and the reflections should be present but less pronounced.
    Image: Reflection intensity
    Refraction & Caustics
    Even though most light in our water gets absorbed, we noticed an improvement in the believability of the ground right underneath the water's surface when utilizing refraction together with the waveforms. When it comes to caustics, the diffuse lighting conditions of our scene would make visible caustic patterns physically implausible, but it felt right aesthetically, which is why we included it anyway (not being bound to physical plausibility is one of the perks of stylized graphics).
    Technical Realization in Unreal Engine 5
    When building a water material in Unreal, choosing the right shading model and blend mode is crucial. While a Default Lit Translucent material with Surface Forward Shading offers the most control, it is very costly to render. The more efficient choice is the Single Layer Water shading model introduced in Unreal 4.27, which supports light absorption, scattering, reflection, refraction, and shadowing at a lower instruction count. However, there are some downsides. For example, as it only uses a single depth layer, it lacks back-face rendering, making it less suitable for underwater views. And while still quite expensive by itself, its benefits outweigh the drawbacks for our stylized water material.
    Waveforms
    Starting with the waveforms, we used panning normal maps to simulate the rather calm low-altitude gravity waves. The approach here is simple: create a wave normal map in Substance 3D Designer, sample it twice, and continuously offset the samples' UV coordinates in opposing directions at different speeds. Give one of the two samples a higher speed and normal intensity to create a sense of wind direction. This panning operation does not need to run in the fragment shader: you can move it to the vertex shader through the Vertex Interpolator without quality loss and thereby reduce the instruction count.
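    A rough CPU-side sketch of that dual panning setup follows; the directions, speeds, and names are illustrative placeholders rather than the project's actual parameters. In the material graph the same offsets are typically built from Time, Multiply, and Add or Panner nodes and, as noted above, can be computed per vertex via the Vertex Interpolator.

    #include <cstdio>

    struct Float2 { float x, y; };

    // Offset a tiling UV coordinate by direction * speed * time (a simple "panner").
    Float2 PanUV(Float2 uv, Float2 direction, float speed, float time) {
        return { uv.x + direction.x * speed * time,
                 uv.y + direction.y * speed * time };
    }

    int main() {
        Float2 uv   = { 0.25f, 0.75f }; // base UV on the water plane
        float  time = 12.0f;            // elapsed game time in seconds

        // Two samples of the same wave normal map, panning in opposing directions.
        // The faster sample also gets a higher normal intensity in the material,
        // which reads as the dominant wind direction.
        Float2 uvFast = PanUV(uv, {  1.0f,  0.2f }, 0.020f, time);
        Float2 uvSlow = PanUV(uv, { -0.7f, -1.0f }, 0.008f, time);

        std::printf("fast sample UV: %.3f %.3f\n", uvFast.x, uvFast.y);
        std::printf("slow sample UV: %.3f %.3f\n", uvSlow.x, uvSlow.y);
        // The two sampled normals would then be blended before feeding the Normal input.
    }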
    Texture Repetition
    To reduce visible tiling, we used three simple and fairly efficient tricks. First, we offset the UVs of the Wave Samplers with a large panning noise texture to dynamically distort the wave patterns. Second, we used another sampler of that noise texture with different tiling, speed, and direction to modulate the strength of the normal maps across the surface. We sampled this noise texture four times with different variables in the material, which is a lot, but we reused them many times for most of the visual features of our water. Third, we sampled the pixel depth of the surface to mask out the waves that were far from the camera so that there were no waves in the far distance.
    Vertex Displacement
    While these normal waves are enough to create the illusion of altitude on the water surface itself, they are lacking when it comes to the intersections around objects in the water, as these intersections are static without any actual vertex displacement. To fix that, two very simple sine operations (one along the X-axis and the other on the Y-axis) were added to drive the World Position Offset of the water mesh on the Z-axis. To keep the polycounts in check, we built a simple blueprint grid system that spawns high-res plane meshes at the center in a variable radius, and low-res plane meshes around that. This enables the culling of non-visible planes and the use of a less complex version of the water material for distant planes, where features like WPO are not needed.
    Color
    The general transmission amount is controlled by the opacity input of the material output, but scattering and absorption are defined via the Single Layer Water material output. The inputs Scattering Coefficients and Absorption Coefficients, which are responsible for reproducing how and how far different wavelengths travel through water, are decisive here. We use two scattering colors as parameters, which are interpolated depending on the camera distance. Close to the camera, the blue scattering color (ScatteringColorNear) dominates, while at a distance, the orange scattering color (ScatteringColorFar) takes over. The advantage is a separation of the water's color from the sky's color and, thus, higher artistic control.
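    A minimal sketch of that distance-based blend, assuming a simple linear falloff between two illustrative distances; the real material drives this with a camera-distance mask, and the parameter names and values below are placeholders, not the project's settings.

    #include <algorithm>
    #include <cstdio>

    struct Color { float r, g, b; };

    Color Lerp(Color a, Color b, float t) {
        return { a.r + (b.r - a.r) * t,
                 a.g + (b.g - a.g) * t,
                 a.b + (b.b - a.b) * t };
    }

    // Blend the near (blue) and far (orange) scattering colors by camera distance.
    // fadeStart and fadeEnd are assumed falloff distances, not the project's values.
    Color ScatteringByDistance(float cameraDistance, Color scatteringNear, Color scatteringFar,
                               float fadeStart = 500.0f, float fadeEnd = 20000.0f) {
        float t = std::clamp((cameraDistance - fadeStart) / (fadeEnd - fadeStart), 0.0f, 1.0f);
        return Lerp(scatteringNear, scatteringFar, t);
    }

    int main() {
        const Color nearBlue  = { 0.02f, 0.08f, 0.15f };
        const Color farOrange = { 0.20f, 0.12f, 0.06f };
        const float distances[] = { 100.0f, 5000.0f, 40000.0f };
        for (float d : distances) {
            Color c = ScatteringByDistance(d, nearBlue, farOrange);
            std::printf("distance %7.0f -> scattering (%.3f, %.3f, %.3f)\n", d, c.r, c.g, c.b);
        }
    }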
    Reflections & Refraction
    Reflections in the Single Layer Water shading model are as usual determined by the inputs for Specular (reflection intensity) and Roughness (reflection diffusion). In our case, however, we use Lumen reflections for their accuracy and quality, and as of Unreal 5.4, the Single Layer Water model's roughness calculation does not work with Lumen reflections. It forces mirror reflections (Roughness = 0), no matter the value input, leaving the specular lobe unaffected. Instead, the roughness input only offsets the reflection brightness, as the specular input does. For our artistic purposes, this is fine, and we do use the roughness input to fine-tune the specular level while having the specular input as the base level. A very low value was set for the specular value to keep the reflection brightness low. We further stylized the reflections by decreasing this brightness near the camera, using the already mentioned camera-distance masking method to interpolate between two values (RoughnessNear and RoughnessFar). For refraction, the Pixel Normal Offset mode was used, and a scalar parameter interpolates between the base refraction and the output of the normal waves.
    Caustics
    For the caustic effect, we created a Voronoi noise pattern by using Unreal's Noise node and exporting it with a render target. In Photoshop, the pattern was duplicated twice, with each copy rotated, colored, and blended. This texture is then projected on the objects below by using the ColorScaleBehindWater input of the Single Layer Water material output. The pattern is dynamically distorted by adding one of the aforementioned panning noise textures to the UV coordinates.
    Foam Lines
    We started by creating custom meshes for foam lines and applied the earlier texture pattern, but quickly realized that such a workflow would be too cumbersome and inflexible for even a small scene, so we decided to do it procedurally. Two common methods for generating intersection masks on a plane are Depth Sampling and Distance Fields. The first works by subtracting the camera's distance to the water surface at the current pixel (i.e., the "PixelDepth") from the camera's distance to the closest scene object at that pixel (i.e., the "SceneDepth"). The second method is to use the node "DistanceToNearestSurface", which calculates the shortest distance between a point on the water surface and the nearest object by referencing the scene's global distance field. We used both methods to control the mask width, as each alone varies with the object's surface slope, causing undesirable variations. Combining them allowed us to switch between two different mask widths, turning off "Affect Distance Field Lighting" for shallow slopes where narrower lines are wanted. The added mask of all intersections is then used for two effects to create the foam lines: "edge foam" (that does not depart from the intersection) and "edge waves" (which go outwards from the edge foam). Both are shaped with the noise samplers shown above to approximate the hand-drawn foam line texture.
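    The two intersection measures can be sketched as plain functions. The foam widths and the max-combine below are illustrative assumptions; the actual material switches between the two masks per mesh (via "Affect Distance Field Lighting") rather than always taking a maximum.

    #include <algorithm>
    #include <cstdio>

    // Two ways to measure how close a water pixel is to an intersecting object,
    // mirroring the article's Depth Sampling and Distance Field methods.
    // Distances are in world units; the foam widths are illustrative values.

    // Depth sampling: SceneDepth (opaque scene behind the water) minus PixelDepth
    // (the water surface itself), both measured from the camera along the view ray.
    float DepthMask(float sceneDepth, float pixelDepth, float foamWidth) {
        float diff = sceneDepth - pixelDepth; // 0 right at the intersection
        return 1.0f - std::clamp(diff / foamWidth, 0.0f, 1.0f);
    }

    // Distance field: shortest distance from the water surface point to the nearest
    // mesh, as DistanceToNearestSurface reads from the global distance field.
    float DistanceFieldMask(float distanceToNearestSurface, float foamWidth) {
        return 1.0f - std::clamp(distanceToNearestSurface / foamWidth, 0.0f, 1.0f);
    }

    int main() {
        // Example pixel near a rock: the rock is 15 units behind the water along the
        // view ray, but only 8 units away from the surface point itself.
        float sceneDepth = 515.0f, pixelDepth = 500.0f, sdfDistance = 8.0f;

        float depthMask = DepthMask(sceneDepth, pixelDepth, /*foamWidth=*/40.0f);
        float sdfMask   = DistanceFieldMask(sdfDistance, /*foamWidth=*/25.0f);

        // A simple way to combine them for this sketch; narrower or wider lines then
        // come from whichever mask is active for a given mesh or slope.
        float foamLineMask = std::max(depthMask, sdfMask);
        std::printf("depth %.2f  sdf %.2f  combined %.2f\n", depthMask, sdfMask, foamLineMask);
    }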
    Foam Patterns
    The same noise samplers are also used to create a sparkling foam effect, loosely imitating whitecaps/foam crests to add more visual interest to the water surface. Since it only reuses operations, this effect is very cheap. Similarly, the wave normals are used to create something like fake subsurface scattering to further distinguish the moving water surface.
    Interactive Ripples
    A third type of foam is added as interactive waves that ripple around the player character when walking through shallow water. This is done through a Render Target and particles, as demonstrated in this Unity tutorial by Minions Art. The steps described there are all easily applicable in Unreal with a Niagara System, a little Blueprint work, and common material nodes. We added a Height to Normal conversion for better visual integration into our existing wave setup. Finally, here are all those operations combined for the material inputs:
    Image: Nimue
    Best Practices
    Use Single Layer Water for efficient translucency, but note it lacks back-face rendering and forces mirror reflections with Lumen;
    For simple low-altitude waves, pan two offset samples of a normal map at different speeds; move panning to the Vertex Shader for better performance;
    Break up texture tiling efficiently by offsetting UVs with a large panning noise, modulating normal strength, and fading distant waves using pixel depth;
    Sampling one small noise texture at different scales can power this and many other features of a water shader efficiently;
    If high-altitude waves aren't needed, a simple sine-based WPO can suffice for vertex displacement; implement a grid system for LODs and culling of subdivided water meshes;
    Blend two scattering colors by camera distance for artistic control over the water's color and separation from sky reflections;
    Combining depth sampling and distance fields to derive the foam lines allows for more flexible intersection widths but comes at a higher cost.
    Further Resources
    Here are some resources that helped us in the shader creation process:
    General shader theory and creation: tharlevfx, Ben Cloward;
    Interactive water in Unity: Minions Art;
    Another free stylized water material in Unreal by Fabian Lopez Arosa;
    Technical art wizardry: Ghislain Girardot.
    Conclusion
    We hope this breakdown of our water material creation process will help you in your projects. If you want to take a look at our shader yourself or even use it for your own game projects, you can download the complete setup on Gumroad. We look forward to seeing your water shaders and exchanging ideas. Feel free to reach out if you have any questions or want to connect.
    Kolja Bopp, Academic Supervisor
    Leanna Geideck, Concept Artist
    Stephan zu Münster, Technical Artist
    #how #build #stylized #water #shader
    How To Build Stylized Water Shader: Design & Implementation For Nimue
    NimueIntroductionFor three semesters, our student team has been hard at work on the prototype for Nimue, a 3D platformer in which you play an enchanted princess who lost her memories. She needs to find her way through the castle ruins on a misty lake to uncover her past. Water is a visual core element of this game prototype, so we took extra care in its development. In this article, we will take an in-depth look at the design and technical implementation of a lake water material.The first prototype of Nimue can be played on itch.io soon. A link to our shader for use in your own projects can be found at the end of this article.Taxonomy of WaterBefore we dive into the design decisions and technical implementation, we present a simplified taxonomy of visual water components to better understand the requirements of its representation:RiMEWind WavesWaves generated by wind, which form on an open water surface, can be divided into capillary waves and gravity waves. Capillary waves, or ripples, are small, short-wavelength waves caused by weak winds affecting surface tension in calm water. They can overlap longer and larger gravity waves. How these physically complex wave types are represented in stylized video games varies depending on the respective style. Both types are usually heavily simplified in form and motion, and capillary waves are sometimes omitted entirely to reduce detail.Sea of ThievesFoam PatternsFoam patterns refer to white foam crests that form on a water surface without breaking against an obstacle or shoreline. In reality, this effect occurs when different water layers collide, and waves become steeper until their peaks collapse, and the resulting bubbles and drops scatter the sunlight. Stylized foam patterns can be found in many video game water representations and can easily be abstracted into patterns. Such patterns contribute to a cartoon look and can sometimes even replace waveforms entirely.The Legend of Zelda: The Wind WakerFoam LinesFoam lines are a very common water element in video games, represented as white graphical lines surrounding shorelines and obstacles like rocks. They typically reference two different water phenomena: foam forming around obstacles due to wave breaking, and foam along shorelines, resulting from wave breaking and the mixing of algaes with organic and artificial substances.Foam lines can have different visual appearances depending on the surface angle: The shallower the angle, the wider the foam effect. Due to the weaker waves, distinctive foam lines are rarely observed on natural lakes, but they can be included in a stylization for aesthetic purposes. Animal Crossing: New HorizonsReflectionsWhen light hits a water surface, it can either be reflectedor transmitted into the water, where it may be absorbed, scattered, or reflected back through the surface. The Fresnel effect describes the perceived balance between reflection and transmission: at steep angles, more transmitted light reaches the eye, making the water appear more translucent, while at shallow angles, increased reflection makes it appear more opaqueIn stylized video games, implementations of water reflections vary: RiME, for example, imitates the Fresnel effect but does not reflect the environment at all, only a simple, otherwise invisible cube map. 
Wind Waker, on the other hand, completely foregoes reflection calculations and renders a flat-shaded water surface.RiMETranslucencyAs an inhomogeneous medium, water scatters some of the transmitted light before it can be reflected back to the surface. This is why water is described as translucent rather than transparent. Some scattered light is not reflected back but absorbed, reducing intensity and shifting color toward the least absorbed wavelengths, typically blue, blue-green, or turquoise. Increased distance amplifies scattering and absorption, altering color perception. Modern real-time engines simulate these effects, including absorption-based color variation with depth. However, stylized games often simplify or omit transmission entirely, rendering water as an opaque surface.RiMERefractionAn additional aspect of water transmission is refraction, the bending of light as it transitions between air and water due to their differing densities. This causes light to bend toward the normal upon entering the water, creating the apparent distortion of submerged objects. Refraction effects also commonly appear in stylized water rendering. Kirby's Forgotten Land, for example, showcases two key visual characteristics of refraction: distortion increases with steeper viewing angles and is amplified by ripples on the water's surface.Kirby and the Forgotten LandCausticsCaustic patterns form when light rays are focused by a curved water surface, projecting bundled light patterns onto underwater surfaces or even back to surfaces above water. These patterns are influenced by the clarity of the water, the depth of the water, and the strength of the light source. They contribute greatly to the atmosphere of virtual worlds and are often found in stylized games, although only as simplistic representations.The Legend of Zelda: Ocarina of Time 3DDesign DecisionsDue to the fact that the setting of Nimue is a lake with a calm overall atmosphere, the decision was made to use very reduced gravity waves, as a calm water surface underlines this atmosphere. Capillary waves have too high a level of detail for the stylistic requirements of Nimue and were, therefore, not implemented.NimueShapesThe mood in Nimue can be summarized as calm and mystical. The design language of Nimue is graphic, rounded, and elegant. Shapes are vertically elongated and highly abstracted. Convex corners are always rounded or have a strong bevel, while concave corners are pointed to prevent the overall mass of objects from becoming too rounded.ColorsNimue uses mostly broken colors and pastels to create a serene, reflective mood and highlight the player's character with her saturated blue tones. Platforms and obstacles are depicted with a lower tonal valueto increase their visibility. Overall, the game world is kept in very unsaturated shades of blue, with the atmospheric depth, i.e., the sky and objects in the distance, falling into the complementary orange range. Shades of green and yellow are either completely avoided or extremely desaturated. The resulting reduced color palette additionally supports the atmosphere and makes it appear more harmonious.Color gamut & value/tone tests Hue, Tone & SaturationSince the color of the water, with its hue, tone, and saturation, is technically achieved by several components, a 2D mockup was first designed to more easily compare different colors in the environment. 
Here it could be observed that both the low and the high tonal value formed too great a contrast to the rest of the environment and thus placed an undesirable focus on the water. Therefore, the medium tone value was chosen.The hue and saturation were tested in relativity to the sky, the player character, and the background. Here, too, the color variant that harmonizes the most with the rest of the environment and contrasts the least was chosen.Foam LinesFor the design of the foam lines, we proceeded in the same way as for the color selection: In this case, a screenshot of the test scene was used as the basis for three overpaints to try out different foam lines on the stones in the scene. Version 3 offers the greatest scope in terms of movement within the patterns. Due to this, and because of the greater visual interest, we opted for variant 3. Following the mockup, the texture was prepared so that it could be technically implemented.ReflectionThe reflection of the water surface contributes to how realistic the water looks, as one would always expect a reflection with natural water, depending on the angle. However, a reflection could also contribute to the overall appearance of the water becoming less calm. The romantic character created by the reflection of diffuse light on water is more present in version 1.In addition, the soft, wafting shapes created by the reflection fit in well with the art style. A reflection is desirable, but the reflections must not take up too much focus. Ideally, the water should be lighter in tone, and the reflections should be present but less pronounced. Reflection intensityRefraction & CausticsEven though most light in our water gets absorbed, we noticed an improvement in the believability of the ground right underneath the water's surface when utilizing refraction together with the waveforms. When it comes to caustics, the diffuse lighting conditions of our scene would make visible caustic patterns physically implausible, but it felt right aesthetically, which is why we included it anyway.Technical Realization in Unreal Engine 5When building a water material in Unreal, choosing the right shading model and blend mode is crucial. While a Default Lit Translucent material with Surface Forward Shading offers the most control, it is very costly to render. The more efficient choice is the Single Layer Water shading model introduced in Unreal 4.27, which supports light absorption, scattering, reflection, refraction, and shadowing at a lower instruction count. However, there are some downsides. For example, as it only uses a single depth layer, it lacks back-face rendering, making it less suitable for underwater views. And while still quite expensive by itself, its benefits outweigh the drawbacks of our stylized water material.WaveformsStarting with the waveforms, we used panning normal maps to simulate the rather calm low-altitude gravity waves. The approach here is simple: create a wave normal map in Substance 3D Designer, sample it twice, and continuously offset the samples' UV coordinates in opposing directions at different speeds. Give one of the two samples a higher speed and normal intensity to create a sense of wind direction. This panning operation does not need to run in the fragment shader, you can move it to the vertex shader through the Vertex Interpolator without quality loss and thereby reduce the instruction count.Texture RepetitionTo reduce visible tiling, we used three simple and fairly efficient tricks. 
First, we offset the UVs of the Wave Samplers with a large panning noise texture to dynamically distort the wave patterns. Second, we used another sampler of that noise texture with different tiling, speed, and direction to modulate the strength of the normal maps across the surface. We sampled this noise texture four times with different variables in the material, which is a lot, but we reused them many times for most of the visual features of our water. Third, we sampled the pixel depth of the surface to mask out the waves that were far from the camera so that there were no waves in the far distance.Vertex DisplacementWhile these normal waves are enough to create the illusion of altitude on the water surface itself, they are lacking when it comes to the intersections around objects in the water, as these intersections are static without any actual vertex displacement. To fix that, two very simple sine operationswere added to drive the World Position Offset of the water mesh on the Z-axis. To keep the polycounts in check, we built a simple blueprint grid system that spawns high-res plane meshes at the center in a variable radius, and low-res plane meshes around that. This enables the culling of non-visible planes and the use of a less complex version of the water material for distant planes, where features like WPO are not needed.ColorThe general transmission amount is controlled by the opacity input of the material output, but scattering and absorption are defined via the Single Layer Water material output. The inputs Scattering Coefficients and Absorption Coefficients, which are responsible for reproducing how and how far different wavelengths travel through water, are decisive here. We use two scattering colors as parameters, which are interpolated depending on the camera distance. Close to the camera, the blue scattering colordominates, while at a distance, the orange scattering colortakes over. The advantage is a separation of the water's color from the sky's color and, thus, higher artistic control.Reflections & RefractionReflections in the Single Layer Water shading model are as usual determined by the inputs for Specularand Roughness. In our case, however, we use Lumen reflections for their accuracy and quality, and as of Unreal 5.4, the Single Layer Water model’s roughness calculation does not work with Lumen reflections. It forces mirror reflections, no matter the value input, leaving the specular lobe unaffected. Instead, it only offsets the reflection brightness, as the specular input does.For our artistic purposes, this is fine, and we do use the roughness input to fine-tune the specular level while having the specular input as the base level. A very low value was set for the specular value to keep the reflection brightness low. We further stylized the reflections by decreasing this brightness near the camera by using the already mentioned masking method via camera to interpolate between two values. For refraction, the Pixel Normal Offset mode was used, and a scalar parameter interpolates between the base refraction and the output of the normal waves.CausticsFor the caustic effect, we created a Voronoi noise pattern by using Unreal's Noise node and exporting it with a render target. In Photoshop, the pattern was duplicated twice, rotated each, colored, and blended. This texture is then projected on the objects below by using the ColorScaleBehindWater input of the Single Layer Water Material output. 
The pattern is dynamically distorted by adding one of the aforementioned panning noise textures to the UV coordinates.FoamlinesWe started by creating custom meshes for foam lines and applied the earlier texture pattern, but quickly realized that such a workflow would be too cumbersome and inflexible for even a small scene, so we decided to do it procedurally. Two common methods for generating intersection masks on a plane are Depth Sampling and Distance Fields. The first works by subtracting the camera's distance to the water surface at the current pixelfrom the camera's distance to the closest scene object at that pixel. The second method is to use the node "DistanceToNearestSurface" which calculates the shortest distance between a point on the water surface and the nearest object by referencing the scene's global distance field. We used both methods to control the mask width, as each alone varies with the object's surface slope, causing undesirable variations. Combining them allowed us to switch between two different mask widths, turning off "Affect Distance Field Lighting" for shallow slopes where narrower lines are wanted.The added mask of all intersections is then used for two effects to create the foam lines: "edge foam"and "edge waves". Both are shaped with the noise samplers shown above to approximate the hand-drawn foam line texture.Foam PatternsThe same noise samplers are also used to create a sparkling foam effect, loosely imitating whitecaps/foam crests to add more visual interest to the water surface. Since it only reuses operations, this effect is very cheap. Similarly, the wave normals are used to create something like fake subsurface scattering to further distinguish the moving water surface. Interactive RipplesA third type of foam is added as interactive waves that ripple around the player character when walking through shallow water. This is done through a Render Target and particles, as demonstrated in this Unity tutorial by Minions Art. The steps described there are all easily applicable in Unreal with a Niagara System, a little Blueprint work, and common material nodes. We added a Height to Normal conversion for better visual integration into our existing wave setup. Finally, here are all those operations combined for the material inputs:NimueBest PracticesUse Single Layer Water for efficient translucency, but note it lacks back-face rendering and forces mirror reflections with Lumen;For simple low-altitude waves, pan two offset samples of a normal map at different speeds; move panning to Vertex Shader for better performance;Break up texture tiling efficiently by offsetting UVs with a large panning noise, modulating normal strength, and fading distant waves using pixel depth;Sampling one small noise texture at different scales can power this and many other features of a water shader efficiently;If high-altitude waves aren't needed, a simple sine-based WPO can suffice for vertex displacement; implement a grid system for LODs and culling of subdivided water meshes;Blend two scattering colors by camera distance for artistic watercolor control and separation from sky reflections;Combining depth sampling and distance fields to derive the foam lines allows for more flexible intersection widths but comes at a higher cost. 
Further ResourcesHere are some resources that helped us in the shader creation process:General shader theory and creation: tharlevfx, Ben Cloward;Interactive water in Unity: Minions Art;Another free stylized water material in Unreal by Fabian Lopez Arosa;Technical art wizardry: Ghislain Girardot.ConclusionWe hope this breakdown of our water material creation process will help you in your projects.If you want to take a look at our shader yourself or even use it for your own game projects, you can download the complete setup on Gumroad. We look forward to seeing your water shaders and exchanging ideas. Feel free to reach out if you have any questions or want to connect.Kolja Bopp, Academic SupervisorLeanna Geideck, Concept ArtistStephan zu Münster, Technical Artist #how #build #stylized #water #shader
    80.LV
    How To Build Stylized Water Shader: Design & Implementation For Nimue
    NimueIntroductionFor three semesters, our student team has been hard at work on the prototype for Nimue, a 3D platformer in which you play an enchanted princess who lost her memories. She needs to find her way through the castle ruins on a misty lake to uncover her past. Water is a visual core element of this game prototype, so we took extra care in its development. In this article, we will take an in-depth look at the design and technical implementation of a lake water material.The first prototype of Nimue can be played on itch.io soon. A link to our shader for use in your own projects can be found at the end of this article.Taxonomy of WaterBefore we dive into the design decisions and technical implementation, we present a simplified taxonomy of visual water components to better understand the requirements of its representation:RiMEWind WavesWaves generated by wind, which form on an open water surface, can be divided into capillary waves and gravity waves. Capillary waves, or ripples, are small, short-wavelength waves caused by weak winds affecting surface tension in calm water. They can overlap longer and larger gravity waves. How these physically complex wave types are represented in stylized video games varies depending on the respective style. Both types are usually heavily simplified in form and motion, and capillary waves are sometimes omitted entirely to reduce detail.Sea of ThievesFoam PatternsFoam patterns refer to white foam crests that form on a water surface without breaking against an obstacle or shoreline. In reality, this effect occurs when different water layers collide, and waves become steeper until their peaks collapse, and the resulting bubbles and drops scatter the sunlight. Stylized foam patterns can be found in many video game water representations and can easily be abstracted into patterns. Such patterns contribute to a cartoon look and can sometimes even replace waveforms entirely.The Legend of Zelda: The Wind WakerFoam LinesFoam lines are a very common water element in video games, represented as white graphical lines surrounding shorelines and obstacles like rocks. They typically reference two different water phenomena: foam forming around obstacles due to wave breaking, and foam along shorelines, resulting from wave breaking and the mixing of algaes with organic and artificial substances.Foam lines can have different visual appearances depending on the surface angle: The shallower the angle, the wider the foam effect. Due to the weaker waves, distinctive foam lines are rarely observed on natural lakes, but they can be included in a stylization for aesthetic purposes. Animal Crossing: New HorizonsReflectionsWhen light hits a water surface, it can either be reflected (specular reflection) or transmitted into the water, where it may be absorbed, scattered, or reflected back through the surface. The Fresnel effect describes the perceived balance between reflection and transmission: at steep angles, more transmitted light reaches the eye, making the water appear more translucent, while at shallow angles, increased reflection makes it appear more opaqueIn stylized video games, implementations of water reflections vary: RiME, for example, imitates the Fresnel effect but does not reflect the environment at all, only a simple, otherwise invisible cube map. 
Wind Waker, on the other hand, completely foregoes reflection calculations and renders a flat-shaded water surface.RiMETranslucencyAs an inhomogeneous medium, water scatters some of the transmitted light before it can be reflected back to the surface. This is why water is described as translucent rather than transparent. Some scattered light is not reflected back but absorbed, reducing intensity and shifting color toward the least absorbed wavelengths, typically blue, blue-green, or turquoise. Increased distance amplifies scattering and absorption, altering color perception. Modern real-time engines simulate these effects, including absorption-based color variation with depth. However, stylized games often simplify or omit transmission entirely, rendering water as an opaque surface.RiMERefractionAn additional aspect of water transmission is refraction, the bending of light as it transitions between air and water due to their differing densities. This causes light to bend toward the normal upon entering the water, creating the apparent distortion of submerged objects. Refraction effects also commonly appear in stylized water rendering. Kirby's Forgotten Land, for example, showcases two key visual characteristics of refraction: distortion increases with steeper viewing angles and is amplified by ripples on the water's surface.Kirby and the Forgotten LandCausticsCaustic patterns form when light rays are focused by a curved water surface (caused by waves and ripples), projecting bundled light patterns onto underwater surfaces or even back to surfaces above water. These patterns are influenced by the clarity of the water, the depth of the water, and the strength of the light source. They contribute greatly to the atmosphere of virtual worlds and are often found in stylized games, although only as simplistic representations.The Legend of Zelda: Ocarina of Time 3DDesign DecisionsDue to the fact that the setting of Nimue is a lake with a calm overall atmosphere, the decision was made to use very reduced gravity waves, as a calm water surface underlines this atmosphere. Capillary waves have too high a level of detail for the stylistic requirements of Nimue and were, therefore, not implemented.NimueShapesThe mood in Nimue can be summarized as calm and mystical. The design language of Nimue is graphic, rounded, and elegant. Shapes are vertically elongated and highly abstracted. Convex corners are always rounded or have a strong bevel, while concave corners are pointed to prevent the overall mass of objects from becoming too rounded.ColorsNimue uses mostly broken colors and pastels to create a serene, reflective mood and highlight the player's character with her saturated blue tones. Platforms and obstacles are depicted with a lower tonal value (darker) to increase their visibility. Overall, the game world is kept in very unsaturated shades of blue, with the atmospheric depth, i.e., the sky and objects in the distance, falling into the complementary orange range. Shades of green and yellow are either completely avoided or extremely desaturated. The resulting reduced color palette additionally supports the atmosphere and makes it appear more harmonious.Color gamut & value/tone tests Hue, Tone & SaturationSince the color of the water, with its hue, tone, and saturation, is technically achieved by several components, a 2D mockup was first designed to more easily compare different colors in the environment. 
Here it could be observed that both the low and the high tonal value formed too great a contrast to the rest of the environment and thus placed an undesirable focus on the water. Therefore, the medium tone value was chosen.The hue and saturation were tested in relativity to the sky, the player character, and the background. Here, too, the color variant that harmonizes the most with the rest of the environment and contrasts the least was chosen.Foam LinesFor the design of the foam lines, we proceeded in the same way as for the color selection: In this case, a screenshot of the test scene was used as the basis for three overpaints to try out different foam lines on the stones in the scene. Version 3 offers the greatest scope in terms of movement within the patterns. Due to this, and because of the greater visual interest, we opted for variant 3. Following the mockup, the texture was prepared so that it could be technically implemented.ReflectionThe reflection of the water surface contributes to how realistic the water looks, as one would always expect a reflection with natural water, depending on the angle. However, a reflection could also contribute to the overall appearance of the water becoming less calm. The romantic character created by the reflection of diffuse light on water is more present in version 1.In addition, the soft, wafting shapes created by the reflection fit in well with the art style. A reflection is desirable, but the reflections must not take up too much focus. Ideally, the water should be lighter in tone, and the reflections should be present but less pronounced. Reflection intensityRefraction & CausticsEven though most light in our water gets absorbed, we noticed an improvement in the believability of the ground right underneath the water's surface when utilizing refraction together with the waveforms. When it comes to caustics, the diffuse lighting conditions of our scene would make visible caustic patterns physically implausible, but it felt right aesthetically, which is why we included it anyway (not being bound to physical plausibility is one of the perks of stylized graphics).Technical Realization in Unreal Engine 5When building a water material in Unreal, choosing the right shading model and blend mode is crucial. While a Default Lit Translucent material with Surface Forward Shading offers the most control, it is very costly to render. The more efficient choice is the Single Layer Water shading model introduced in Unreal 4.27, which supports light absorption, scattering, reflection, refraction, and shadowing at a lower instruction count. However, there are some downsides. For example, as it only uses a single depth layer, it lacks back-face rendering, making it less suitable for underwater views. And while still quite expensive by itself, its benefits outweigh the drawbacks of our stylized water material.WaveformsStarting with the waveforms, we used panning normal maps to simulate the rather calm low-altitude gravity waves. The approach here is simple: create a wave normal map in Substance 3D Designer, sample it twice, and continuously offset the samples' UV coordinates in opposing directions at different speeds. Give one of the two samples a higher speed and normal intensity to create a sense of wind direction. 
This panning operation does not need to run in the fragment shader, you can move it to the vertex shader through the Vertex Interpolator without quality loss and thereby reduce the instruction count.Texture RepetitionTo reduce visible tiling, we used three simple and fairly efficient tricks. First, we offset the UVs of the Wave Samplers with a large panning noise texture to dynamically distort the wave patterns. Second, we used another sampler of that noise texture with different tiling, speed, and direction to modulate the strength of the normal maps across the surface. We sampled this noise texture four times with different variables in the material, which is a lot, but we reused them many times for most of the visual features of our water. Third, we sampled the pixel depth of the surface to mask out the waves that were far from the camera so that there were no waves in the far distance.Vertex DisplacementWhile these normal waves are enough to create the illusion of altitude on the water surface itself, they are lacking when it comes to the intersections around objects in the water, as these intersections are static without any actual vertex displacement. To fix that, two very simple sine operations (one along the X-axis and the other on the Y-axis) were added to drive the World Position Offset of the water mesh on the Z-axis. To keep the polycounts in check, we built a simple blueprint grid system that spawns high-res plane meshes at the center in a variable radius, and low-res plane meshes around that. This enables the culling of non-visible planes and the use of a less complex version of the water material for distant planes, where features like WPO are not needed.ColorThe general transmission amount is controlled by the opacity input of the material output, but scattering and absorption are defined via the Single Layer Water material output. The inputs Scattering Coefficients and Absorption Coefficients, which are responsible for reproducing how and how far different wavelengths travel through water, are decisive here. We use two scattering colors as parameters, which are interpolated depending on the camera distance. Close to the camera, the blue scattering color (ScatteringColorNear) dominates, while at a distance, the orange scattering color (ScatteringColorFar) takes over. The advantage is a separation of the water's color from the sky's color and, thus, higher artistic control.Reflections & RefractionReflections in the Single Layer Water shading model are as usual determined by the inputs for Specular (reflection intensity) and Roughness (reflection diffusion). In our case, however, we use Lumen reflections for their accuracy and quality, and as of Unreal 5.4, the Single Layer Water model’s roughness calculation does not work with Lumen reflections. It forces mirror reflections (Roughness = 0), no matter the value input, leaving the specular lobe unaffected. Instead, it only offsets the reflection brightness, as the specular input does.For our artistic purposes, this is fine, and we do use the roughness input to fine-tune the specular level while having the specular input as the base level. A very low value was set for the specular value to keep the reflection brightness low. We further stylized the reflections by decreasing this brightness near the camera by using the already mentioned masking method via camera to interpolate between two values (RoughnessNear and RoughnessFar). 
For refraction, the Pixel Normal Offset mode was used, and a scalar parameter interpolates between the base refraction and the output of the normal waves.CausticsFor the caustic effect, we created a Voronoi noise pattern by using Unreal's Noise node and exporting it with a render target. In Photoshop, the pattern was duplicated twice, rotated each, colored, and blended. This texture is then projected on the objects below by using the ColorScaleBehindWater input of the Single Layer Water Material output. The pattern is dynamically distorted by adding one of the aforementioned panning noise textures to the UV coordinates.FoamlinesWe started by creating custom meshes for foam lines and applied the earlier texture pattern, but quickly realized that such a workflow would be too cumbersome and inflexible for even a small scene, so we decided to do it procedurally. Two common methods for generating intersection masks on a plane are Depth Sampling and Distance Fields. The first works by subtracting the camera's distance to the water surface at the current pixel (i.e., the "PixelDepth") from the camera's distance to the closest scene object at that pixel (i.e., the "SceneDepth"). The second method is to use the node "DistanceToNearestSurface" which calculates the shortest distance between a point on the water surface and the nearest object by referencing the scene's global distance field. We used both methods to control the mask width, as each alone varies with the object's surface slope, causing undesirable variations. Combining them allowed us to switch between two different mask widths, turning off "Affect Distance Field Lighting" for shallow slopes where narrower lines are wanted.The added mask of all intersections is then used for two effects to create the foam lines: "edge foam" (that does not depart from the intersection) and "edge waves" (which go outwards from the edge foam). Both are shaped with the noise samplers shown above to approximate the hand-drawn foam line texture.Foam PatternsThe same noise samplers are also used to create a sparkling foam effect, loosely imitating whitecaps/foam crests to add more visual interest to the water surface. Since it only reuses operations, this effect is very cheap. Similarly, the wave normals are used to create something like fake subsurface scattering to further distinguish the moving water surface. Interactive RipplesA third type of foam is added as interactive waves that ripple around the player character when walking through shallow water. This is done through a Render Target and particles, as demonstrated in this Unity tutorial by Minions Art. The steps described there are all easily applicable in Unreal with a Niagara System, a little Blueprint work, and common material nodes. We added a Height to Normal conversion for better visual integration into our existing wave setup. 
Finally, here are all those operations combined for the material inputs:NimueBest PracticesUse Single Layer Water for efficient translucency, but note it lacks back-face rendering and forces mirror reflections with Lumen;For simple low-altitude waves, pan two offset samples of a normal map at different speeds; move panning to Vertex Shader for better performance;Break up texture tiling efficiently by offsetting UVs with a large panning noise, modulating normal strength, and fading distant waves using pixel depth;Sampling one small noise texture at different scales can power this and many other features of a water shader efficiently;If high-altitude waves aren't needed, a simple sine-based WPO can suffice for vertex displacement; implement a grid system for LODs and culling of subdivided water meshes;Blend two scattering colors by camera distance for artistic watercolor control and separation from sky reflections;Combining depth sampling and distance fields to derive the foam lines allows for more flexible intersection widths but comes at a higher cost. Further ResourcesHere are some resources that helped us in the shader creation process:General shader theory and creation: tharlevfx, Ben Cloward;Interactive water in Unity: Minions Art;Another free stylized water material in Unreal by Fabian Lopez Arosa;Technical art wizardry: Ghislain Girardot.ConclusionWe hope this breakdown of our water material creation process will help you in your projects.If you want to take a look at our shader yourself or even use it for your own game projects, you can download the complete setup on Gumroad. We look forward to seeing your water shaders and exchanging ideas. Feel free to reach out if you have any questions or want to connect.Kolja Bopp, Academic SupervisorLeanna Geideck, Concept ArtistStephan zu Münster, Technical Artist
  • #333;">Lessons must be learned from past PFI failures, government infrastructure advisor warns

    Comments from NISTA’s Matthew Vickerstaff come as ministers weigh up the benefits of relaunching the initiative next month.
    The government’s new infrastructure advisory body has said ministers would need to “learn from the mistakes” of the past if a new generation of PFI contracts is launched as part of the upcoming infrastructure strategy.
    Matthew Vickerstaff, deputy chief executive of the National Infrastructure and Service Transformation Authority (NISTA), said there was still a “constant drumbeat” of construction issues on schools built through private finance initiatives (PFI).
    Matthew Vickerstaff speaking at the Public Accounts Committee yesterday afternoon
    Chancellor Rachel Reeves is understood to be considering reinstating a form of private financing to pay for public projects, including social infrastructure schemes such as schools, ahead of the launch of the government’s 10-Year Infrastructure Strategy next month.
    It would be the first major rollout of PFI in England since 2018, when then chancellor Philip Hammond declared the successor scheme to the original PFI programme as “inflexible and overly complex”.
    >> See also: PFI: Do the numbers add up?
    Speaking at a meeting of the Public Accounts Committee in Parliament yesterday, Vickerstaff highlighted issues that had blighted historic PFI schemes where construction risk had been transferred to the private sector.
    “Just what we’re seeing on school projects, leaking roofs is a consistent, constant drum beat, fire door stopping, acoustics, lighting levels, the ability of classrooms to be operable in a white board environment, problems around leisure centres or sports facilities, contamination of land, latent defects of refurbishments on old buildings creating real problems,” he said.
    “The dash to get the schools ready for September, I cannot tell you how many PFI schools have that problem, and we need to get the private sector to fix it.”
    But while Vickerstaff said he was “ambivalent” about a new generation of PFI contracts, he argued contractual arrangements on new schemes could contain less risk for the public purse if the government did decide to opt for this route in its infrastructure strategy.
    “I would say that compared with 25 years ago, the asset management, the building information systems and computer aided facilities management has vastly improved, so we’re dealing with a generation of contracts that would certainly be improved whether it’s public sector or private sector,” he said.
    “I’m ambivalent but what we need to make sure is that we learn from the mistakes and definitely get them to fix what we’re experiencing in some situations.”
    Vickerstaff added: “In terms of lessons learned, making sure construction is monitored by a clerk of works and independently certified would be a really important factor moving forward, because construction defects have been a problem because the construction contracts whether it be public sector or private sector have not been well monitored or controlled.”
    Meanwhile, a new report by PwC has called on the government to explore a new generation of public-private finance in order to address the deficit in infrastructure including schools and healthcare.
    The research, published today, found “strong market appetite” for a new model of public-private partnerships which could be based on the Mutual Investment Model developed in Wales.
    PwC corporate finance associate director Dan Whittle said: “There is a strong view that public-private finance has a valuable role to play as a strategic tool to close the UK’s infrastructure gap, particularly at a time when we are constrained by fiscal rules.
    “There is no need to reinvent the fundamentals of the PPP model.
    What must continue to evolve is how we implement this model with refined risk allocation to reflect the current appetite of the market, smarter contract management, and a genuine partnership approach.”
    The government is expected to unveil its infrastructure strategy alongside its spending review in June.
    #0066cc;">#lessons #must #learned #from #past #pfi #failures #government #infrastructure #advisor #warns #comments #nistas #matthew #vickerstaff #come #ministers #weigh #benefits #relaunching #initiative #next #monththe #governments #new #advisory #body #has #said #would #need #learn #the #mistakes #generation #contracts #are #launched #part #upcoming #strategymatthew #deputy #chief #executive #national #and #service #transformation #authority #nista #there #was #still #constant #drumbeat #construction #issues #schools #built #through #private #finance #initiatives #pfimatthew #speaking #public #accounts #committee #yesterday #afternoonchancellor #rachel #reeves #understood #considering #reinstating #form #financing #pay #for #projects #including #social #schemes #such #ahead #launch #its #10year #strategy #monthit #first #major #rollout #england #since #when #then #chancellor #philip #hammond #declared #successor #scheme #original #programme #inflexible #overly #complexampgtampgt #see #alsopfi #numbers #add #upspeaking #meeting #parliament #highlighted #that #had #blighted #historic #where #risk #been #transferred #sectorjust #what #were #seeing #school #leaking #roofs #consistent #drum #beat #fire #door #stopping #acoustics #lighting #levels #ability #classrooms #operable #white #board #environment #problems #around #leisure #centres #sports #facilities #contamination #land #latent #defects #refurbishments #old #buildings #creating #real #saidthe #dash #get #ready #september #cannot #tell #you #how #many #have #problem #sector #fix #itbut #while #ambivalent #about #argued #contractual #arrangements #could #contain #less #purse #did #decide #opt #this #route #strategyi #say #compared #with #years #ago #asset #management #building #information #systems #computer #aided #vastly #improved #dealing #certainly #whether #saidim #but #make #sure #definitely #them #experiencing #some #situationsvickerstaff #added #terms #making #monitored #clerk #works #independently #certified #really #important #factor #moving #forward #because #not #well #controlledmeanwhile #report #pwc #called #explore #publicprivate #order #address #deficit #healthcarethe #research #published #today #found #strong #market #appetite #model #partnerships #which #based #mutual #investment #developed #walespwc #corporate #associate #director #dan #whittle #view #valuable #role #play #strategic #tool #close #uks #gap #particularly #time #constrained #fiscal #rulesthere #reinvent #fundamentals #ppp #modelwhat #continue #evolve #implement #refined #allocation #reflect #current #smarter #contract #genuine #partnership #approachthe #expected #unveil #alongside #spending #review #june
    Lessons must be learned from past PFI failures, government infrastructure advisor warns
    Comments from NISTA’s Matthew Vickerstaff come as ministers weigh up benefits of relaunching initiative next monthThe government’s new infrastructure advisory body has said ministers would need to “learn from the mistakes” of the past if a new generation of PFI contracts are launched as part of the upcoming infrastructure strategy. Matthew Vickerstaff, deputy chief executive of the The National Infrastructure and Service Transformation Authority (NISTA), said there was still a “constant drumbeat” of construction issues on schools built through private finance initiatives (PFI). Matthew Vickerstaff speaking at the Public Accounts Committee yesterday afternoon Chancellor Rachel Reeves is understood to be considering reinstating a form of private financing to pay for public projects, including social infrastructure schemes such as schools, ahead of the launch of its 10-Year Infrastructure Strategy next month. It would be the first major rollout of PFI in England since 2018, when then chancellor Philip Hammond declared the successor scheme to the original PFI programme as “inflexible and overly complex”. >> See also: PFI: Do the numbers add up? Speaking at a meeting of the Public Accounts Committee in Parliament yesterday, Vickerstaff highlighted issues that had blighted historic PFI schemes where construction risk had been transferred to the private sector. “Just what we’re seeing on school projects, leaking roofs is a consistent, constant drum beat, fire door stopping, acoustics, lighting levels, the ability of classrooms to be operable in a white board environment, problems around leisure centres or sports facilities, contamination of land, latent defects of refurbishments on old buildings creating real problems,” he said. “The dash to get the schools ready for September, I cannot tell you how many PFI schools have that problem, and we need to get the private sector to fix it.” But while Vickerstaff said he was “ambivalent” about a new generation of PFI contracts, he argued contractual arrangements on new schemes could contain less risk for the public purse if the government did decide to opt for this route in its infrastructure strategy. “I would say that compared with 25 years ago, the asset management, the building information systems and computer aided facilities management has vastly improved so we’re dealing with a generation of contracts that would certainly by improved whether it’s public sector or private sector,” he said. “I’m ambivalent but what we need to make sure is that we learn from the mistakes and definitely get them to fix what we’re experiencing in some situations.” Vickerstaff added: “In terms of lessons learned, making sure construction is monitored by a clerk of works and independently certified would be a really important factor moving forward, because construction defects have been a problem because the construction contracts whether it be public sector or private sector have not been well monitored or controlled.” Meanwhile, a new report by PwC has called on the government to explore a new generation of public-private finance in order to address the deficit in infrastructure including schools and healthcare. The research, published today, found “strong market appetite” for a new model of public-private partnerships which could be based on the Mutual Investment Model developed in Wales. 
PwC corporate finance associate director Dan Whittle said: “There is a strong view that public-private finance has a valuable role to play as a strategic tool to close the UK’s infrastructure gap, particularly at a time when we are constrained by fiscal rules. “There is no need to reinvent the fundamentals of the PPP model. What must continue to evolve is how we implement this model with refined risk allocation to reflect the current appetite of the market, smarter contract management, and a genuine partnership approach.” The government is expected to unveil its infrastructure strategy alongside its spending review in June.
    Source: www.bdonline.co.uk