• What happens to DOGE without Elon Musk?

    Elon Musk may be gone from the Trump administration — and his friendship status with President Donald Trump may be at best uncertain — but his whirlwind stint in government certainly left its imprint. The Department of Government Efficiency (DOGE), his pet government-slashing project, remains entrenched in Washington. During his 130-day tenure, Musk led DOGE in eliminating about 260,000 federal employee jobs and gutting agencies supporting scientific research and humanitarian aid. But to date, DOGE claims to have saved the government $180 billion — well short of its ambitious (and frankly never realistic) target of cutting at least $2 trillion from the federal budget. And with Musk’s departure still fresh, there are reports that the federal government is trying to rehire federal workers who quit or were let go.

    For Elaine Kamarck, senior fellow at the Brookings Institution, DOGE’s tactics will likely end up being disastrous in the long run. “DOGE came in with these huge cuts, which were not attached to a plan,” she told Today, Explained co-host Sean Rameswaram. Kamarck knows all about making government more efficient. In the 1990s, she ran the Clinton administration’s Reinventing Government program. “I was Elon Musk,” she told Today, Explained. With the benefit of that experience, she assesses Musk’s record at DOGE, and what, if anything, the billionaire’s loud efforts at cutting government spending added up to. Below is an excerpt of the conversation, edited for length and clarity. There’s much more in the full podcast, so listen to Today, Explained wherever you get podcasts, including Apple Podcasts, Pandora, and Spotify.

    What do you think Elon Musk’s legacy is?

    Well, he will not have totally, radically reshaped the federal government. Absolutely not. In fact, there’s a high probability that on January 20, 2029, when the next president takes over, the federal government is about the same size as it is now, and is probably doing the same stuff that it’s doing now. What he did manage to do was insert chaos, fear, and loathing into the federal workforce.

    There was reporting in the Washington Post late last week that these cuts were so ineffective that the White House is actually reaching out to various federal employees who were laid off and asking them to come back, from the FDA to the IRS to even USAID. Which cuts are sticking at this point and which ones aren’t?

    First of all, in a lot of cases, people went to court and the courts have reversed those earlier decisions. So the first thing that happened is, courts said, “No, no, no, you can’t do it this way. You have to bring them back.” The second thing that happened is that Cabinet officers started to get confirmed by the Senate. And remember that a lot of the most spectacular DOGE stuff was happening in February. In February, these Cabinet secretaries were preparing for their Senate hearings. They weren’t on the job. Now that their Cabinet secretary’s home, what’s happening is they’re looking at these cuts and they’re saying, “No, no, no! We can’t live with these cuts because we have a mission to do.”

    As the government tries to hire back the people they fired, they’re going to have a tough time, and they’re going to have a tough time for two reasons. First of all, they treated them like dirt, and they’ve said a lot of insulting things. Second, most of the people who work for the federal government are highly skilled. They’re not paper pushers. We have computers to push our paper, right? They’re scientists. They’re engineers. They’re people with high skills, and guess what? They can get jobs outside the government. So there’s going to be real lasting damage to the government from the way they did this. And it’s analogous to the lasting damage that they’re causing at universities, where we now have top scientists who used to invent great cures for cancer and things like that, deciding to go find jobs in Europe because this culture has gotten so bad.

    What happens to this agency now? Who’s in charge of it?

    Well, what they’ve done is DOGE employees have been embedded in each of the organizations in the government, okay? And they basically — and the president himself has said this — they basically report to the Cabinet secretaries. So if you are in the Transportation Department, you have to make sure that Sean Duffy, who’s the secretary of transportation, agrees with you on what you want to do. And Sean Duffy has already had a fight during a Cabinet meeting with Elon Musk. You know that he has not been thrilled with the advice he’s gotten from DOGE. So from now on, DOGE is going to have to work hand in hand with Donald Trump’s appointed leaders.

    And just to bring this around to what we’re here talking about now, they’re in this huge fight over wasteful spending with the so-called big, beautiful bill. Does this just look like the government as usual, ultimately?

    It’s actually worse than normal. Because the deficit impacts are bigger than normal. It’s adding more to the deficit than previous bills have done. And the second reason it’s worse than normal is that everybody is still living in a fantasy world. And the fantasy world says that somehow we can deal with our deficits by cutting waste, fraud, and abuse. That is pure nonsense. Let me say it: pure nonsense.

    Where does most of the government money go? Does it go to some bureaucrats sitting on Pennsylvania Avenue? It goes to us. It goes to your grandmother and her Social Security and her Medicare. It goes to veterans in veterans benefits. It goes to Americans. That’s why it’s so hard to cut it. It’s so hard to cut it because it’s us. And people are living on it. Now, there’s a whole other topic that nobody talks about, and it’s called entitlement reform, right? Could we reform Social Security? Could we make the retirement age go from 67 to 68? That would save a lot of money. Could we change the cost of living? Nobody, nobody, nobody is talking about that. And that’s because we are in this crazy, polarized environment where we can no longer have serious conversations about serious issues.
  • Nexsan: Renewals Manager

    Job Title: Renewals Manager
    Location: Remote U.S.
    Department: Customer Success / Sales Operations / Revenue Management
    Reports To: SVP of Sales & Marketing
    Employment Type: Full-time

    Job Summary:
    We are seeking a proactive and strategic Renewals Manager to lead our contract renewal efforts across enterprise and mid-market customers in the data storage industry. In this role, you will be responsible for driving renewals performance, minimizing churn, and ensuring a seamless customer experience at the end of each subscription or support term. You’ll collaborate across sales, customer success, and product teams to ensure high customer retention and long-term contract value in a rapidly evolving storage and data infrastructure environment.

    Key Responsibilities:
    Renewals Strategy & Execution: Own and manage the global renewals process for support contracts, ensuring timely and accurate renewal quotes, proposals, and negotiations.
    Customer Retention: Proactively engage with existing customers to drive successful contract extensions, upselling opportunities (e.g., expanded storage capacity, enhanced support tiers), and maintain high retention and Net Revenue Retention (NRR) rates.
    Forecasting & Reporting: Deliver accurate and data-driven renewal forecasts to executive leadership; track key metrics such as churn, renewal rate, and renewal velocity.
    Process Optimization: Standardize and scale global renewals workflows, leveraging automation tools and CRM systems to improve efficiency and visibility.
    Contract Management: Ensure all renewal terms align with business policies and customer entitlements; manage complex global accounts with multiple storage deployments and support contracts.
    Customer Advocacy: Act as a customer champion during the renewal cycle, ensuring that feedback and technical requirements are communicated internally to help inform roadmap and service improvements.

    Qualifications:
    Bachelor’s degree in Business, Technology, or related field.
    5+ years of experience in a renewals, account management, or customer success role in the data storage, enterprise IT, or SaaS industry.
    Proficient in Microsoft Excel to create and manage client quotes with accuracy and efficiency.
    Attention to detail is critical — this role requires thorough system checks and meticulous review of all work to maintain accuracy and consistency.
    Understanding of storage technologies, support SLAs, and subscription-based business models.
    Strong negotiation, communication, and stakeholder management skills.
    Proficient in CRM and renewals platforms (Salesforce, Gainsight, Clari, CPQ tools). NetSuite knowledge would be helpful.
    Experience managing renewals globally across time zones and regions (e.g., North America, EMEA, APAC).

    Preferred Skills:
    Familiarity with hybrid cloud storage, object storage, and data backup/archive solutions.
    Experience with channel partners or distributor-led renewal programs.
    Ability to analyze complex contracts and identify upsell/cross-sell opportunities.

    What We Offer:
    Competitive compensation and performance bonuses
    Health, dental, vision, and 401(k) or retirement plan
    Remote work environment
    Exposure to cutting-edge data storage technologies
    Inclusive and collaborative company culture

    Join us to ensure our customers never miss a byte. Be a key player in powering the world’s most critical data infrastructures. Apply today.
  • The cruelest cut in the Republican budget bill, explained

    Medicaid may be about to change in a big way: Republicans in Congress are getting closer to passing a bill that, along with cutting taxes and imposing new immigration restrictions, would require people to work — or else risk losing government health benefits. It’s a change the party has long desired.

    Right now, if you qualify for Medicaid, the government health insurance program for low-income people, based on your earnings, assets, and life circumstances, you can receive health coverage through the program — no other questions asked. Nearly 80 million people are currently insured by Medicaid, making it the single largest insurance program in the US. Under the current law, there is no obligation to work or fulfill any other community service requirement in order to receive your Medicaid benefits.

    But now, the GOP’s One Big Beautiful Bill Act, which passed the House by one vote early Thursday morning, would establish nationwide work requirements for the program for the first time, starting at the end of 2026. If the bill becomes law, people who became eligible for the program under the Affordable Care Act — generally, adults without children living in or near poverty — would be required to report at least 80 hours of work every month or another community activity such as volunteering, or they could lose their benefits. Seniors, people with disabilities, caregivers for dependent people, and pregnant people are supposed to be exempted under the bill as currently written, but there’s some gray area here. Some of the decisions about how to implement the requirements will be left to the states: They could, for example, require Medicaid enrollees to report their activities every month or every six months. (Prior research has found that additional reporting requirements tend to lead to more people losing benefits.)

    The Congressional Budget Office estimates that 10.3 million people would lose their Medicaid coverage by 2034 under the GOP bill, about half of them due to the work requirement provision. Other losses would result from a number of smaller provisions, according to the estimate. These include things like more frequent and stringent eligibility checks that will also require people to jump through more hoops to keep their benefits. Outside projections are even higher: The left-leaning Center for Budget Policy and Priorities estimated that as many as 14.4 million people could lose benefits in the next decade.

    In a recent New York Times op-ed, a number of Trump officials, including Health and Human Services Secretary Robert F. Kennedy Jr. and Medicare and Medicaid administrator Mehmet Oz, outlined the Republican rationale for the requirements: They argue that too many able-bodied people are choosing not to work so they can stay on Medicaid, and more stringent requirements will force those people into the workforce and into better-paying jobs that provide their own health insurance. “This is about opportunity,” they wrote. “We believe that work is transformative for the individual who moves from welfare to employment.”

    But based on the best available estimates and past experience, Medicaid work requirements likely won’t achieve what Republicans want. The Senate will now consider the legislation and more changes could still be made, but if these requirements do ultimately become law and take effect, millions of Americans could lose their insurance over the next decade — and not necessarily because they aren’t working.

    Why work requirements will cause a lot of harm without doing much good

    Medicaid was founded in 1965 as an entitlement program: If you qualify by your income and you sign up, you get the benefits. No extra red tape. More than half of the program’s enrollment is estimated to be people over 65 (10 percent), people with disabilities (13 percent), and children (34 percent). Many of the adults covered by Medicaid are either pregnant mothers, who can receive up to a year of postpartum coverage, or parents of young children. For most of the program’s history, adults who were not disabled and did not have kids were not eligible in most states. But in 2010, the ACA changed that, extending eligibility to anyone whose income was 133 percent of the federal poverty level ($20,800 for an individual; $35,000 for a family of three) or lower. The expansion added roughly 20 million people to the rolls.

    Republicans have long argued for work requirements for people who receive benefits across a variety of programs — Medicaid, food stamps, cash assistance — and even successfully instituted them for food stamps in 1996 as part of President Bill Clinton’s welfare reform legislation. After Medicaid was expanded by the ACA and then Republicans failed to repeal the law in 2017, work requirements became one of their top priorities for the program. Medicaid expansion had proven too popular to totally undo, but instituting work requirements would reduce coverage for a group many in the GOP do not want covered by Medicaid at all.

    But our country has already tested this — and it didn’t go so well: During President Donald Trump’s first term, the administration allowed states to apply to experiment with work requirements and approved the first-ever Medicaid work requirements in a handful of states on a preliminary basis. One state, Arkansas, actually implemented the policy before it was blocked by the courts.

    In just a matter of months, 18,000 Arkansans lost their health insurance — most of them losing coverage because they were found ineligible after not reporting their information correctly to the state. After the policy was implemented, respondents were required to report their work activities by paper forms, phone calls, or an online portal each month; that included those who were exempt. But Arkansas was criticized for the arduous reporting process and for failing to clearly explain the change: An analysis from Harvard researchers found that 70 percent of the people who were supposed to satisfy the work requirement were confused about the policy’s specifics and did not know whether it was actually in effect. During one month, according to state data, less than 15 percent of the people who were supposed to report their work activities to the state actually did.

    About 25 percent of the people who were supposed to comply with the new work requirement lost coverage from June 2018 to March 2019 — even though experts estimated 95 percent of the affected population should have been exempted or were meeting the obligations.

    As I have written before, the Medicaid population can be hard to reach: People with lower incomes are more likely to change addresses, more likely to have irregular work schedules, and less likely to have regular internet access. For all of these reasons, these people can struggle to send paperwork by mail or complete an online form. And if they fail to do so when a Medicaid work requirement is in effect, they will lose their health coverage.

    Based on the Arkansas experience, the Center for Budget Policy and Priorities estimated that two out of every three people who could lose coverage under the Republican bill would be people who are working or who should qualify for an exemption — but who fail to submit their paperwork properly or experience another administrative snafu.

    The GOP argues these incentives will lead to more people getting jobs or earning higher wages, but Arkansas’s experience doesn’t inspire much confidence there either. Employment rates stayed flat. The Harvard researchers concluded that the Arkansans who were subjected to the Medicaid work requirement did not see an increase in their employment or their earnings. The policy failed to push people into the workforce — it just pushed them out of the safety net.

    Multiple studies have found that Medicaid is literally a life-saving program. When people lose Medicaid benefits, they are more likely to face medical debt and push off medical care because of the cost. In Arkansas, according to the Harvard analysts, 56 percent of people who lost their coverage because of the work requirement said they delayed a medical service and 64 percent of them said they delayed taking a medication because of its cost.

    That’s the new world that the Republican bill would create, one in which failing to turn in the right paperwork could mean you can no longer afford your blood pressure or diabetes drugs.
  • Let the AI Security War Games Begin

    In February 2024, CNN reported, “A finance worker at a multinational firm was tricked into paying out $25 million to fraudsters using deepfake technology to pose as the company’s chief financial officer in a video conference call.” In Europe, a second firm experienced a multimillion-dollar fraud when a deepfake emulated a board member in a video allegedly approving a fraudulent transfer of funds. “Banks and financial institutions are particularly at risk,” said The Hack Academy. “A study by Deloitte found that over 50% of senior executives expect deepfake scams to target their organizations soon. These attacks can undermine trust and lead to significant financial loss.”

    Hack Academy went on to say that AI-inspired security attacks weren’t confined to deepfakes. These attacks were also beginning to occur with increased regularity in the form of corporate espionage and misinformation campaigns. AI brings new, more dangerous tactics to traditional security attack methods like phishing, social engineering and the insertion of malware into systems. For CIOs, enterprise AI system developers, data scientists and IT network professionals, AI changes the rules and the tactics for security, given AI’s limitless potential for both good and bad. This is forcing a reset in how IT thinks about security against malicious actors and intruders.

    How Bad Actors are Exploiting AI

    What exactly is IT up against? The AI tools that are available on the dark web and in public cyber marketplaces give security perpetrators a wide choice of AI weaponry. Also, IoT and edge networks now present much broader enterprise attack surfaces. Security threats can come in videos, phone calls, social media sites, corporate systems and networks, vendor clouds, IoT devices, network endpoints, and virtually any entry point into a corporate IT environment that electronic communications can penetrate. Here are some of the current AI-embellished security attacks that companies are seeing:

    Convincing deepfake videos of corporate executives and stakeholders that are intended to dupe companies into pursuing certain actions or transferring certain assets or funds. This deep faking also extends to voice simulations of key personnel that are left as voicemails in corporate phone systems.

    Phishing and spear phishing attacks that send convincing emails to employees, who mistakenly open them because they think the sender is their boss, the CEO or someone else they perceive as trusted. AI supercharges these attacks because it can automate and send out a large volume of emails that hit many employee email accounts. That AI continues to “learn” with the help of machine learning so it can discover new trusted sender candidates for future attacks.

    Adaptive messaging that uses generative AI to craft messages to users that correct grammar and that “learn” from corporate communication styles so they can more closely emulate corporate communications that make them seem legitimate.

    Mutating code that uses AI to change malware signatures on the fly so antivirus detection mechanisms can be evaded.

    Data poisoning that occurs when a corporate or cloud provider’s AI data repository is injected with malware that alters the data so it produces erroneous and misleading results.

    Fighting Back With Tech

    To combat these supercharged AI-based security threats, IT has a number of tools, techniques and strategies it can consider.
Fighting phishing and spear phishing. A combination of policy and practice works best to combat phishing and spear phishing attacks. Both types of attacks are predicated on users being tricked into opening an email attachment that they believe is from a trusted sender, so the first line of defense is educating (and repeat-educating) users on how to handle their email. For instance, a user should notify IT if they receive an email that seems unusual or unexpected, and they should never open it. IT should also review its current security tools. Is it still using older security monitoring software that doesn’t include more modern technologies like observability, which can check for security intrusions or malware at more atomic levels? Is IT still using IAM (identity access management) software to track user identities and activities at a top level in the cloud and at top and atomic levels on premises, or has it also added cloud identity entitlements management (CIEM), which gives it an atomic-level view of user accesses and activities in the cloud? Better yet, has IT moved to identity governance and administration (IGA), which can serve as an overarching umbrella for IAM and CIEM plugins, plus provide detailed audit reports and automated compliance across all platforms?
Fighting embedded malware code. Malware can lie dormant in systems for months, giving a bad actor the option to activate it whenever the timing is right. That is all the more reason for IT to augment its security staff with new skill sets, such as that of the “threat hunter,” whose job is to examine networks, data, and systems on a daily basis, hunting down malware that might be lurking within and destroying it before it activates.
Fighting with zero-trust networks. Internet of Things (IoT) devices come into companies with little or no security, because IoT suppliers don’t pay much attention to it and there is a general expectation that corporate IT will configure the devices to the appropriate security settings. The problem is, IT often forgets to do this. There are also times when users purchase their own IoT gear and IT doesn’t know about it. Zero-trust networks help manage this because they detect and report on everything that is added, subtracted, or modified on the network. This gives IT visibility into new, potential security breach points. A second step is to formalize IT procedures for IoT devices so that no IoT device is deployed without its security first being set to corporate standards.
Fighting AI data poisoning. AI models, systems, and data should be continuously monitored for accuracy. As soon as they show lowered levels of accuracy or produce unusual conclusions, the data repository, inflows, and outflows should be examined for quality and freedom from bias. If contamination is found, the system should be taken down, the data sanitized, and the sources of the contamination traced, tracked, and disabled.
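A minimal sketch of that continuous-accuracy-monitoring idea, assuming a nightly evaluation against a trusted holdout set; the window size, drop threshold, and toy scores are illustrative assumptions, not recommendations.

```python
# Illustrative sketch: watch a model's accuracy on a trusted holdout set and
# raise an alert when it degrades, as a trigger for checking the data pipeline
# for poisoning. Window size and threshold are arbitrary assumptions.
from collections import deque

class AccuracyDriftMonitor:
    def __init__(self, window: int = 20, max_drop: float = 0.05):
        self.history = deque(maxlen=window)   # recent accuracy scores
        self.baseline = None                  # accuracy at deployment time
        self.max_drop = max_drop              # tolerated absolute drop

    def record(self, accuracy: float) -> bool:
        """Record one evaluation; return True if the drop warrants investigation."""
        if self.baseline is None:
            self.baseline = accuracy
            return False
        self.history.append(accuracy)
        recent = sum(self.history) / len(self.history)
        return (self.baseline - recent) > self.max_drop

monitor = AccuracyDriftMonitor()
for score in [0.94, 0.93, 0.94, 0.92, 0.85, 0.84, 0.83]:  # toy nightly evals
    if monitor.record(score):
        print(f"accuracy fell to {score:.2f}: inspect data inflows for contamination")
```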
Fighting AI with AI. Nearly every security tool on the market today contains AI functionality to detect anomalies, abnormal data patterns, and unusual user activities. Additionally, forensics AI can dissect a security breach that does occur, isolating how it happened, where it originated, and what caused it. Since most sites don’t have on-staff forensics experts, IT will have to train staff in forensics skills.
Fighting with regular audits and vulnerability testing. At a minimum, IT vulnerability testing should be performed on a quarterly basis, and full security audits on an annual basis. If sites use cloud providers, they should request each provider’s latest security audit for review. An outside auditor can also help sites prepare for future AI-driven security threats, because auditors stay on top of the industry, visit many different companies, and see many different situations. Advance knowledge of the threats that loom ahead helps sites prepare for new battles.
Summary
AI technology is moving faster than legal rulings and regulations. This leaves most IT departments “on their own” to develop security defenses against bad actors who use AI against them. The good news is that IT already has insights into how bad actors intend to use AI, and there are tools on the market that can help defensive efforts. What’s been missing is a proactive and aggressive battle plan from IT. That has to start now.
  • The data center boom in the desert

    In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.  This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.  Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.EMILY NAJERA The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.  But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.”
    Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.  That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe.
    It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.  Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.EMILY NAJERA The build-out of a dense cluster of energy and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils. “Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah.  “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.” Luring data centers In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills. 
    He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner. In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt. On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center. Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.GREGG SEGAL After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space. 
    Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend million to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park. When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part.
“We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there are the generous tax policies. In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities. Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains. The rain shadow The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.  But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average.
    The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.  Along the way, an engineered system of reservoirs, canals, and treatment plants divert, store, and release water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained. 
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.EMILY NAJERA Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.  About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.  It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts.  In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.  “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.  These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says. Thirsty data centers Data centers suck up water in two main ways.
As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration.  Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research.  But here too, the water usage varies depending on the type of geothermal plant in question. Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface.  The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do. Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It’s also forging ahead with a more than billion transmission project.
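For a rough sense of how estimates like these scale, here is a hedged back-of-envelope sketch. The six-gigawatt figure is the capacity requested in NV Energy’s filings; the capacity factor and the per-megawatt-hour water intensities are illustrative assumptions only, not inputs from Ren’s analysis or the utility. Varying those assumptions is what produces ranges like the ones quoted above.

```python
# Hedged back-of-envelope: how direct and indirect water use scale with the
# ~6 GW of capacity requested from NV Energy. Every figure below except that
# capacity is an illustrative assumption, not a number from the study.
REQUESTED_CAPACITY_MW = 6_000   # capacity requested in NV Energy filings
CAPACITY_FACTOR = 0.8           # assumed average utilization
HOURS_PER_YEAR = 8_760

annual_mwh = REQUESTED_CAPACITY_MW * CAPACITY_FACTOR * HOURS_PER_YEAR

# Assumed water intensities, in gallons consumed per MWh of electricity used.
direct_cooling = {"air-cooled data centers": 5, "evaporatively cooled data centers": 130}
grid_consumption = 370          # assumed average for the supplying power mix

for label, gal_per_mwh in direct_cooling.items():
    print(f"direct, {label}: {annual_mwh * gal_per_mwh / 1e9:.2f} billion gal/yr")
print(f"indirect, at the power plants: {annual_mwh * grid_consumption / 1e9:.1f} billion gal/yr")
```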
But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume. NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.EMILY NAJERA “NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement. An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources.  “You end up with the water-intensive resources looking more important,” she adds. Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.” Securing supplies On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center.  “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction.  We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.  Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.” The reservoir within the industrial business park provides water to data centers and other tenants.EMILY NAJERA But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.  Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. 
Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.  But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company. As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.  Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise. Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.  When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.” Water As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish. The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services. “It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. 
It’s part of our spiritual well-being.” Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake.EMILY NAJERA In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.  More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flowed into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.  Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline from [the reclamation facility] to our system is good for water quality in the river,” she says. “I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.” Water efficiency In an email, Thompson added that he has “great respect and admiration” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.” During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas.  “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.” Gilman also stressed that data-center companies have gotten more water-efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.” An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center.GOOGLE Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy.
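The withdrawal-versus-consumption distinction in those Google figures is easy to mix up; this small worked example, using the 1.9 million gallons withdrawn and 200,000 gallons consumed reported above, simply shows how the two relate.

```python
# Worked example of the withdrawal-vs.-consumption distinction, using the
# Storey County figures reported above.
withdrawn_gal = 1_900_000   # pulled from the water system in 2023
consumed_gal = 200_000      # evaporated or otherwise not returned

returned_gal = withdrawn_gal - consumed_gal
print(f"returned to the system: {returned_gal:,} gal "
      f"({returned_gal / withdrawn_gal:.0%} of withdrawals)")
print(f"consumed: {consumed_gal / withdrawn_gal:.0%} of withdrawals")
```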
The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.  Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region.  “While the land purchase is public knowledge, we have not disclosed specific details about our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.  Workers have begun grading land inside a fenced-off lot within the Tahoe Reno Industrial Center.EMILY NAJERA Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling. Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.  But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.  Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.  Coming conflicts  The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas produced through their power use simply because it occurs outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.
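As a side note on units, the acre-foot figures in this story convert to gallons with the standard factor of 325,851 gallons per acre-foot. The sketch below checks Tract’s roughly 1,100 acre-feet against the “nearly 360 million gallons” quoted above, and the roughly 12,300 acre-feet obtained by summing the general improvement district’s permit figures listed earlier.

```python
# Unit conversion behind the acre-foot figures in this story.
GALLONS_PER_ACRE_FOOT = 325_851   # 43,560 cubic feet x 7.48 gallons per cubic foot

def acre_feet_to_gallons(acre_feet: float) -> float:
    return acre_feet * GALLONS_PER_ACRE_FOOT

print(f"1,100 acre-feet  ~ {acre_feet_to_gallons(1_100) / 1e6:.0f} million gallons per year")
print(f"12,300 acre-feet ~ {acre_feet_to_gallons(12_300) / 1e9:.1f} billion gallons per year")
```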
Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least-energy-efficient data centers.  Pipes running along Google’s data center campus help the search company cool its servers.GOOGLE Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term. “If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?” The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center. Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.  Open for business As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center.  Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings. Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA “We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.” During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project, on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.” Where the river ends In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters.  The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate.
That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name. A lone angler stands along the shores of Pyramid Lake. In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely. “We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.” In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert. Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect their rights and preserve the lake, extending what he refers to as “a century of water wars.” “We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us. And that’s why we fight so hard to hold on to what’s left.”
    The data center boom in the desert
    www.technologyreview.com
    In the high desert east of Reno, Nevada, construction crews are flattening the golden foothills of the Virginia Range, laying the foundations of a data center city. Google, Tract, Switch, EdgeCore, Novva, Vantage, and PowerHouse are all operating, building, or expanding huge facilities within the Tahoe Reno Industrial Center, a business park bigger than the city of Detroit.  This story is a part of MIT Technology Review’s series “Power Hungry: AI and our energy future,” on the energy demands and carbon costs of the artificial-intelligence revolution. Meanwhile, Microsoft acquired more than 225 acres of undeveloped property within the center and an even larger plot in nearby Silver Springs, Nevada. Apple is expanding its data center, located just across the Truckee River from the industrial park. OpenAI has said it’s considering building a data center in Nevada as well. The corporate race to amass computing resources to train and run artificial intelligence models and store information in the cloud has sparked a data center boom in the desert—just far enough away from Nevada’s communities to elude wide notice and, some fear, adequate scrutiny.  Switch, a data center company based in Las Vegas, says the full build-out of its campus at the Tahoe Reno Industrial Center could exceed seven million square feet.EMILY NAJERA The full scale and potential environmental impacts of the developments aren’t known, because the footprint, energy needs, and water requirements are often closely guarded corporate secrets. Most of the companies didn’t respond to inquiries from MIT Technology Review, or declined to provide additional information about the projects.  But there’s “a whole lot of construction going on,” says Kris Thompson, who served as the longtime project manager for the industrial center before stepping down late last year. “The last number I heard was 13 million square feet under construction right now, which is massive.” Indeed, it’s the equivalent of almost five Empire State Buildings laid out flat. In addition, public filings from NV Energy, the state’s near-monopoly utility, reveal that a dozen data-center projects, mostly in this area, have requested nearly six gigawatts of electricity capacity within the next decade.  That would make the greater Reno area—the biggest little city in the world—one of the largest data-center markets around the globe. It would also require expanding the state’s power sector by about 40%, all for a single industry in an explosive growth stage that may, or may not, prove sustainable. The energy needs, in turn, suggest those projects could consume billions of gallons of water per year, according to an analysis conducted for this story.  Construction crews are busy building data centers throughout the Tahoe Reno Industrial Center.EMILY NAJERA The build-out of a dense cluster of energy and water-hungry data centers in a small stretch of the nation’s driest state, where climate change is driving up temperatures faster than anywhere else in the country, has begun to raise alarms among water experts, environmental groups, and residents. That includes members of the Pyramid Lake Paiute Tribe, whose namesake water body lies within their reservation and marks the end point of the Truckee River, the region’s main source of water. 
Much of Nevada has suffered through severe drought conditions for years, farmers and communities are drawing down many of the state’s groundwater reservoirs faster than they can be refilled, and global warming is sucking more and more moisture out of the region’s streams, shrubs, and soils. “Telling entities that they can come in and stick more straws in the ground for data centers is raising a lot of questions about sound management,” says Kyle Roerink, executive director of the Great Basin Water Network, a nonprofit that works to protect water resources throughout Nevada and Utah.  “We just don’t want to be in a situation where the tail is wagging the dog,” he later added, “where this demand for data centers is driving water policy.” Luring data centers In the late 1850s, the mountains southeast of Reno began enticing prospectors from across the country, who hoped to strike silver or gold in the famed Comstock Lode. But Storey County had few residents or economic prospects by the late 1990s, around the time when Don Roger Norman, a media-shy real estate speculator, spotted a new opportunity in the sagebrush-covered hills.  He began buying up tens of thousands of acres of land for tens of millions of dollars and lining up development approvals to lure industrial projects to what became the Tahoe Reno Industrial Center. His partners included Lance Gilman, a cowboy-hat-wearing real estate broker, who later bought the nearby Mustang Ranch brothel and won a seat as a county commissioner. In 1999, the county passed an ordinance that preapproves companies to develop most types of commercial and industrial projects across the business park, cutting months to years off the development process. That helped cinch deals with a flock of tenants looking to build big projects fast, including Walmart, Tesla, and Redwood Materials. Now the promise of fast permits is helping to draw data centers by the gigawatt. On a clear, cool January afternoon, Brian Armon, a commercial real estate broker who leads the industrial practices group at NAI Alliance, takes me on a tour of the projects around the region, which mostly entails driving around the business center. Lance Gilman, a local real estate broker, helped to develop the Tahoe Reno Industrial Center and land some of its largest tenants.GREGG SEGAL After pulling off Interstate 80 onto USA Parkway, he points out the cranes, earthmovers, and riprap foundations, where a variety of data centers are under construction. Deeper into the industrial park, Armon pulls up near Switch’s long, low, arched-roof facility, which sits on a terrace above cement walls and security gates. The Las Vegas–based company says the first phase of its data center campus encompasses more than a million square feet, and that the full build-out will cover seven times that space.  Over the next hill, we turn around in Google’s parking lot. Cranes, tents, framing, and construction equipment extend behind the company’s existing data center, filling much of the 1,210-acre lot that the search engine giant acquired in 2017. Last August, during an event at the University of Nevada, Reno, the company announced it would spend $400 million to expand the data center campus along with another one in Las Vegas. Thompson says that the development company, Tahoe Reno Industrial LLC, has now sold off every parcel of developable land within the park (although several lots are available for resale following the failed gamble of one crypto tenant). 
When I ask Armon what’s attracting all the data centers here, he starts with the fast approvals but cites a list of other lures as well: The inexpensive land. NV Energy’s willingness to strike deals to supply relatively low-cost electricity. Cool nighttime and winter temperatures, as far as American deserts go, which reduce the energy and water needs. The proximity to tech hubs such as Silicon Valley, which cuts latency for applications in which milliseconds matter. And the lack of natural disasters that could shut down the facilities, at least for the most part. “We are high in seismic activity,” he says. “But everything else is good. We’re not going to have a tornado or flood or a devastating wildfire.” Then there’s the generous tax policies.In 2023, Novva, a Utah-based data center company, announced plans to build a 300,000-square-foot facility within the industrial business park. Nevada doesn’t charge corporate income tax, and it has also enacted deep tax cuts specifically for data centers that set up shop in the state. That includes abatements of up to 75% on property tax for a decade or two—and nearly as much of a bargain on the sales and use taxes applied to equipment purchased for the facilities. Data centers don’t require many permanent workers to run the operations, but the projects have created thousands of construction jobs. They’re also helping to diversify the region’s economy beyond casinos and generating tax windfalls for the state, counties, and cities, says Jeff Sutich, executive director of the Northern Nevada Development Authority. Indeed, just three data-center projects, developed by Apple, Google, and Vantage, will produce nearly half a billion dollars in tax revenue for Nevada, even with those generous abatements, according to the Nevada Governor’s Office of Economic Development. The question is whether the benefits of data centers are worth the tradeoffs for Nevadans, given the public health costs, greenhouse-gas emissions, energy demands, and water strains. The rain shadow The Sierra Nevada’s granite peaks trace the eastern edge of California, forcing Pacific Ocean winds to rise and cool. That converts water vapor in the air into the rain and snow that fill the range’s tributaries, rivers, and lakes.  But the same meteorological phenomenon casts a rain shadow over much of neighboring Nevada, forming an arid expanse known as the Great Basin Desert. The state receives about 10 inches of precipitation a year, about a third of the national average. The Truckee River draws from the melting Sierra snowpack at the edge of Lake Tahoe, cascades down the range, and snakes through the flatlands of Reno and Sparks. It forks at the Derby Dam, a Reclamation Act project a few miles from the Tahoe Reno Industrial Center, which diverts water to a farming region further east while allowing the rest to continue north toward Pyramid Lake.  Along the way, an engineered system of reservoirs, canals, and treatment plants divert, store, and release water from the river, supplying businesses, cities, towns, and native tribes across the region. But Nevada’s population and economy are expanding, creating more demands on these resources even as they become more constrained.  
The Truckee River, which originates at Lake Tahoe and terminates at Pyramid Lake, is the major water source for cities, towns, and farms across northwestern Nevada.EMILY NAJERA Throughout much of the 2020s the state has suffered through one of the hottest and most widespread droughts on record, extending two decades of abnormally dry conditions across the American West. Some scientists fear it may constitute an emerging megadrought.  About 50% of Nevada currently faces moderate to exceptional drought conditions. In addition, more than half of the state’s hundreds of groundwater basins are already “over-appropriated,” meaning the water rights on paper exceed the levels believed to be underground.  It’s not clear if climate change will increase or decrease the state’s rainfall levels, on balance. But precipitation patterns are expected to become more erratic, whiplashing between short periods of intense rainfall and more-frequent, extended, or severe droughts.  In addition, more precipitation will fall as rain rather than snow, shortening the Sierra snow season by weeks to months over the coming decades.  “In the extreme case, at the end of the century, that’s pretty much all of winter,” says Sean McKenna, executive director of hydrologic sciences at the Desert Research Institute, a research division of the Nevada System of Higher Education. That loss will undermine an essential function of the Sierra snowpack: reliably delivering water to farmers and cities when it’s most needed in the spring and summer, across both Nevada and California.  These shifting conditions will require the region to develop better ways to store, preserve, and recycle the water it does get, McKenna says. Northern Nevada’s cities, towns, and agencies will also need to carefully evaluate and plan for the collective impacts of continuing growth and development on the interconnected water system, particularly when it comes to water-hungry projects like data centers, he adds. “We can’t consider each of these as a one-off, without considering that there may be tens or dozens of these in the next 15 years,” McKenna says.Thirsty data centers Data centers suck up water in two main ways. As giant rooms of server racks process information and consume energy, they generate heat that must be shunted away to prevent malfunctions and damage to the equipment. The processing units optimized for training and running AI models often draw more electricity and, in turn, produce more heat. To keep things cool, more and more data centers have turned to liquid cooling systems that don’t need as much electricity as fan cooling or air-conditioning. These often rely on water to absorb heat and transfer it to outdoor cooling towers, where much of the moisture evaporates. Microsoft’s US data centers, for instance, could have directly evaporated nearly 185,000 gallons of “clean freshwater” in the course of training OpenAI’s GPT-3 large language model, according to a 2023 preprint study led by researchers at the University of California, Riverside. (The research has since been peer-reviewed and is awaiting publication.) What’s less appreciated, however, is that the larger data-center drain on water generally occurs indirectly, at the power plants generating extra electricity for the turbocharged AI sector. These facilities, in turn, require more water to cool down equipment, among other purposes. 
You have to add up both uses “to reflect the true water cost of data centers,” says Shaolei Ren, an associate professor of electrical and computer engineering at UC Riverside and coauthor of the study. Ren estimates that the 12 data-center projects listed in NV Energy’s report would directly consume between 860 million gallons and 5.7 billion gallons a year, based on the requested electricity capacity. (“Consumed” here means the water is evaporated, not merely withdrawn and returned to the engineered water system.) The indirect water drain associated with electricity generation for those operations could add up to 15.5 billion gallons, based on the average consumption of the regional grid. The exact water figures would depend on shifting climate conditions, the type of cooling systems each data center uses, and the mix of power sources that supply the facilities. Solar power, which provides roughly a quarter of Nevada’s power, requires relatively little water to operate, for instance. But natural-gas plants, which generate about 56%, withdraw 2,803 gallons per megawatt-hour on average, according to the Energy Information Administration.  Geothermal plants, which produce about 10% of the state’s electricity by cycling water through hot rocks, generally consume less water than fossil fuel plants do but often require more water than other renewables, according to some research.  But here too, the water usage varies depending on the type of geothermal plant in question. Google has lined up several deals to partially power its data centers through Fervo Energy, which has helped to commercialize an emerging approach that injects water under high pressure to fracture rock and form wells deep below the surface.  The company stresses that it doesn’t evaporate water for cooling and that it relies on brackish groundwater, not fresh water, to develop and run its plants. In a recent post, Fervo noted that its facilities consume significantly less water per megawatt-hour than coal, nuclear, or natural-gas plants do. Part of NV Energy’s proposed plan to meet growing electricity demands in Nevada includes developing several natural-gas peaking units, adding more than one gigawatt of solar power and installing another gigawatt of battery storage. It's also forging ahead with a more than $4 billion transmission project. But the company didn’t respond to questions concerning how it will supply all of the gigawatts of additional electricity requested by data centers, if the construction of those power plants will increase consumer rates, or how much water those facilities are expected to consume. NV Energy operates a transmission line, substation, and power plant in or around the Tahoe Reno Industrial Center.EMILY NAJERA “NV Energy teams work diligently on our long-term planning to make investments in our infrastructure to serve new customers and the continued growth in the state without putting existing customers at risk,” the company said in a statement. An added challenge is that data centers need to run around the clock. That will often compel utilities to develop new electricity-generating sources that can run nonstop as well, as natural-gas, geothermal, or nuclear plants do, says Emily Grubert, an associate professor of sustainable energy policy at the University of Notre Dame, who has studied the relative water consumption of electricity sources.  “You end up with the water-intensive resources looking more important,” she adds. 
Even if NV Energy and the companies developing data centers do strive to power them through sources with relatively low water needs, “we only have so much ability to add six gigawatts to Nevada’s grid,” Grubert explains. “What you do will never be system-neutral, because it’s such a big number.” Securing supplies On a mid-February morning, I meet TRI’s Thompson and Don Gilman, Lance Gilman’s son, at the Storey County offices, located within the industrial center.  “I’m just a country boy who sells dirt,” Gilman, also a real estate broker, says by way of introduction.  We climb into his large SUV and drive to a reservoir in the heart of the industrial park, filled nearly to the lip.  Thompson explains that much of the water comes from an on-site treatment facility that filters waste fluids from companies in the park. In addition, tens of millions of gallons of treated effluent will also likely flow into the tank this year from the Truckee Meadows Water Authority Reclamation Facility, near the border of Reno and Sparks. That’s thanks to a 16-mile pipeline that the developers, the water authority, several tenants, and various local cities and agencies partnered to build, through a project that began in 2021. “Our general improvement district is furnishing that water to tech companies here in the park as we speak,” Thompson says. “That helps preserve the precious groundwater, so that is an environmental feather in the cap for these data centers. They are focused on environmental excellence.” The reservoir within the industrial business park provides water to data centers and other tenants.EMILY NAJERA But data centers often need drinking-quality water—not wastewater merely treated to irrigation standards—for evaporative cooling, “to avoid pipe clogs and/or bacterial growth,” the UC Riverside study notes. For instance, Google says its data centers withdrew about 7.7 billion gallons of water in 2023, and nearly 6 billion of those gallons were potable.  Tenants in the industrial park can potentially obtain access to water from the ground and the Truckee River, as well. From early on, the master developers worked hard to secure permits to water sources, since they are nearly as precious as development entitlements to companies hoping to build projects in the desert. Initially, the development company controlled a private business, the TRI Water and Sewer Company, that provided those services to the business park’s tenants, according to public documents. The company set up wells, a water tank, distribution lines, and a sewer disposal system.  But in 2000, the board of county commissioners established a general improvement district, a legal mechanism for providing municipal services in certain parts of the state, to manage electricity and then water within the center. It, in turn, hired TRI Water and Sewer as the operating company. As of its 2020 service plan, the general improvement district held permits for nearly 5,300 acre-feet of groundwater, “which can be pumped from well fields within the service area and used for new growth as it occurs.” The document lists another 2,000 acre-feet per year available from the on-site treatment facility, 1,000 from the Truckee River, and 4,000 more from the effluent pipeline.  Those figures haven’t budged much since, according to Shari Whalen, general manager of the TRI General Improvement District. All told, they add up to more than 4 billion gallons of water per year for all the needs of the industrial park and the tenants there, data centers and otherwise. 
Whalen says that the amount and quality of water required for any given data center depends on its design, and that those matters are worked out on a case-by-case basis.  When asked if the general improvement district is confident that it has adequate water resources to supply the needs of all the data centers under development, as well as other tenants at the industrial center, she says: “They can’t just show up and build unless they have water resources designated for their projects. We wouldn’t approve a project if it didn’t have those water resources.” Water As the region’s water sources have grown more constrained, lining up supplies has become an increasingly high-stakes and controversial business. More than a century ago, the US federal government filed a lawsuit against an assortment of parties pulling water from the Truckee River. The suit would eventually establish that the Pyramid Lake Paiute Tribe’s legal rights to water for irrigation superseded other claims. But the tribe has been fighting to protect those rights and increase flows from the river ever since, arguing that increasing strains on the watershed from upstream cities and businesses threaten to draw away water reserved for reservation farming, decrease lake levels, and harm native fish. The Pyramid Lake Paiute Tribe considers the water body and its fish, including the endangered cui-ui and threatened Lahontan cutthroat trout, to be essential parts of its culture, identity, and way of life. The tribe was originally named Cui-ui Ticutta, which translates to cui-ui eaters. The lake continues to provide sustenance as well as business for the tribe and its members, a number of whom operate boat charters and fishing guide services. “It’s completely tied into us as a people,” says Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe. “That is what has sustained us all this time,” he adds. “It’s just who we are. It’s part of our spiritual well-being.” Steven Wadsworth, chairman of the Pyramid Lake Paiute Tribe, fears that data centers will divert water that would otherwise reach the tribe’s namesake lake.EMILY NAJERA In recent decades, the tribe has sued the Nevada State Engineer, Washoe County, the federal government, and others for overallocating water rights and endangering the lake’s fish. It also protested the TRI General Improvement District’s applications to draw thousands of additional acre‑feet of groundwater from a basin near the business park. In 2019, the State Engineer’s office rejected those requests, concluding that the basin was already fully appropriated.  More recently, the tribe took issue with the plan to build the pipeline and divert effluent that would have flown into the Truckee, securing an agreement that required the Truckee Meadows Water Authority and other parties to add back several thousand acre‑feet of water to the river.  Whalen says she’s sensitive to Wadsworth’s concerns. But she says that the pipeline promises to keep a growing amount of treated wastewater out of the river, where it could otherwise contribute to rising salt levels in the lake. “I think that the pipeline from [the Truckee Meadows Water Authority] to our system is good for water quality in the river,” she says. 
“I understand philosophically the concerns about data centers, but the general improvement district is dedicated to working with everyone on the river for regional water-resource planning—and the tribe is no exception.” Water efficiency  In an email, Thompson added that he has “great respect and admiration,” for the tribe and has visited the reservation several times in an effort to help bring industrial or commercial development there. He stressed that all of the business park’s groundwater was “validated by the State Water Engineer,” and that the rights to surface water and effluent were purchased “for fair market value.”During the earlier interview at the industrial center, he and Gilman had both expressed confidence that tenants in the park have adequate water supplies, and that the businesses won’t draw water away from other areas.  “We’re in our own aquifer, our own water basin here,” Thompson said. “You put a straw in the ground here, you’re not going to pull water from Fernley or from Reno or from Silver Springs.” Gilman also stressed that data-center companies have gotten more water efficient in recent years, echoing a point others made as well. “With the newer technology, it’s not much of a worry,” says Sutich, of the Northern Nevada Development Authority. “The technology has come a long way in the last 10 years, which is really giving these guys the opportunity to be good stewards of water usage.” An aerial view of the cooling tower fans at Google’s data center in the Tahoe Reno Industrial Center.GOOGLE Indeed, Google’s existing Storey County facility is air-cooled, according to the company’s latest environmental report. The data center withdrew 1.9 million gallons in 2023 but only consumed 200,000 gallons. The rest cycles back into the water system. Google said all the data centers under construction on its campus will also “utilize air-cooling technology.” The company didn’t respond to a question about the scale of its planned expansion in the Tahoe Reno Industrial Center, and referred a question about indirect water consumption to NV Energy. The search giant has stressed that it strives to be water efficient across all of its data centers, and decides whether to use air or liquid cooling based on local supply and projected demand, among other variables. Four years ago, the company set a goal of replenishing more water than it consumes by 2030. Locally, it also committed to provide half a million dollars to the National Forest Foundation to improve the Truckee River watershed and reduce wildfire risks.  Microsoft clearly suggested in earlier news reports that the Silver Springs land it purchased around the end of 2022 would be used for a data center. NAI Alliance’s market real estate report identifies that lot, as well as the parcel Microsoft purchased within the Tahoe Reno Industrial Center, as data center sites. But the company now declines to specify what it intends to build in the region.  “While the land purchase is public knowledge, we have not disclosed specific details [of] our plans for the land or potential development timelines,” wrote Donna Whitehead, a Microsoft spokesperson, in an email.  Workers have begun grading land inside a fenced off lot within the Tahoe Reno Industrial Center.EMILY NAJERA Microsoft has also scaled down its global data-center ambitions, backing away from several projects in recent months amid shifting economic conditions, according to various reports. 
Whatever it ultimately does or doesn’t build, the company stresses that it has made strides to reduce water consumption in its facilities. Late last year, the company announced that it’s using “chip-level cooling solutions” in data centers, which continually circulate water between the servers and chillers through a closed loop that the company claims doesn’t lose any water to evaporation. It says the design requires only a “nominal increase” in energy compared to its data centers that rely on evaporative water cooling. Others seem to be taking a similar approach. EdgeCore also said its 900,000-square-foot data center at the Tahoe Reno Industrial Center will rely on an “air-cooled closed-loop chiller” that doesn’t require water evaporation for cooling.

But some of the companies seem to have taken steps to ensure access to significant amounts of water. Switch, for instance, took a lead role in developing the effluent pipeline. In addition, Tract, which develops campuses on which third-party data centers can build their own facilities, has said it lined up more than 1,100 acre-feet of water rights, the equivalent of nearly 360 million gallons a year.

Apple, Novva, Switch, Tract, and Vantage didn’t respond to inquiries from MIT Technology Review.

Coming conflicts

The suggestion that companies aren’t straining water supplies when they adopt air cooling is, in many cases, akin to saying they’re not responsible for the greenhouse gas emissions produced through their power use simply because they occur outside of their facilities. In fact, the additional water used at a power plant to meet the increased electricity needs of air cooling may exceed any gains at the data center, Ren, of UC Riverside, says. “That’s actually very likely, because it uses a lot more energy,” he adds. That means that some of the companies developing data centers in and around Storey County may simply hand off their water challenges to other parts of Nevada or neighboring states across the drying American West, depending on where and how the power is generated, Ren says.

Google has said its air-cooled facilities require about 10% more electricity, and its environmental report notes that the Storey County facility is one of its two least energy-efficient data centers.

Pipes running along Google’s data center campus help the search company cool its servers. GOOGLE

Some fear there’s also a growing mismatch between what Nevada’s water permits allow, what’s actually in the ground, and what nature will provide as climate conditions shift. Notably, the groundwater committed to all parties from the Tracy Segment basin—a long-fought-over resource that partially supplies the TRI General Improvement District—already exceeds the “perennial yield.” That refers to the maximum amount that can be drawn out every year without depleting the reservoir over the long term. “If pumping does ultimately exceed the available supply, that means there will be conflict among users,” Roerink, of the Great Basin Water Network, said in an email. “So I have to wonder: Who could be suing whom? Who could be buying out whom? How will the tribe’s rights be defended?”

The Truckee Meadows Water Authority, the community-owned utility that manages the water system for Reno and Sparks, said it is planning carefully for the future and remains confident there will be “sufficient resources for decades to come,” at least within its territory east of the industrial center.
Storey County, the Truckee-Carson Irrigation District, and the State Engineer’s office didn’t respond to questions or accept interview requests.

Open for business

As data center proposals have begun shifting into Northern Nevada’s cities, more local residents and organizations have begun to take notice and express concerns. The regional division of the Sierra Club, for instance, recently sought to overturn the approval of Reno’s first data center, about 20 miles west of the Tahoe Reno Industrial Center. Olivia Tanager, director of the Sierra Club’s Toiyabe Chapter, says the environmental organization was shocked by the projected electricity demands from data centers highlighted in NV Energy’s filings.

Nevada’s wild horses are a common sight along USA Parkway, the highway cutting through the industrial business park. EMILY NAJERA

“We have increasing interest in understanding the impact that data centers will have to our climate goals, to our grid as a whole, and certainly to our water resources,” she says. “The demands are extraordinary, and we don’t have that amount of water to toy around with.”

During a city hall hearing in January that stretched late into the evening, she and a line of residents raised concerns about the water, energy, climate, and employment impacts of AI data centers. At the end, though, the city council upheld the planning department’s approval of the project on a 5-2 vote. “Welcome to Reno,” Kathleen Taylor, Reno’s vice mayor, said before casting her vote. “We’re open for business.”

Where the river ends

In late March, I walk alongside Chairman Wadsworth, of the Pyramid Lake Paiute Tribe, on the shores of Pyramid Lake, watching a row of fly-fishers in waders cast their lines into the cold waters. The lake is the largest remnant of Lake Lahontan, an Ice Age inland sea that once stretched across western Nevada and would have submerged present-day Reno. But as the climate warmed, the lapping waters retreated, etching erosional terraces into the mountainsides and exposing tufa deposits around the lake, large formations of porous rock made of calcium carbonate. That includes the pyramid-shaped island on the eastern shore that inspired the lake’s name.

A lone angler stands along the shores of Pyramid Lake.

In the decades after the US Reclamation Service completed the Derby Dam in 1905, Pyramid Lake declined another 80 feet and nearby Winnemucca Lake dried up entirely. “We know what happens when water use goes unchecked,” says Wadsworth, gesturing eastward toward the range across the lake, where Winnemucca once filled the next basin over. “Because all we have to do is look over there and see a dry, barren lake bed that used to be full.”

In an earlier interview, Wadsworth acknowledged that the world needs data centers. But he argued they should be spread out across the country, not densely clustered in the middle of the Nevada desert. Given the fierce competition for resources up to now, he can’t imagine how there could be enough water to meet the demands of data centers, expanding cities, and other growing businesses without straining the limited local supplies that should, by his accounting, flow to Pyramid Lake. He fears these growing pressures will force the tribe to wage new legal battles to protect its rights and preserve the lake, extending what he refers to as “a century of water wars.”

“We have seen the devastating effects of what happens when you mess with Mother Nature,” Wadsworth says. “Part of our spirit has left us.
And that’s why we fight so hard to hold on to what’s left.”
  • Bunq CEO warns closed minds are pushing Dutch entrepreneurs away

    Ali Niknam has built Dutch fintech Bunq into one of Europe’s biggest neobanks. But he fears the Netherlands is now driving entrepreneurs away.
    The Bunq founder and CEO is alarmed by the country’s business mindset. He believes risk-aversion, growing insularity, and hostility to ambition are pushing talent overseas.
    “Many of the best entrepreneurs I know have either left or are considering leaving,” Niknam tells TNW.
    Surveys back him up. A poll last year found that almost one in five Dutch entrepreneurs were considering relocating — up from nearly one in eight in 2023.
    Another study found that 24% of large companies were contemplating moves abroad — nearly double the share from the year before.
    Tech scaleups are also mulling exits. One of the country’s biggest — software unicorn Bird — recently announced plans to shift operations out of the country. The company’s CEO blamed “over-regulation” and a bad climate for tech businesses.
    Niknam — who’s set to speak at TNW Conference on June 20 in Amsterdam — has his own critiques of the Dutch business landscape. He calls its support for entrepreneurship “among the worst” he’s seen. Yet he still has deep faith in the country’s talent pool. 
    “There are very few countries I know that have such amazing, creative, smart people as the Dutch,” he says. 
    Those people have been integral to Bunq’s rapid growth.
    Building bridges at Bunq
    Niknam’s idea for Bunq emerged in the wake of the 2008 financial crisis. One of the causes, he believes, was groupthink at incumbent banks. He founded Bunq in 2012 to create an alternative. 
    To create a new approach to banking, Niknam sought to embrace diverse ideas. He points to the company’s approach to proposals, which can be pitched anonymously — even to Niknam himself.
    “On the one hand, that’s better for the company — the best ideas win. And on the other hand, it makes it more fair, because all that counts is the quality of your idea, not who you know, where you’re brought up, or what school you attended.”
    The strategy delivered rapid results. In 2015, Bunq became the first Dutch company in 35 years to obtain a greenfield banking license. It then grew into Europe’s second-largest neobank after Revolut — and one of the few to achieve profitability. The company now boasts over 17 million users with more than €8bn in deposits.
    Outside Bunq, however, Niknam sees a country that’s becoming more closed. He believes the Netherlands is abandoning its internationalist roots, which is damaging its tech ecosystem and chasing talent away. 
    “Historically, the Netherlands has been very entrepreneurial, very international… when this country retreats and closes the doors is when things start to get worse.”
    Even the country’s vast pension funds, he notes, avoid backing Dutch startups. “They know the returns are going to be less,” he says. “Why are the returns less? Because it’s a small country, and it is retreating and starting to focus within its own borders.”
    He contrasts the mood with developments in the Baltics. The region’s tech ecosystem has attracted admiring glances for its optimism, openness, and rapid growth. Niknam feels that many people in the Netherlands take their rights for granted.
    “It’s maybe a little bit of an entitlement disease — that we have forgotten that all these wonderful things that we enjoy today, somebody worked for them really, really hard,” he says.
    Born in Canada to Iranian parents and with homes in the Netherlands and the US, Niknam has diverse cultural experiences. Image: OLSjopera
    Niknam feels the Netherlands has become too risk-averse and inward-looking. Despite the liberal stereotype, Dutch society can be surprisingly conservative. 
    That caution, Niknam says, is embedded in the culture — even in local proverbs. One goes: “Steek je kop niet boven het maaiveld uit.” Loosely translated: “Don’t stick your head above the mowing line.” If you do, it might get chopped off.
    In Niknam’s eyes, that mentality thwarts ambitious entrepreneurs.
    “Success is not only not celebrated, but you’re almost faulted for being successful,” he says. 
    The international return
    The Netherlands is also losing its appeal to international talent. Over nine in ten expats and migrant workers no longer even consider coming to work in the country, according to research from last year.
    Tech firms have raised major concerns over losing access to global talent. The chip equipment maker ASML — the largest company in the Netherlands — has threatened to move abroad because of the country’s hardening stance on migrants.
    Peter Wennink, ASML’s former CEO — who will also speak at TNW Conference — recently warned against losing access to skilled workers. “If we cannot get those people here, we will get those people in Eastern Europe or in Asia or in the United States,” he said.
    Still, Niknam believes the tide can turn. For change to come, he believes the “silent majority” — those who value openness and diversity — must speak up. 
    Despite its problems, Niknam remains upbeat about the future for tech businesses in the Netherlands.
    “The people are great. The schooling is great. The infrastructure is great,” he says. “It is simply changing the attitude and mindset — which can happen in a relatively short amount of time — that will make all the difference.”
    If you want to catch the talks by Niknam and Wennink — or anything else on the agenda for TNW Conference — we have a special offer for you. Use the code TNWXMEDIA2025 at the checkout to get 30% off your ticket.

    Story by

    Thomas Macaulay

    Managing editor

    Thomas is the managing editor of TNW. He leads our coverage of European tech and oversees our talented team of writers. Away from work, he enjoys playing chess (badly) and the guitar (even worse).

  • 3 Ways To Uproot Entitlement From Your Relationship, By A Psychologist

    Caring should be voluntary, not owed. If you find yourself keeping score, these three mindset shifts will help you step into a healthier love. getty
    In the process of choosing a partner or deciding if what you’re getting in a relationship is truly what you deserve, it’s natural to reflect on your needs and expectations. Knowing your worth helps you recognize red flags and avoid “settling for less.”

    But there’s a fine line between having healthy standards and feeling entitled to someone else’s time, energy or emotional labor without considering their perspective or capacity.

    To be clear, wanting a partner who respects you and makes you feel valued is not entitlement.

    The trouble begins when those wants quietly shift into assumptions, like believing that because you’re a good partner, the other person should act a certain way, or that effort should always be equal and immediate.

    Relationships aren’t transactional. Sometimes, feeling that you’re owed something just because of what you bring to the table can block genuine connection and growth.
    This is exactly what “entitlement” in relationships can look like. It is often disguised as fairness or reciprocity and may not seem harmful at first. But over time, it creates a dynamic where love and care become conditions rather than choices.
    You might find yourself thinking, “If I’m doing all this, why aren’t they matching it?” or “They should know how I feel without me having to say it.” It’s important to recognize entitlement in your relationships, whether it’s something you’re experiencing or noticing in your partner.
    Here are three ways you can actively work to disrupt this pattern in your relationship.
    1. Make Space For Both Of Your Needs
    Sometimes, entitlement in relationships stems from a deep-rooted focus on getting one’s needs met, often at the expense of mutual understanding. Shifting toward a more caring, collaborative mindset starts with recognizing that relationships thrive when both partners feel seen and supported.
    Rather than approaching your partner with a sense of emotional, physical or material expectation, it helps to ask, “Am I expecting too much? Are my needs the only ones being prioritized here?”
    A 2023 study published in Personality and Social Psychology Bulletin found that individuals who endorsed self-transcendence values like benevolence (care for close others) and universalism (concern for the broader good) tended to report higher-quality romantic relationships.

    This is largely because these values promote pro-relational attitudes, which are mindsets that prioritize the well-being of both the relationship and the partners involved. People with pro-relational attitudes tend to demonstrate more empathy, actively nurture the relationship and approach conflicts with a focus on collaboration rather than personal gain.
    While it’s important to acknowledge when things may not be going well in a relationship, take a step back to assess if your expectations are coming from a place of entitlement. Rather than solely focusing on what’s lacking, try to understand if your needs are rooted in a desire for fairness and mutual respect rather than a sense of being owed something. Balancing this awareness with gratitude can help cultivate a healthier, more cooperative relationship dynamic.
    2. Let Go Of The ‘Quid Pro Quo’ Mentality
    In relationships, it’s common to fall into the trap of a “quid pro quo” mentality, where you expect every action to be reciprocated. This mindset operates on the assumption that for every emotional, physical or material effort you make, there should be an equal response from your partner.
    However, relationships based on this transactional approach often lead to disappointment and resentment when expectations aren’t met. Instead of looking at your relationship like a balance sheet where every favor must be accounted for, consider shifting toward a more unconditional mindset.
    Research on competitive behavior in young couples highlights the negative impact of this mindset. Researchers found that individuals with lower self-esteem were more likely to engage in competitive behaviors within their romantic relationships.
    This competition, driven by a need to prove one’s value, often manifests as one-upmanship, whether it’s in achieving goals, receiving affection or managing household tasks. However, this behavior tends to lead to conflict rather than strengthening the relationship, as it creates a divide instead of inviting collaboration.
    Breaking free from this mentality requires more than just shifting your mindset. You need to begin embracing the joy of giving without the pressure of receiving. Try to focus on moments of pure selflessness in your relationship.
    For example, surprise your partner with something meaningful to them without any expectation of getting something in return. This doesn’t mean never expecting anything from them — the goal is to cultivate a sense of fulfillment that doesn’t rely on reciprocity and to reinforce the unconditional nature of your bond.
    3. Don’t Expect Your Partner To Be Your Emotional Fix-All
    In a relationship, it’s easy to fall into the belief that your partner should always know how to comfort you or perhaps that they’re somehow responsible for making you feel better. This subtle sense of entitlement can show up as frustration or disappointment when they don’t respond the way you expect.
    You may even find yourself thinking, “If I’m upset, you should fix it.” But your partner isn’t a mind-reader or an emotional problem-solver. They’re human, and sometimes they won’t have the right answer or energy to help.
    Instead of depending on them to constantly carry your emotional weight, focus on building your emotional steadiness. This allows your partner’s support to feel like a natural expression of care, given without any underlying pressure.
    A 2024 study reveals that individuals who regulate their emotions better are more likely to have positive relationships. For example, those who can manage their emotions tend to communicate more effectively, leading to healthier expressions of feelings within the relationship.
    Additionally, emotional regulation encourages empathy and support, both of which contribute positively to relationship satisfaction. On the other hand, difficulties in emotional regulation can lead to conflicts, misunderstandings and dissatisfaction.
    So, it’s crucial to focus on building emotional resilience for both your well-being and your partner’s. Start by processing your emotions independently, finding ways to soothe and regulate yourself. This self-awareness will help you communicate better and stay calm when challenges arise.
    When you do need support, approach your partner with openness rather than expectation. Remember, it’s okay to lean on each other, but it’s essential to also cultivate the ability to comfort and steady yourself. This balanced approach enhances intimacy and long-term relationship satisfaction.
    To truly break the cycle of entitlement in relationships, it’s crucial to shift your focus from what you feel you’re owed to what you can offer. This means moving beyond the idea of love as a transaction and embracing it as a choice that both partners willingly make.
    Instead of assuming your partner should meet every need or fulfill specific roles, focus on contributing to the relationship without expecting a direct exchange. True connection thrives when both partners bring their whole selves to the relationship, free from a sense of obligation or entitlement.
    Is your relationship mindset rooted in connection or entitlement? Take this science-backed test to find out: Sense Of Relational Entitlement Scale
  • The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations

    Recently a friend mentioned how much of a shame it was that, generally speaking, there are few of those backdoor “classic” reimaginings today like the ones we had growing up. And after thinking for a moment, I agreed. Children and teens of the ‘90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent films. Nearly every week seemed to offer another modernization of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all retrofitted with a wink and a nudge to appeal to teenagers reading much the same texts in high school or university.
    But then when looking back at the sweep of 1990s cinema beyond just “teen movies,” it was more than only Julia Stiles and Heath Ledger vehicles that were getting the classical treatment. In fact, the ‘90s, and to a large extent the ‘80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature with the sanctity nowadays reserved for comic books and video games. It was a time when some of the most exciting or ambitious artists working in the industry sought to trade in the bullets and brutality of New Hollywood from a decade or two earlier in favor of the even more brutal constraints of corsets and top hats.

    Shakespeare was arguably bigger business in tinsel town than at any other point during this period, and we saw some of the most faithful and enduring adaptations of Austen or Louisa May Alcott make it to the screen. Why is that and can it happen again? Let’s look back at the golden age of period piece costumed dramas and splashy literary adaptations…

    Mozart and Merchant Ivory
    Since the beginning of the medium, moviemakers have looked back at well-worn and familiar stories for inspiration and audience familiarity. Not too many years after making his enduring trip to the moon, Georges Méliès adapted Hamlet into a roughly 10-minute silent short in 1907. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long as it was Larry in the tights.

    Even so, literary adaptations were often constrained, particularly in Hollywood where filmmakers had to contend with the limitations of censorship via the Hays Code and preconceived notions about what an American audience would enjoy. The most popular costumed dramas therefore tended to be vanity projects or something of a more sensational hue—think biblical or swords-and-sandals epics.
    So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the close-together Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year of Miloš Forman’s Amadeus, and the second brought the Merchant Ivory adaptation of E.M. Forster’s A Room with a View. Considered by Forster scholars one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece.
    In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest—a zeitgeist-shaping portrait of modern oppression and control from about a decade earlier—was taking the story of Mozart and making it a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern day rebel iconoclast with his wig resembling as much Sid Vicious as the Age of Enlightenment. Located atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience.
    It went on to do relatively big business and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past. Otherwise, most of the recent winners were dramas or dramedies about the modern world: Kramer vs. Kramer, The Deer Hunter, and Annie Hall. They reflected an audience that wanted to get away from the artificiality of their parents’ cinema, which in the U.S. associated historical costumes with the phoniness of Ben-Hur or Oliver!.
    Yet perhaps the movie that proved this was the beginning of a popular trend came a few years later via the British masterpiece A Room with a View. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including with several other costumed dramas and period pieces. However, those films were mixed with modern comedies and dramas like the rock ’n’ roll-infused The Guru and Jane Austen in Manhattan. More importantly, all of these films tended to be art house pictures: small chamber pieces intended for a limited audience.
    Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country!” The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success.

    It also defined what became the “Merchant Ivory” period piece forever after, including in future Oscar and box office darlings like the Anthony Hopkins, Emma Thompson, and Carter-starring Howards End, and Hopkins and Thompson’s reunion in The Remains of the Day. These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold up in ‘em hills. And soon enough, more than just Forman on the American side was going up there to mine it.

    Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama
    In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As the executive producer and sometime-director on NBC’s edgy police drama Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would clash with the neon-light-on-celluloid aesthetic that Mann developed for the series.
    As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in keeping with Vice or the gritty crime thrillers he’d made in the ’80s, like the serial killer thriller Manhunter. Instead he sought to adapt a childhood favorite for the screen, James Fenimore Cooper’s 19th century American frontier novel The Last of the Mohicans. Certainly a problematic text in its original form, with its imperial-fantasy riff on the French and Indian War where Indigenous tribes in what is today upstate New York were reduced to either noble or cruel savage stereotypes, the text proved a jumping-off point for Mann to craft a gripping, primal, and prestigious film.
    He also made a movie that far exceeded its source material: The Last of the Mohicans is an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is an elevated action movie and a beautiful drama that did bigger business in the U.S. than Disney’s Beauty and the Beast and the Tom Cruise vehicle A Few Good Men in the same year. It also would create a precedent we’d see followed time and again throughout the rest of the decade.
    Some of the biggest and most respected filmmakers of the moment, many of them praised under auteur theory, were looking to literary classics for an audience that craved them. After the one-two genre punch of Goodfellas (1990) and Cape Fear (1991), Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence.
    It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. Indeed, The Age of Innocence remains the best cinematic representation of the Gilded Age in the U.S., capturing the lush pageantry of the most elite New Yorkers’ lifestyles in their robber baron heyday, as well as how class snobbery metastasized into a ruthless tribalism that doomed the romantic yearnings of one conformist attorney (again Daniel Day-Lewis) and the would-be divorcée love of his life (Michelle Pfeiffer).

    It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquet (1993) and the martial arts-adjacent Pushing Hands (1991), but it was when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003) greenlit. Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched.
    It set a standard that most of the best Austen adaptations to this day are measured by, be it Joe Wright and Keira Knightley’s cinematic take on Pride and Prejudice a decade later, various attempts at Emma from the 1990s with Gwyneth Paltrow to this decade with Anya Taylor-Joy, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation.
    A Dark Universe of Gods and Monsters
    Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightful (and arguably definitive) interpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula.
    Considered a folly of hubris at the time by rivals who snickered to Variety that it should be renamed “Bonfire of the Vampires” (in reference to a notorious Brian De Palma bomb from 1990), Bram Stoker’s Dracula was Francis Ford Coppola’s lurid and magnificent reimagining of Stoker’s definitive Victorian novel. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” working in the professional classes, the book was well-worn and much adapted, but Coppola saw all of that potential still in it. He also correctly predicted there was a box office hit if he could bring all those elements out in an exciting and anachronistic fever dream for the MTV generation.
    Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter.
    It set a standard for what can in retrospect be considered a pseudo “dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations by Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a worse movie, and one that flopped, but it looked spectacular as the only major Frankenstein movie to remember that Shelley set the story during the Age of Enlightenment in the late 18th century.

    Yet while Frankenstein failed, Tom Cruise and Neil Jordan would have a lot of success in the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. It was also a grandiose costumed drama where the guy who played Top Gun’s Maverick would sink fangs into young Brad Pitt’s neck in a scene dripping in homoeroticism.
    This trend continued throughout the ’90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. For instance, did you remember that Julia Roberts, at the height of her stardom, appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde where she played the not-so-good doctor’s maid? It’s called Mary Reilly (1996), by the by.
    The Resurgence of Shakespeare
    Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really lit on the idea of making a movie based directly on the Bard and tailored almost exclusively to the students studying him: Baz Luhrmann in 1996, who reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even stylized the title as William Shakespeare’s Romeo + Juliet.
    That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. Their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news, with hyper music video editing and frenetic neon-hued melodrama. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do.
    But it was hardly the first box office breakout for Shakespeare in the ‘90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson right in the midst of his Lethal Weapon popularity as the indecisive Dane. To the modern eye, it is hard to remember Gibson was a heartthrob of sorts in the ‘80s and early ‘90s—or generally viewed as a dashing star worthy of heroic leading men roles.
    Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact that Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet, and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and by a much grander and more bombastic adaptation from the man who became the King of Shakespeare Movies in the ’90s: Kenneth Branagh.

    Aye, Branagh might deserve the most credit for the Shakespearean renaissance in this era, beginning with his adaptation of Henry V (1989), which featured the makings of Branagh’s troupe of former RSC favorites turned film actors: Derek Jacobi, Brian Blessed, and of course his future wife (and ex), Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written, with Branagh and Thompson as the sharp-tongued, dueling lovers Benedick and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle.
    It would define the style of Branagh’s following ’90s efforts, whether they went off the rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra-wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. At the play’s full four-hour length, Hamlet (1996) is indulgent. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othello (1995) opposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost (2000).
    It paved the way for more outside-the-box Shakespeare movies by the end of the decade, like Julie Taymor’s deconstructionist Titus (1999) and the 1999 A Midsummer Night’s Dream in which Kevin Kline turns into an ass and makes out with Michelle Pfeiffer.
    The Birth of the Teenage Shakespeare Remix (and Austen, and Chaucer, and…)
    As popular as the Shakespeare movie became in the ’90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking about moving past modernizing Romeo and Juliet like Luhrmann did, or repurposing it for high New York society as Leonard Bernstein and Stephen Sondheim did with West Side Story.
    These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would, in fact, do this kind of remix a number of times: in the more serious-faced modernization of Othello, O (2001), which also starred Mekhi Phifer as a tragically distrusting high school sports star instead of a warrior, and in Michael Almereyda and Ethan Hawke’s own Hamlet (2000), the third Hamlet movie in 10 years, albeit this one set in turn-of-the-century NYC.
    Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teens (the ’90s were weird, huh?) via the lusty Cruel Intentions (1999).

    However, easily the best of these remains Amy Heckerling’s Clueless (1995), which transplants Jane Austen’s Emma from the Regency period to a fairytale version of 1990s Beverly Hills. Foregoing modern fads and simply inventing her own—with the assumption anything she wrote in 1994 would be dated by ’95—Heckerling created a faux yet now authentically iconic language and fashion style via Cher (Alicia Silverstone), a charmed SoCal princess who is so well-meaning in her matchmaking mischief that she defies any attempts to detest her entitlement or vanity. You’re even kind of low-key chill that the happy ending is her hooking up with her stepbrother (Paul Rudd). It’s a classic!
    And the Rest
    There are many, many more examples we could examine from this era. These can include the sublime, like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched, like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in Love (1998). There were wholly original period epics too, like Braveheart (1995).
    More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, this type of film has by and large gone away. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations.
    Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, are mostly relegated to long-form streaming series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ’90s, mind. But whether we are talking original period films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton.
    This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ’90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories: a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggest you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that.
    #1990s #were #golden #age #period
    The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations
    Recently a friend mentioned how much of a shame it was that, generally speaking, there are few of those backdoor “classic” reimaginings today like the ones we had growing up. And after thinking for a moment, I agreed. Children and teens of the ‘90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent films. Nearly every week seemed to offer another modernization of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all retrofitted with a wink and a nudge to appeal to teenagers reading much the same texts in high school or university. But then when looking back at the sweep of 1990s cinema beyond just “teen movies,” it was more than only Julia Stiles and Heath Ledger vehicles that were getting the classical treatment. In fact the ‘90s, and to a large extent the ‘80s as well, was an era ripe with indie studios and Hollywood majors treating classic literaturewith the sanctity nowadays reserved for comic books and video games. It was a time when some of the most exciting or ambitious artists working in the industry sought to trade in the bullets and brutality of New Hollywood from a decade or two earlier in favor of the even more brutal constraints of corsets and top hats. Shakespeare was arguably bigger business in tinsel town than at any other point during this period, and we saw some of the most faithful and enduring adaptations of Austen or Louisa May Alcott make it to the screen. Why is that and can it happen again? Let’s look back at the golden age of period piece costumed dramas and splashy literary adaptations… Mozart and Merchant Ivory Since the beginning of the medium, moviemakers have looked back at well-worn and familiar stories for inspiration and audience familiarity. Not too many years after making his enduring trip to the moon, Georges Méliès adapted Hamlet into a roughly 10-minute silent short in 1907. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long it was Larry in the tights. Even so, literary adaptations were often constrained, particularly in Hollywood where filmmakers had to contend with the limitations of censorship via the Hays Code and preconceived notions about what an American audience would enjoy. The most popular costumed dramas tended to therefore be vanity projects or something of a more sensational hue—think biblical or swords and sandals epics. So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the close together Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year that Miloš Forman’s AmadeusA Room with a View. Considered by Forster scholars one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece. In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest—a zeitgeist-shaping portrait of modern oppression and control from about a decade earlier—was taking the story of Mozart and making it a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern day rebel iconoclast with his wig resembling as much Sid Vicious as the Age of Enlightenment. Located atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience. It went on to do relatively big business and win Best Picture. 
While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past. Otherwise, most of the recent winners were dramas or dramedies about the modern world: Kramer vs. Kramer, The Deer Hunter, and Annie Hall. They reflected an audience that wanted to get away from the artificiality of their parents’ cinema, which in the U.S. associated historical costumes with thephoniness of Ben-Huror Oliver!. Yet perhaps the movie that proved this was the beginning of a popular trend came a few years later via the British masterpiece A Room with a View. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including with several other costumed dramas and period pieces. However, those films were mixed with modern comedies and dramas like rock ’n roll-infused The Guruand Jane Austen in Manhattan. More importantly, all of these films tended to be art house pictures; small chamber pieces intended for a limited audience. Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country!”The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success. It also defined what became the “Merchant Ivory” period piece forever after, including in future Oscar and box office darlings like the Anthony Hopkins, Emma Thompson, and Carter-starring Howard’s End, and Hopkins and Thompson’s reunion in The Remains of the Day. These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold up in ‘em hills. And soon enough, more than just Forman on the American side was going up there to mine it. Join our mailing list Get the best of Den of Geek delivered right to your inbox! 20th Century Studios Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As the executive producer and sometime-director on NBC’s edgypolice drama, Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would clash with the neon-light-on-celluloid aesthetic that Mann developed for the series. As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in-keeping with Vice or gritty crime thrillers he’d made in the ’80s like serial killer thriller Manhunter. Instead he sought to adapt a childhood favorite for the screen, James Fenimore Cooper’s 19th century American frontier novel, The Last of the Mohicans. Certainly a problematic text in its original form with its imperial-fantasy riff on the French and Indian Warwhere Indigenous tribes in what is today upstate New York were either reduced to the noble or cruel savage stereotypes, the text proved a jumping off point for Mann to craft a gripping, primal, and prestigious film. 
He also made a movie that far exceeded its source material with The Last of the Mohicans being an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is an elevated action movie, and a beautiful drama that did bigger business in the U.S. than Disney’s Beauty and the Beast and Tom Cruise vehicle A Few Good Men in the same year. It also would create a precedent we’d see followed time and again throughout the rest of the decade. Some of the biggest and most respected filmmakers of the moment, many of them praised under auteur theory, were looking to literary classics for an audience that craved them. After the one-two genre punch of Goodfellasand Cape Fear, Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence. It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. Indeed, The Age of Innocence remains the best cinematic representation of the Gilded Age in the U.S., capturing the lush pageantry of the most elite New Yorkers’ lifestyles in their robber baron heyday, as well as how class snobbery metastasized into a ruthless tribalism that doomed the romantic yearnings of one conformist attorneyand this would-be divorcée love of his life. It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquetand martial arts-adjacent Pushing Hands, but it is when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragonand Hulkgreenlit. Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched. It set a standard that most of the best Austen adaptations to this day are measured by, be it Joe Wright and Keira Knightley’s cinematic take on Pride and Prejudice a decade later, various attempts at Emma from the 1990s with Gwyneth Paltrow to this decade with Anya Taylor-Joy, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation. Columbia / Sony A Dark Universe of Gods and Monsters Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightfulinterpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula. Considered a folly of hubris at the time by rivals who snickered to Variety it should be renamed “Bonfire of the Vampires”, Bram Stoker’s Dracula was Francis Ford Coppola’s lurid and magnificent reimagining of Stoker’s definitive Victorian novel. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” working in the professional classes, Coppola saw all of that potential in the well-worn and adapted vampire novel. 
He also correctly predicted there was a box office hit if he could bring all those elements out in an exciting and anachronistic fever dream for the MTV generation. Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter. It set a standard for what can in retrospect be considered a pseudo “dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations by Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein, a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a worse movie that flopped, but it looked spectacular as the only major Frankenstein movie to remember Shelley set the story during the Age of Enlightenment in the late 18th century. Yet while Frankenstein failed, Tom Cruise and Neil Jordan would have a lot of success in the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. It was also a grandiose costumed drama where the guy who played Top Gun’s Maverick would sink fangs into young Brad Pitt’s neck in a scene dripping in homoeroticism. This trend continued throughout the ‘90s with some successes, like Tim Burton’s wildly revisionistSleepy Hollow in 1999, and some misses. For instance, did you remember that Julia Roberts at the height of her stardom appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde where she played the not-so-good doctor’s maid? It’s called Mary Reilly, by the by. The Samuel Goldwyn Company The Resurgence of Shakespeare Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really lit on the idea of making a movie directly based on the Bard tailored almost exclusively for that demographic: Baz Luhrmann in 1996, who reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even stylized the title as William Shakespeare’s Romeo + Juliet. That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. Their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news, with hyper music video editing and frenetic neon-hued melodrama. Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do. But it was hardly the first box office breakout for Shakespeare in the ‘90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson right in the midst of his Lethal Weapon popularity as the indecisive Dane. 
To the modern eye, it is hard to remember Gibson was a heartthrob of sorts in the ‘80s and early ‘90s—or generally viewed as a dashing star worthy of heroic leading men roles. Nonetheless, there is quite a bit to like about Hamletif you can look past Gibson’s off-screen behavior in the following decades, or the fact Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet, and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamletwould eventually be overshadowed, though, both by Gibson’s awful behavior and because of a much grander and bombastic adaptation from the man who became the King of Shakespeare Movies in the ‘90s: Kenneth Branagh. Aye, Branagh might deserve the most credit for the Shakespearean renaissance in this era, beginning with his adaptation of Henry V, which featured the makings of Branagh’s troupe of former RSC favorites turned film actors: Derek Jacobi, Brian Blessed, and of course his future wife, Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing, a perfect encapsulation of perhaps the first romantic comedy ever written that features Branagh and Thompson as the sharp-tongued, dueling lovers Benedict and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle. It would define the style of Branagh’s following ‘90s efforts, whether they went off-the-rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. At the play’s full four-hour length, Hamletis indulgent. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othelloopposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost. It paved the way for more outside-the-box Shakespeare movies by the end of the decade like Julie Taymor’s deconstructionist Titusand the A Midsummer Night’s Dream from 1999 where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer. CBS via Getty Images The Birth of the Teenage Shakespeare RemixAs popular as the Shakespeare movie became in the ‘90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking moving past modernizing Romeo and Juliet like Luhrmann did, or repurposing it for high New York society like Leonard Bernstein and Stephen Sondheim aimed with West Side Story. These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. 
Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You, a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would, in fact, do this kind of remix a number times in the more serious-faced modernization of Othello, O, which also starred Mekhi Phifer as a tragically distrusting high school sports star instead of warrior, and Michael Almereyda and Ethan Hawke’s own Hamlet, the third Hamlet movie in 10 years, albeit this one set in turn-of-the-century NYC. Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale, an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teensvia the lusty Cruel Intentions However, easily the best of these remains Amy Heckerling’s CluelessEmma from the Regency period to a fairytale version of 1990s Beverly Hills. Foregoing modern fads and simply inventing her own—with the assumption anything she wrote in 1994 would be dated by ’95—Heckerling create a faux yet now authentically iconic language and fashion style via Cher, a charmed SoCal princess who is so well-meaning in her matchmaking mischief that she defies any attempts to detest her entitlement or vanity. You kind of are even low-key chill that the happy ending is she hooks up with her step brother. It’s a classic! And the Rest There are many, many more examples we could examine from this era. These can include the sublime like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched like the Demi Moore and Gary Oldman-led The Scarlet Letter. There were more plays adapted, a la Arthur Miller’s The Crucible, and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in LoveBraveheart. More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, this type of film has by and large gone away. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations. Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, is mostly relegated to long-form stream series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ‘90s, mind. But whether it is original period piece films or adaptations, unless you’re Robert Eggers, period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton. This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ‘90s kids are getting older and having children of their own. 
There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories; a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggests you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that. #1990s #were #golden #age #period
    The 1990s Were a Golden Age for Period Piece Movies and Literary Adaptations
    www.denofgeek.com
    Recently a friend mentioned how much of a shame it was that, generally speaking, there are few of those backdoor “classic” reimaginings today like the ones we had growing up. And after thinking for a moment, I agreed. Children and teens of the ‘90s were treated to an embarrassment of riches when it came to the Bard and Bard-adjacent films. Nearly every week seemed to offer another modernization of William Shakespeare, Jane Austen, or Geoffrey Chaucer, all retrofitted with a wink and a nudge to appeal to teenagers reading much the same texts in high school or university. But then when looking back at the sweep of 1990s cinema beyond just “teen movies,” it was more than only Julia Stiles and Heath Ledger vehicles that were getting the classical treatment. In fact the ‘90s, and to a large extent the ‘80s as well, was an era ripe with indie studios and Hollywood majors treating classic literature (if largely of the English variety) with the sanctity nowadays reserved for comic books and video games. It was a time when some of the most exciting or ambitious artists working in the industry sought to trade in the bullets and brutality of New Hollywood from a decade or two earlier in favor of the even more brutal constraints of corsets and top hats. Shakespeare was arguably bigger business in tinsel town than at any other point during this period, and we saw some of the most faithful and enduring adaptations of Austen or Louisa May Alcott make it to the screen. Why is that and can it happen again? Let’s look back at the golden age of period piece costumed dramas and splashy literary adaptations… Mozart and Merchant Ivory Since the beginning of the medium, moviemakers have looked back at well-worn and familiar stories for inspiration and audience familiarity. Not too many years after making his enduring trip to the moon, Georges Méliès adapted Hamlet into a roughly 10-minute silent short in 1907. And of course before Kenneth Branagh, Laurence Olivier had Hollywood falling in love with the Bard… at least as long it was Larry in the tights. Even so, literary adaptations were often constrained, particularly in Hollywood where filmmakers had to contend with the limitations of censorship via the Hays Code and preconceived notions about what an American audience would enjoy. The most popular costumed dramas tended to therefore be vanity projects or something of a more sensational hue—think biblical or swords and sandals epics. So it’s difficult to point to an exact moment where that changed in the 1980s, yet we’d hazard to suggest the close together Oscar seasons of 1984 and 1986 had a lot to do with it. After all, the first was the year that Miloš Forman’s AmadeusA Room with a View. Considered by Forster scholars one of the author’s slighter works, the film had critics like Roger Ebert swooning that it was a masterpiece. In the case of Amadeus, the director of One Flew Over the Cuckoo’s Nest (1975)—a zeitgeist-shaping portrait of modern oppression and control from about a decade earlier—was taking the story of Mozart and making it a punk rock tragicomedy. Based on a Peter Shaffer play of the same name, Forman and Shaffer radically reimagined the story, making it both funnier and darker as Forman strove to pose Mozart as a modern day rebel iconoclast with his wig resembling as much Sid Vicious as the Age of Enlightenment. Located atop Tom Hulce’s giggling head, it signaled a movie that had all the trappings of melodrama but felt accessible and exciting to a wide modern audience. 
It went on to do relatively big business and win Best Picture. While not the first period film to do so, it was the first in a long while set in what could be construed as the distant past (Richard Attenborough’s Gandhi won the year before but that was based on a subject matter in the living memory of most Academy voters). Otherwise, most of the recent winners were dramas or dramedies about the modern world: Kramer vs. Kramer (1979), The Deer Hunter (1978), and Annie Hall (1977). They reflected an audience that wanted to get away from the artificiality of their parents’ cinema, which in the U.S. associated historical costumes with the (grand) phoniness of Ben-Hur (1959) or Oliver! (1968). Yet perhaps the movie that proved this was the beginning of a popular trend came a few years later via the British masterpiece A Room with a View. To be sure, the partnership of Merchant and Ivory had been going for more than 20 years by the time they got to adapting Forster, including with several other costumed dramas and period pieces. However, those films were mixed with modern comedies and dramas like rock ’n roll-infused The Guru (1969) and Jane Austen in Manhattan (1980). More importantly, all of these films tended to be art house pictures; small chamber pieces intended for a limited audience. Yet as the marketing campaign would later trumpet about A Room with a View—the ethereal romantic dramedy which introduced Daniel Day-Lewis and a fresh-faced Helena Bonham Carter to the U.S.—this movie had the “highest single theatre gross in the country!” (It’s fun to remember a time when a movie just selling out in New York every day could make it a hit.) The film’s combination of Forster’s wry satire and cynicism about English aristocracy in the late Victorian and early Edwardian era, coupled with the sweeping romance of Puccini arias and Tuscan countrysides, made it a massive success. It also defined what became the “Merchant Ivory” period piece forever after, including in future Oscar and box office darlings like the Anthony Hopkins, Emma Thompson, and Carter-starring Howard’s End (1992), and Hopkins and Thompson’s reunion in The Remains of the Day (1993). These were all distinctly British and understated pictures, with Remains being an outright tragedy delivered in a hushed whisper, but their relative success with a certain type of moviegoer and Academy voter signaled to Hollywood that there was gold up in ‘em hills. And soon enough, more than just Forman on the American side was going up there to mine it. Join our mailing list Get the best of Den of Geek delivered right to your inbox! 20th Century Studios Martin Scorsese, Michael Mann, and the Auteur’s Costumed Drama In 1990, Michael Mann was one of the hottest creatives working in Hollywood. As the executive producer and sometime-director on NBC’s edgy (by ‘80s standards) police drama, Miami Vice, he played a direct hand in proving American television could be “gritty” and artistic. Even the episodes he didn’t helm were defined by the standards he insisted upon—such as never putting cool guys Crockett and Tubbs in a red or brown car. It would clash with the neon-light-on-celluloid aesthetic that Mann developed for the series. As that series was winding down by 1990, Mann was more in demand than ever to make any film project he might have wanted—something perhaps in-keeping with Vice or gritty crime thrillers he’d made in the ’80s like serial killer thriller Manhunter (1986). 
Instead he sought to adapt a childhood favorite for the screen, James Fenimore Cooper’s 19th century American frontier novel, The Last of the Mohicans. Certainly a problematic text in its original form with its imperial-fantasy riff on the French and Indian War (or Seven Years War) where Indigenous tribes in what is today upstate New York were either reduced to the noble or cruel savage stereotypes, the text proved a jumping off point for Mann to craft a gripping, primal, and prestigious film. He also made a movie that far exceeded its source material with The Last of the Mohicans being an often wordless opera of big emotions played in silence by Day-Lewis, Madeleine Stowe, and Wes Studi, all while Trevor Jones and Randy Edelman’s musical score looms like thunderclouds across the mountainous landscape. It is an elevated action movie, and a beautiful drama that did bigger business in the U.S. than Disney’s Beauty and the Beast and Tom Cruise vehicle A Few Good Men in the same year. It also would create a precedent we’d see followed time and again throughout the rest of the decade. Some of the biggest and most respected filmmakers of the moment, many of them praised under auteur theory, were looking to literary classics for an audience that craved them. After the one-two genre punch of Goodfellas (1990) and Cape Fear (1991), Martin Scorsese made one of his most ambitious and underrated films: a stone-cold 1993 masterpiece inspired by an Edith Wharton novel, The Age of Innocence. It’s a story that Scorsese argues is just as brutal, if not more so, than his gangster pictures. Indeed, The Age of Innocence remains the best cinematic representation of the Gilded Age in the U.S., capturing the lush pageantry of the most elite New Yorkers’ lifestyles in their robber baron heyday, as well as how class snobbery metastasized into a ruthless tribalism that doomed the romantic yearnings of one conformist attorney (again Daniel Day-Lewis) and this would-be divorcée love of his life (Michelle Pfeiffer). It might not have been a hit in its time, but Ang Lee’s breakout in the U.S. a year later definitely was. The Taiwanese filmmaker was already the toast of international and independent cinema via movies like The Wedding Banquet (1993) and martial arts-adjacent Pushing Hands (1991), but it is when he directed a flawless adaptation of Jane Austen’s Sense and Sensibility in 1995 that he became a Hollywood favorite who would soon get movies like Crouching Tiger, Hidden Dragon (2000) and Hulk (2003) greenlit. Sense and Sensibility benefits greatly, too, from a marvelous cast with Emma Thompson, Hugh Grant, Kate Winslet, and Alan Rickman among its ensemble. It also captured the sophisticated satirical and melancholic underpinnings of Austen’s pen that most previous Hollywood adaptations never scratched. It set a standard that most of the best Austen adaptations to this day are measured by, be it Joe Wright and Keira Knightley’s cinematic take on Pride and Prejudice a decade later, various attempts at Emma from the 1990s with Gwyneth Paltrow to this decade with Anya Taylor-Joy, or even Netflix’s recent Dakota Johnson-led Persuasion adaptation. Columbia / Sony A Dark Universe of Gods and Monsters Meanwhile, right before Columbia Pictures greenlit Scorsese’s The Age of Innocence and later Gillian Armstrong’s still delightful (and arguably definitive) interpretation of Little Women in 1994, the same studio signed off on its first period piece with Winona Ryder attached to star. And it was Dracula. 
Considered a folly of hubris at the time by rivals who snickered to Variety it should be renamed “Bonfire of the Vampires” (in reference to a notorious Brian De Palma bomb from 1990), Bram Stoker’s Dracula was Francis Ford Coppola’s lurid and magnificent reimagining of Stoker’s definitive Victorian novel. Published in 1897 with on-the-nose metaphors for London society’s anxieties over foreigners, sexual promiscuity and disease, and the so-called “New Woman” working in the professional classes, Coppola saw all of that potential in the well-worn and adapted vampire novel. He also correctly predicted there was a box office hit if he could bring all those elements out in an exciting and anachronistic fever dream for the MTV generation. Love or hate Coppola’s looseness with Stoker’s novel—which is pretty audacious since he put the author’s name in the title—Coppola crafted one of the most sumptuous and expensive depictions of Victorian society ever put onscreen, winning costume designer Eiko Ishioka an Oscar for the effort. He also made an unexpected holiday hit that played like bloody gangbusters alongside Home Alone 2 and Aladdin that winter. It set a standard for what can in retrospect be considered a pseudo “dark universe” of classic literary monsters getting ostensibly faithful and expensive adaptations by Hollywood. Coppola himself produced Kenneth Branagh’s Mary Shelley’s Frankenstein (1994), a film that is actually in many ways closer to the thematic letter of its author than Bram Stoker’s Dracula ever was. It was also a worse movie that flopped, but it looked spectacular as the only major Frankenstein movie to remember Shelley set the story during the Age of Enlightenment in the late 18th century. Yet while Frankenstein failed, Tom Cruise and Neil Jordan would have a lot of success in the same year adapting Anne Rice’s Interview with the Vampire. The book admittedly was recent, having been published in 1976, but the story’s roots and setting in 18th and 19th century bayou occultism were not. It was also a grandiose costumed drama where the guy who played Top Gun’s Maverick would sink fangs into young Brad Pitt’s neck in a scene dripping in homoeroticism. This trend continued throughout the ‘90s with some successes, like Tim Burton’s wildly revisionist (and Coppola-produced) Sleepy Hollow in 1999, and some misses. For instance, did you remember that Julia Roberts at the height of her stardom appeared in a revisionist take on Robert Louis Stevenson’s The Strange Case of Dr. Jekyll and Mr. Hyde where she played the not-so-good doctor’s maid? It’s called Mary Reilly (1996), by the by. The Samuel Goldwyn Company The Resurgence of Shakespeare Of course when talking about classic literature and storytelling, one name rises above most others in the schools and curriculums of the English-speaking world. Yet curiously it was only in the 1990s that someone really lit on the idea of making a movie directly based on the Bard tailored almost exclusively for that demographic: Baz Luhrmann in 1996, who reconfigured the tragedy of Romeo and Juliet into the visual language of MTV. He even stylized the title as William Shakespeare’s Romeo + Juliet. That proved the tip of an anachronistic iceberg whose cast included Leonardo DiCaprio at the height of his heartthrob powers as Romeo and real-life teenager Claire Danes as his Capulet amore. Their Verona was a Neverland composite of Miami, Rio de Janeiro, and the nightly news, with hyper music video editing and frenetic neon-hued melodrama. 
Some older scholars viewed Luhrmann’s anachronisms as an abomination, but as a Millennial, I can attest we loved this thing back in the day. Many still do. But it was hardly the first box office breakout for Shakespeare in the ‘90s. When the decade began, the helmer of another cinematic Romeo and Juliet classic from a different era, Franco Zeffirelli, attempted to make Hamlet exciting for “kids these days” by casting Mel Gibson right in the midst of his Lethal Weapon popularity as the indecisive Dane. To the modern eye, it is hard to remember Gibson was a heartthrob of sorts in the ‘80s and early ‘90s—or generally viewed as a dashing star worthy of heroic leading men roles. Nonetheless, there is quite a bit to like about Hamlet (1990) if you can look past Gibson’s off-screen behavior in the following decades, or the fact Zeffirelli cuts what is a four-hour play down to less than 2.5 hours. Gibson actually makes for a credible and genuinely mad Hamlet (perhaps not a surprise now), and Zeffirelli mines the medieval melancholy of the story well with production design, costumes, and location shooting at real Norman castles. Plus, Helena Bonham Carter remains the best Ophelia ever put to screen. Hamlet (1990) would eventually be overshadowed, though, both by Gibson’s awful behavior and because of a much grander and bombastic adaptation from the man who became the King of Shakespeare Movies in the ‘90s: Kenneth Branagh. Aye, Branagh might deserve the most credit for the Shakespearean renaissance in this era, beginning with his adaptation of Henry V (1989), which featured the makings of Branagh’s troupe of former RSC favorites turned film actors: Derek Jacobi, Brian Blessed, and of course his future wife (and ex), Emma Thompson. Together the pair would mount what is in this writer’s opinion the best film ever based on a Shakespeare play, the divine and breezy Much Ado About Nothing (1993), a perfect encapsulation of perhaps the first romantic comedy ever written that features Branagh and Thompson as the sharp-tongued, dueling lovers Benedict and Beatrice. It also features Denzel Washington as a dashing Renaissance prince, Kate Beckinsale in her breakout role, and a gloriously over-the-top score by Patrick Doyle. It would define the style of Branagh’s following ‘90s efforts, whether they went off-the-rails like in the aforementioned Frankenstein, or right back on them in the 70mm-filmed, ultra wide and sunny adaptation of Hamlet he helmed in 1996. Avoiding the psychological and Freudian interpretations of the Danish prince chased by Olivier and Zeffirelli, Branagh turns Hamlet into a romantic hero spearheading an all-star ensemble cast. At the play’s full four-hour length, Hamlet (1996) is indulgent. Yet somehow that befits the material. Branagh would also star as Iago in Oliver Parker’s Othello (1995) opposite Laurence Fishburne and reconfigure the Bard as a musical in his own directorial effort, Love’s Labour’s Lost (2000). It paved the way for more outside-the-box Shakespeare movies by the end of the decade like Julie Taymor’s deconstructionist Titus (1999) and the A Midsummer Night’s Dream from 1999 where Kevin Kline turns into an ass and makes out with Michelle Pfeiffer. 
CBS via Getty Images The Birth of the Teenage Shakespeare Remix (and Austen, and Chaucer, and…) As popular as the Shakespeare movie became in the ‘90s, what’s curiously unique about this era is the simultaneous rise of movies that adapted either the Bard or other highly respected literary writers and turned them into a pure teenage dream. We’re talking moving past modernizing Romeo and Juliet like Luhrmann did, or repurposing it for high New York society like Leonard Bernstein and Stephen Sondheim aimed with West Side Story. These were straight, unapologetic youth films that also proved clever reworkings of classic storytelling structure. Among the best directly derived from Shakespeare is the movie that made Julia Stiles and Heath Ledger Gen-X icons, 10 Things I Hate About You (1999), a happily campy update of The Taming of the Shrew set in a fairytale high school also populated by future Christopher Nolan favorites like Joseph Gordon-Levitt and David Krumholtz. Stiles would, in fact, do this kind of remix a number times in the more serious-faced modernization of Othello, O (2000), which also starred Mekhi Phifer as a tragically distrusting high school sports star instead of warrior, and Michael Almereyda and Ethan Hawke’s own Hamlet (2000), the third Hamlet movie in 10 years, albeit this one set in turn-of-the-century NYC. Ledger also returned to the concept by adapting another, even older literary giant, in this case the medieval poet Geoffrey Chaucer, for A Knight’s Tale (2001), an anachronistic blending of the medieval and modern where peasants grooved in the jousting tournament stands to Queen. There was also the strange attempt to turn Pierre Choderlos de Laclos’ Dangerous Liaisons from 1782 into an erotic thriller for teens (the ‘90s were weird, huh?) via the lusty Cruel Intentions However, easily the best of these remains Amy Heckerling’s CluelessEmma from the Regency period to a fairytale version of 1990s Beverly Hills. Foregoing modern fads and simply inventing her own—with the assumption anything she wrote in 1994 would be dated by ’95—Heckerling create a faux yet now authentically iconic language and fashion style via Cher (Alicia Silverstone), a charmed SoCal princess who is so well-meaning in her matchmaking mischief that she defies any attempts to detest her entitlement or vanity. You kind of are even low-key chill that the happy ending is she hooks up with her step brother (Paul Rudd). It’s a classic! And the Rest There are many, many more examples we could examine from this era. These can include the sublime like the Gillian Armstrong-directed Little Women of 1994 starring Winona Ryder, Claire Danes, and Kirsten Dunst; and they can include the wretched like the Demi Moore and Gary Oldman-led The Scarlet Letter (1995). There were more plays adapted, a la Arthur Miller’s The Crucible (again with Ryder and Day-Lewis!), and then those that just had some fun with playwrights, as seen in the over-celebrated Shakespeare in LoveBraveheart (1995). More than a few of these won Best Picture Oscars as well, including Braveheart, Shakespeare in Love, and James Cameron’s little 1997 movie you might have heard about elsewhere: Titanic. And yet, this type of film has by and large gone away. Once in a while one comes along that still works, such as Greta Gerwig’s own revisionist interpretation of Little Women. That beautiful film was a good-sized hit in 2019, but it did not exactly usher in a new era of literary adaptations. 
Now such projects, like everything else not considered four-quadrant intellectual property by studio bean counters, is mostly relegated to long-form stream series. Which in some cases is fine. Many would argue the best version of Pride & Prejudice was the BBC production… also from the ‘90s, mind. But whether it is original period piece films or adaptations, unless you’re Robert Eggers (who arguably isn’t making films for the same mainstream sensibility the likes of Gerwig or, for that matter, Coppola were), period piece storytelling and “great adaptations” have been abandoned to the small screen and full-on wish fulfillment anachronisms like Bridgerton. This seems due to studios increasingly eschewing anything that isn’t reliably based on a brand that middle-aged adults loved. But in that case… it might be worth reminding them that ‘90s kids are getting older and having children of their own. There may again be a market beyond the occasional Gerwig swing, or Eggers take on Dracula, for classic stories; a new audience being raised to want modern riffs inspired by tales that have endured for years and centuries. These stories are mostly in the public domain too. And recent original hits like Sinners suggests you don’t even need a classic story to connect with audiences. So perhaps once again, a play’s the thing in which they can catch the conscience of the… consumer? Or something like that.
  • Is IPSIE the game changer that SaaS security demands?

    Over the past few years, Okta has stated its commitment to ending the threat of identity-enabled cyber crime and attacks.
    As part of its Secure Identity Commitment, Okta has been keen on “elevating our industry” by accelerating its capabilities and embracing new technology, such as AI, as well as funding the digital transformation of nonprofits and advancing inclusive pathways into tech.
    Therefore, when an announcement was made of a standard around identity security in software-as-a-service (SaaS) applications, it was worth taking note.

    Named the Interoperability Profile for Secure Identity in the Enterprise (IPSIE), the concept is of an open standard that provides a framework for SaaS companies to enhance the end-to-end security of their products across every touchpoint of their technology stack.
    Announcing it in October 2024, Okta CEO and co-founder Todd McKinnon said there is a “need [for] massive standardisation” and a need to “move to a world where every app, every device, every workload all speak a common language”.
    McKinnon said that by adopting IPSIE, users will get complete visibility into their identity environment and the threat surface, and they can provide access to the right applications at the right time and take real-time actions in response to threats.
    Okta’s announcement stated that the point of IPSIE is to “foster a more open, consistent, flexible SaaS ecosystem by empowering organisations to adhere to a higher level of security, more seamlessly and efficiently integrating among tech stacks”.
    This open standard will provide the framework for any enterprise application to be discoverable and governable. By adopting IPSIE, users will be able to gain complete visibility across the identity threat surface, enable consistent security outcomes across SaaS applications, and build secure-by-default SaaS applications more seamlessly and efficiently.
    On that final point, Okta states that any app built to the IPSIE standard adheres to a higher level of security by ensuring that it can be governed, have its entitlements managed, support multi-factor authentication and posture management, and feature real-time Universal Logout.
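    To make that capability list more concrete, here is a minimal sketch (in Python, with entirely hypothetical names and scoring) of how an organisation might audit its own app catalogue against the kinds of requirements listed above – governance, entitlement management, MFA and posture support, and real-time Universal Logout. The real IPSIE profile defines its own conformance levels; this is only an illustration.

```python
from dataclasses import dataclass


@dataclass
class AppIdentityProfile:
    """Hypothetical capability flags mirroring the requirements named above."""
    name: str
    governable: bool = False            # can be discovered and governed centrally
    entitlements_managed: bool = False  # entitlements can be provisioned and revoked
    mfa_supported: bool = False         # supports multi-factor authentication
    posture_signals: bool = False       # reports device/security posture
    universal_logout: bool = False      # supports real-time session termination

    def missing_capabilities(self) -> list[str]:
        required = {
            "governable": self.governable,
            "entitlements_managed": self.entitlements_managed,
            "mfa_supported": self.mfa_supported,
            "posture_signals": self.posture_signals,
            "universal_logout": self.universal_logout,
        }
        return [cap for cap, ok in required.items() if not ok]


if __name__ == "__main__":
    catalogue = [
        AppIdentityProfile("crm", governable=True, entitlements_managed=True,
                           mfa_supported=True, universal_logout=True),
        AppIdentityProfile("legacy-wiki", mfa_supported=True),
    ]
    for app in catalogue:
        gaps = app.missing_capabilities()
        print(f"{app.name}: {'conforms' if not gaps else 'gaps: ' + ', '.join(gaps)}")
```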

    So far, 50 enterprise SaaS applications have joined the cause and integrated with IPSIE – including Google, Microsoft Office 365, Slack and Salesforce – to support modern identity best practices aimed at enhancing security and reducing operational burden.
    Harish Peri, senior vice-president of product marketing at Okta, tells Computer Weekly that IPSIE is a way to ensure that every app and API conforms to a standard whereby its identity can be secure: “We are leading the way with the OpenID Foundation, and we’re part of the working group for the creation of IPSIE, the Interoperability Profile for Secure Identity in the Enterprise.”
    Far from working alone, Okta has enlisted members of the OpenID Foundation to create the IPSIE Working Group, which will develop profiles of existing specifications with a primary goal of achieving interoperability between independent implementations.
    Gail Hodges, executive director of the OpenID Foundation, says that while IPSIE was still getting off the ground in its first year, she felt the concept was “great”, adding: “I’m really encouraged as the foundation is moving more and more towards lining up specifications; like a lot of our work internally, they’re intended to kind of sync up with each other so that you could layer specifications on top of each other.
    “I see the work of IPSIE and a group of subject matter experts looking to do exactly that – line up the specifications together. So there’s even more consistency in how those specifications are configured, so there will be even greater benefits of interoperability and security associated with deploying a more complex stack. I think it’s fantastic.”
    Shiv Ramji, president of customer identity cloud at Okta, says the ultimate ambition with IPSIE is to “make it easy for customers to choose the right default path, which is to be secure, and I think they’ll do that if the value is clear to them, and, over time, it will be”.
    Okta’s ambition for IPSIE is industry-wide adoption, but Ramji was keen to make the point that Okta is “one participant”, and if every participant adopts the standards, “we will deliver better security outcomes for the entire software as a service ecosystem”.

    One factor Ramji stressed is the support for Universal Logout. Okta describes this as a concept where you can terminate users’ sessions, and their tokens, for supported apps when your identity threat protection identifies a risk change.
    Specifically, a user session is the time during which a user is authenticated and authorised to access apps secured by Okta, while an app session refers to sessions that an app generates to allow users to access the app’s resources. Universal Logout can be configured to terminate a user’s sessions in generic Security Assertion Markup Language (SAML) and OpenID Connect (OIDC) apps.
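    The mechanics are easier to picture with a small example. The sketch below (Python, illustrative only) mimics the general back-channel-logout pattern used by standards such as OpenID Connect Back-Channel Logout: an identity provider pushes an event naming a subject and/or session, and the application terminates any matching sessions. The in-memory session store and event shape are assumptions for the sketch, not Okta’s actual Universal Logout API or the IPSIE wire format.

```python
import time
import uuid

# In-memory stand-in for an application's session store (illustrative only).
APP_SESSIONS: dict[str, dict] = {}


def create_session(sub: str) -> str:
    """Record a new application session for the given subject."""
    sid = str(uuid.uuid4())
    APP_SESSIONS[sid] = {"sub": sub, "created": time.time()}
    return sid


def handle_logout_event(event: dict) -> int:
    """Terminate sessions matching the event's subject (sub) or session id (sid)."""
    sub, sid = event.get("sub"), event.get("sid")
    to_revoke = [
        s for s, data in APP_SESSIONS.items()
        if (sid is not None and s == sid) or (sub is not None and data["sub"] == sub)
    ]
    for s in to_revoke:
        del APP_SESSIONS[s]
    return len(to_revoke)


if __name__ == "__main__":
    create_session("alice")
    create_session("alice")
    create_session("bob")
    # The identity provider detects a risk change for alice and pushes a logout event.
    revoked = handle_logout_event({"sub": "alice"})
    print(f"revoked {revoked} session(s); remaining: {len(APP_SESSIONS)}")
```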
    Stephen McDermid, EMEA chief security officer at Okta, says the concept of Universal Logout will help to mitigate and minimise risks, “so that you’re not waiting for your SOC or your SIEM solution to respond in real time”.
    He adds: “I think the fact that there’s talk about the risks that IPSIE is trying to address reassures me that we’re going in the right direction for us – and for other vendors as well. The more vendors we can get to agree to it, the better the solution becomes.”
    This is why SaaS companies are integrating Okta’s software development kit, Ramji says. With companies now adopting it, “we’re changing the type of integrations that we do with these SaaS applications because we can do signal sharing”.

    In terms of integrations, Ramji says there were more than 150 in April 2025, and users “are asking us what are the ways they can support the adoption of these standards”. Out of those 150 integrations, is this something that customers can implement on their own, rather than waiting for Salesforce, for example, to do it for them?
    Ramji says if a user is using Auth0 today, they can switch IPSIE and Universal Login on and go into their Okta dashboard to enable the Universal Logout capability. “They have to enable it to opt in, as it’s an opt-in mechanism,” he says.
    “It’s easy to turn it on. As we roll this out initially, a lot of this will be opt-in, and then over time we can look at ways to make that easier, or maybe look at other options, but for now, it’ll be opt-in. We don’t want behaviours in companies where users are being logged out of their applications without working out why, so this is a deliberate thing that they need to roll out.”
    Peri says Okta’s largest existing customers asked, “How soon can you get all of our apps IPSIE-fied?”, and levels of IPSIE are being defined, but he adds that this is not an Okta-driven initiative or about asserting dominance, but “about doing the right thing for the industry, as the more people that are in it, the better it is for everybody”.

    So, how well will IPSIE be adopted? Computer Weekly contacted a number of other authentication suppliers to find out.
    Chris Anderson, Duo product CTO at Cisco, confirms that the firm has joined the IPSIE Working Group, which aims to develop profiles of existing specifications and achieve interoperability between independent implementations, stating: “While it’s still early days, we believe that interoperability across standards is key to greater success in identity security.”
    Andras Cser, vice-president and principal analyst at Forrester, says that standards that anyone can implement, proposed by a single supplier, generally “do not fare very well”, but that with the backing of the working group and the OpenID Foundation, IPSIE could work out well.
    He points at the example set by the FIDO Alliance, which “started out as a bunch of vendors coming together”. However, Cser believes that if IPSIE could follow FIDO’s lead, then it has a chance to work.
    “The use case behind FIDO was a lot smaller than IPSIE; it was just authentication and second factor and biometrics. That was the design: try not to boil the ocean,” he says. “Single sign-on, logout and token verification are largely resolved by SAML and OpenID, and there’s a standard for those things.
    “There’s also a very concrete and distinct use case behind sharing risk signals – there’s a new login from a new IP address, from a new device and that makes a lot of sense.”
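    To illustrate the kind of risk signal Cser describes, the sketch below shows a toy receiver that reacts to a login from an unfamiliar IP address or device by allowing it, requiring step-up authentication, or revoking sessions. The event shape is loosely modelled on the Security Event Tokens used by the OpenID Shared Signals Framework, but the event type URI, field names and policy here are assumptions for illustration; a real deployment would verify a signed token rather than a raw dictionary.

```python
# Toy policy data: devices and IP addresses previously seen for each subject.
KNOWN_DEVICES: dict[str, set[str]] = {"alice": {"laptop-123"}}
KNOWN_IPS: dict[str, set[str]] = {"alice": {"203.0.113.10"}}


def assess_login_signal(event: dict) -> str:
    """Return an action ('allow', 'step-up' or 'revoke') for a shared login-risk signal."""
    sub = event["sub"]
    new_device = event["device_id"] not in KNOWN_DEVICES.get(sub, set())
    new_ip = event["ip"] not in KNOWN_IPS.get(sub, set())
    if new_device and new_ip:
        return "revoke"      # unfamiliar device AND network: terminate sessions
    if new_device or new_ip:
        return "step-up"     # one unfamiliar factor: require fresh MFA
    return "allow"


if __name__ == "__main__":
    signal = {
        "event_type": "https://example.com/secevent/login-risk",  # illustrative URI, not a real spec value
        "sub": "alice",
        "device_id": "phone-999",
        "ip": "198.51.100.7",
    }
    print(assess_login_signal(signal))  # -> "revoke"
```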
    He claims that single sign-on, token revocation and logout have already been resolved, while user lifecycle management and other areas are being addressed further down the line. He adds that IPSIE is trying to resolve things, “30% of which are not solvable in the security domain only, 60% are addressed by other standards, and 10% is the key part of what IPSIE is trying to do”.
    Less than a year since its announcement, the conversations around IPSIE suggest it will take a long time to gain full traction and industry adoption, but there is persistent positivity on the side of Okta, its main supplier and driving force; the criticism is that it is too broad, “putting everything in the kitchen sink”. Time will tell, but all revolutions need to start somewhere.

    Read more about identity management and SaaS applications

    Identity and access management tools and features for 2025.
    How to build an effective IAM architecture.
    Seven cloud IAM challenges and how to address them.