• Harvard just fired a tenured professor for the first time in 80 years. Good.

    In the summer of 2023, I wrote about a shocking scandal at Harvard Business School: Star professor Francesca Gino had been accused of falsifying data in four of her published papers, with whispers of falsification in others, too. A series of posts on Data Colada, a blog that focuses on research integrity, documented Gino’s apparently brazen data manipulation, which involved clearly changing study data to better support her hypotheses. This was a major accusation against a researcher at the top of her field, but Gino’s denials were unconvincing. She didn’t have a good explanation for what had gone wrong, suggesting that maybe a research assistant had done it, even though she was the only author listed across all four of the falsified studies. Harvard put her on unpaid administrative leave and barred her from campus.
    The cherry on top? Gino’s main academic area of study was honesty in business.
    As I wrote at the time, my read of the evidence was that Gino had most likely committed fraud. That impression was only reinforced by her subsequent lawsuit against Harvard and the Data Colada authors. Gino complained that she’d been defamed and that Harvard hadn’t followed the right investigation process, but she didn’t offer any convincing explanation of how she’d ended up putting her name to paper after paper with fake data.
    This week, almost two years after the news first broke, the process has reached its resolution: Gino was stripped of tenure, the first time Harvard has essentially fired a tenured professor in at least 80 years. (Her defamation lawsuit against the bloggers who found the data manipulation was dismissed last year.)

    What we do right and wrong when it comes to scientific fraud

    Harvard is in the news right now for its war with the Trump administration, which has sent a series of escalating demands to the university, canceled billions of dollars in federal grants and contracts, and is now blocking the university from enrolling international students, all in an apparent attempt to force the university to conform to MAGA’s ideological demands. Stripping a celebrity professor of tenure might not seem like the best look at a moment when Harvard is in an existential struggle for its right to exist as an independent academic institution.
    But the Gino situation, which long predates the conflict with Trump, shouldn’t be interpreted solely through the lens of that fight. Scientific fraud is a real problem, one that is chillingly common across academia. Far from putting the university in a bad light, Harvard’s handling of the Gino case has actually been unusually good, even though it still underscores just how much further academia has to go to ensure scientific fraud becomes rare and is reliably caught and punished.
    There are two parts to fraud response: catching it and punishing it. Academia clearly isn’t very good at the first part. The peer-review process that all meaningful research undergoes tends to start from the default assumption that the data in a reviewed paper is real, and instead focuses on whether the paper represents a meaningful advance and is correctly positioned with respect to other research. Almost no reviewer goes back to check whether what is described in a paper actually happened.
    Fraud, therefore, is often caught only when other researchers actively try to replicate a result or take a close look at the data. Science watchdogs who find these fraud cases tell me that we need a strong expectation that data be made public — which makes it much harder to fake — as well as a scientific culture that embraces replications. (Given the premium journals put on novelty in research and the supreme importance of publishing for academic careers, there’s been little motivation for scientists to pursue replication.) It is these watchdogs, not anyone at Harvard or in the peer-review process, who caught the discrepancies that ultimately sank Gino.

    Crime and no punishment

    Even when fraud is caught, academia too often fails to properly punish it. When third-party investigators bring a concern to the attention of a university, it’s been unusual for the responsible party to actually face consequences.
    One of Gino’s co-authors on one of the retracted papers was Dan Ariely, a star professor of psychology and behavioral economics at Duke University. He, too, has been credibly accused of falsifying data: For example, he published one study that he claimed took place at UCLA with the assistance of researcher Aimee Drolet Rossi. But UCLA says the study didn’t happen there, and Rossi says she did not participate in it. In a past case, he claimed on a podcast to have gotten data from the insurance company Delta Dental, which the company says it did not collect. In another case, an investigation by Duke reportedly found that data from a paper he co-authored with Gino had been falsified, but that there was no evidence Ariely had used fake data knowingly.
    Frankly, I don’t buy this. Maybe an unlucky professor might once end up using data that was faked without their knowledge. But if it happens again, I’m not willing to credit bad luck, and at some point, a professor who keeps “accidentally” using falsified or nonexistent data should be out of a job even if we can’t prove it was no accident. But Ariely, who has maintained his innocence, is still at Duke.
    Or take Olivier Voinnet, a plant biologist, several of whose papers were conclusively demonstrated to contain image manipulation. He was found guilty of misconduct and suspended for two years. It’s hard to imagine a higher scientific sin than faking and manipulating data. If you can’t lose your job for that, the message to young scientists is inevitably that fraud isn’t really that serious.

    What it means to take fraud seriously

    Gino’s loss of tenure, one of a few recent cases where misconduct has had major career consequences, might be a sign that the tides are changing. In 2023, around when the Gino scandal broke, Stanford’s then-president Marc Tessier-Lavigne stepped down after 12 papers he authored were found to contain manipulated data. A few weeks ago, MIT disclosed a data falsification scandal with a terse announcement that the university no longer had confidence in a widely distributed paper “by a former second-year PhD student.” It’s reasonable to assume the student was expelled from the program.
    I hope that these high-profile cases are a sign we are moving in the right direction on scientific fraud, because its persistence is enormously damaging to science. Other researchers waste time and energy following false lines of research substantiated by fake data; in medicine, falsification can outright kill people. But even more than that, research fraud damages the reputation of science at exactly the moment when it is most under attack.
    We should tighten standards to make fraud much harder to commit in the first place, and when it is identified, the consequences should be immediate and serious. Let’s hope Harvard sets a trend.
    A version of this story originally appeared in the Future Perfect newsletter.
  • Atomfall’s Wicked Isle Expansion Gets June Release Date

    Rebellion Developments has revealed the release date for Atomfall’s Wicked Isle Expansion. Dropping June 3, this story-based DLC will send players to a new location called Midsummer Island. Once there, they’ll have to contend with new enemy factions--including infected druids who use severed heads like lanterns--strange fauna, and more, while exploring story threads that could lead to alternate endings for the main game.
    Midsummer Island will seemingly be a tough place to explore thanks to its close proximity to the Windscale Nuclear Plant; the level of infection is higher there than in Atomfall’s other environments. Thankfully, fans won’t be thrown to the radioactive wolves without some form of help. New weapons like the Blunderbuss shotgun will be available on the island. There will also be new skills to unlock, items to craft, and an upgraded metal detector to find. Essentially, players should have enough tools to stave off death for as long as possible.
    Atomfall has proven to be quite the survival game. Featuring an interesting mission structure, challenging combat, and unique RPG elements, it easily stands out amongst the genre’s best. That said, given the conclusive nature of its campaign, the Wicked Isle Expansion makes the prospect of venturing back into its post-apocalyptic environments significantly more appealing.
    Continue Reading at GameSpot
  • Video games' soaring prices have a cost beyond your wallet - the concept of ownership itself

    As the industry's big squeeze reaches consumers, a grim bargain emerges.

    Image credit: Adobe Stock, Microsoft

    Opinion

    by Chris Tapsell
    Deputy Editor

    Published on May 22, 2025

    Earlier this month, Microsoft bumped up the prices of its entire range of Xbox consoles, first-party video games, and most (or in the US, all) of its accessories. It comes a few weeks after Nintendo revealed a £396 Switch 2, with £75 copies of its own first-party fare in Mario Kart World, and a few months after Sony launched the exorbitant £700 PS5 Pro (stand and disc drive not included), a £40 price rise for its all-digital console in the UK, the second of this generation, and news that it's considering even more price rises in the months to come.
    The suspicion - or depending on where you live, perhaps hope - had been that when Donald Trump's ludicrously flip-flopping, self-defeating tariffs came into play, the US would bear the brunt of it. The reality is that we're still waiting on the full effects. But it's also clear, already, that this is far from just an American problem. The platform-holders are already spreading the costs, presumably to avoid an outright doubling of prices in one of their largest markets. PS5s in Japan now cost £170 more than they did at launch.
    That price rise, mind, took place long before the tariffs, as did the £700 PS5 Pro (stand and disc drive not included!), and the creeping costs of subscriptions such as Game Pass and PS Plus. Nor is it immediately clear how that justifies charging $80 for, say, a copy of Borderlands 4, a price which hasn't been confirmed but which has still been justified by the ever graceful Randy Pitchford, a man who seems to stride across the world with one foot perpetually bared and ready to be put, squelching, square in it, and who says true fans will still "find a way" to buy his game.
    The truth is inflation has been at it here for a while, and that inflation is a funny beast, one which often comes with an awkward mix of genuine unavoidability - tariffs, wars, pandemics - and concealed opportunism. Games are their own case amongst the many, their prices instead impacted more by the cost of labour, which soars not because developers are paid particularly well (I can hear their scoffs from here) but because of the continued, lagging impact of their executives' total miscalculation, in assuming triple-A budgets and timescales could continue growing exponentially. And by said opportunism - peep how long it took for Microsoft and the like to announce those bumped prices after Nintendo came in with Mario Kart at £75.
    Anyway, the causes are, in a sense, kind of moot. The result of all this squeezing from near enough all angles of gaming's corporate world is less a pincer manoeuvre on the consumer than a suffocating, immaculately executed full-court press, a full team hurtling with ruthless speed towards the poor unwitting sucker at home on the sofa. Identifying whether gaming costs a fortune now for reasons we can or can't sympathise with does little to change the fact that gaming costs a fortune. And, to be clear, it really does cost a fortune.

    Things are getting very expensive in the world of video games. £700 for a PS5 Pro! | Image credit: Eurogamer

    Whenever complaints about video game prices come up there is naturally a bit of pushback - games have always been expensive! What about the 90s! - usually via attempts to draw conclusions from economic data. Normally I'd be all on board with this - numbers can't lie! - but in this case it's a little different. Numbers can't lie, but they can, sometimes, be manipulated to prove almost anything you want - or just as often, simply misunderstood to the same ends. (Take most back-of-a-cigarette-packet attempts at doing the maths here, and the infinite considerations to bear in mind: Have you adjusted for inflation? How about for cost of living, as if the rising price of everything else may somehow make expensive games more palatable? Or share of disposable average household salary? For exchange rates? Purchasing power parity? Did you use the mean or the median for average income? What about cost-per-frame of performance? How much value do you place on moving from 1080p to 1440p? Does anyone sit close enough to their TV to tell enough of a difference with 4K?! Ahhhhh!)
    Instead, it's worth remembering that economics isn't just a numerical science. It is also a behavioural one - a psychological one. The impact of pricing is as much in the mind as it is on the spreadsheet, hence these very real notions of "consumer confidence" and pricing that continues to end in ".99". And so sometimes with pricing I find it helps to borrow another phrase from sport, alongside that full-court press, in the "eye test". Sports scouts use all kinds of numerical data to analyse prospective players these days, but the best ones still marry that with a bit of old-school viewing in the flesh. If a player looks good on paper and passes the eye test, they're probably the real deal. Likewise, if the impact of buying an $80 video game at full price looks unclear in the data, but to your human eye feels about as wince-inducing as biting into a raw onion like it's an apple, and then rubbing said raw onion all over said eye, it's probably extremely bloody expensive and you should stop trying to be clever.
    Video games, to me, do feel bloody expensive. If I weren't in the incredibly fortunate position of being able to source or expense most of them for work I am genuinely unsure if I'd be continuing with them as a hobby - at least beyond shifting my patterns, as so many players have over the years, away from premium console and PC games to the forever-tempting, free-to-play time-vampires like Fortnite or League of Legends. Which leads, finally, to the real point here: that there is another cost to rising game and console prices, beyond the one hitting you square in the wallet.

    How much is GTA 6 going to cost? $80 or more? | Image credit: Rockstar

    The other cost - perhaps the real cost, when things settle - is the notion of ownership itself. Plenty of physical media collectors, aficionados and diehards will tell you this has been locked in the sights of this industry for a long time, of course. They will point to gaming's sister entertainment industries of music, film and television, and the paradigm shift to streaming in each, as a sign of the inevitability of it all. And they will undoubtedly have a point. But this step change in the cost of gaming will only be an accelerant.
    Understanding that only takes a quick glance at the strategy of, say, Xbox in recent years. While Nintendo is still largely adhering to the buy-it-outright tradition and Sony is busy shooting off its toes with live service-shaped bullets, Microsoft has, like it or not, positioned itself rather deftly. After jacking up the cost of its flatlining hardware and platform-agnostic games, Xbox, its execs would surely argue, is also now rather counterintuitively the home of value gaming - if only because Microsoft itself is the one hoiking up the cost of your main alternative. Because supplanting the waning old faithfuls in this kind of scenario - trade-ins, short-term rentals - is, you guessed it, Game Pass.
    You could even argue the consoles are factored in here too. Microsoft, with its "this is an Xbox" campaign and long-stated ambition to reach players in the billions, has made it plain that it doesn't care where you play its games, as long as you're playing them. When all physical consoles are jumping up in price, thanks to that rising tide effect of inflation, the platform that lets you spend £15 a month to stream Clair Obscur: Expedition 33, Oblivion Remastered and the latest Doom straight to your TV without even buying one is, at least in theory (and not forgetting the BDS call for a boycott of them), looking like quite an attractive proposition.
    Xbox, for its part, has been chipping away at this idea for a while - we at Eurogamer had opinions about team green's disregard for game ownership as far back as the reveal of the Xbox One, in the ancient times of 2013. Then it was a different method, the once-horrifying face of digital rights management, or DRM, along with regulated digital game sharing and online-only requirements. Here in 2025 - with that disdain now platform-agnostic, games being disappeared from people's libraries, platforms like Steam forced by law to remind you that you're not actually buying your games at all, older games increasingly playable only via subscriptions to Nintendo, Sony, and now Xbox, and bosses making wild claims about AI's ability to "preserve" old games by making terrible facsimiles of them - that seems slightly quaint.
    More directly, Xbox has been talking about this very openly since at least 2021. As Ben Decker, then head of gaming services marketing at Xbox, said to me at the time: "Our goal for Xbox Game Pass really ladders up to our goal at Xbox, to reach the more than 3 billion gamers worldwide… we are building a future with this in mind."
    Four years on, that future might be now. Jacking up the cost of games and consoles alone won't do anything to grow gaming's userbase, still the panacea touted by the industry's top brass. Quite the opposite, obviously (although the Switch 2 looks set to still be massive, and the PS5, with all its price rises, still tracks in line with the price-cut PS4). But funneling more and more core players away from owning games, and towards a newly incentivised world where they merely pay a comparatively low monthly fee to access them, might just. How much of a difference that will truly make, and the consequences of it, remain up for debate of course. We've seen the impact of streaming on the other entertainment industries in turn, none for the better, but games are a medium of their own.
    Perhaps there's still a little room for optimism. Against the tide there are still organisations like Does It Play? and the Video Game History Foundation, or platforms such as itch.io and GOG (nothing without its flaws, of course), that exist precisely because of the growing resistance to that current. Just this week, Lost in Cult launched a new wave of luxurious, always-playable physical editions of acclaimed games, another small act of defiance - though perhaps another sign things are going the way of film and music, where purists splurge on vinyl and Criterion Collection Blu-rays but the vast majority remain on Netflix and Spotify. And as uncomfortable as it may be to hear for those - including this author! - who wish for this medium to be preserved and cared for like any other great artform, there will be some who argue that a model where more games can be enjoyed by more people, for a lower cost, is worth it.

    Game Pass often offers great value, but the library is always in a state of flux. Collectors may need to start looking at high-end physical editions. | Image credit: Microsoft

    There's also another point to bear in mind here. Nightmarish as it may be for preservation and consumer rights, against the backdrop of endless layoffs and instability many developers tout the stability of a predefined Game Pass or PS Plus deal over taking a punt in the increasingly crowded, choppy seas of the open market. Bethesda this week has just boasted Doom: The Dark Ages' achievement of becoming the most widely-playedDoom game ever. That despite it reaching only a fraction of peak Steam concurrents in the same period as its predecessor, Doom: Eternal - a sign, barring some surprise shift away from PC gaming to consoles, that people really are beginning to choose playing games on Game Pass over buying them outright. The likes of Remedy and Rebellion tout PS Plus and Game Pass as stabilisers, or even accelerants, for their games launching straight onto the services. And independent studios and publishers of varying sizes pre-empted that when we spoke to them for a piece about this exact this point, more than four years ago - in a sense, we're still waiting for a conclusive answer to a question we first began investigating back in 2021: Is Xbox Game Pass just too good to be true?
    We've talked, at this point, at great length about how this year would be make-or-break for the triple-A model in particular. About how the likes of Xbox, or Warner Bros., or the many others have lost sight of their purpose - and in the process, their path to sustainability - in the quest for exponential growth. How £700 Pro edition consoles are an argument against Pro editions altogether. And about how, it's becoming clear, the old industry we once knew is no more, with its new form still yet to take shape.
    There's an argument now, however, that a grim new normal for preservation and ownership may, just as grimly, be exactly what the industry needs to save itself. It would be in line with what we've seen from the wider world of technology and media - and really, the wider world itself. A shift from owning to renting. That old chestnut of all the capital slowly rising, curdling at the top. The public as mere tenants in a house of culture owned by someone, somewhere else. It needn't be this way, of course. If this all sounds like a particularly unfavourable trade-in, remember this too: it's one that could almost certainly have been avoided.
    #video #games039 #soaring #prices #have
    Video games' soaring prices have a cost beyond your wallet - the concept of ownership itself
    Video games' soaring prices have a cost beyond your wallet - the concept of ownership itself As the industry's big squeeze reaches consumers, a grim bargain emerges. Image credit: Adobe Stock, Microsoft Opinion by Chris Tapsell Deputy Editor Published on May 22, 2025 Earlier this month, Microsoft bumped up the prices of its entire range of Xbox consoles, first-party video games, and mostof its accessories. It comes a few weeks after Nintendo revealed a £396 Switch 2, with £75 copies of its own first-party fare in Mario Kart World, and a few months after Sony launched the exorbitant £700 PS5 Pro, a £40 price rise for its all-digital console in the UK, the second of this generation, and news that it's considering even more price rises in the months to come. The suspicion - or depending on where you live, perhaps hope - had been that when Donald Trump's ludicrously flip-flopping, self-defeating tariffs came into play, that the US would bear the brunt of it. The reality is that we're still waiting on the full effects. But it's also clear, already, that this is far from just an American problem. The platform-holders are already spreading the costs, presumably to avoid an outright doubling of prices in one of their largest markets. PS5s in Japan now cost £170 more than they did at launch. That price rise, mind, took place long before the tariffs, as did the £700 PS5 Pro, and the creeping costs of subscriptions such as Game Pass and PS Plus. Nor is it immediately clear how that justifies charging for, say, a copy of Borderlands 4, a price which hasn't been confirmed but which has still been justified by the ever graceful Randy Pitchford, a man who seems to stride across the world with one foot perpetually bared and ready to be put, squelching, square in it, and who says true fans will still "find a way" to buy his game. The truth is inflation has been at it here for a while, and that inflation is a funny beast, one which often comes with an awkward mix of genuine unavoidability - tariffs, wars, pandemics - and concealed opportunism. Games are their own case amongst the many, their prices instead impacted more by the cost of labour, which soars not because developers are paid particularly wellbut because of the continued, lagging impact of their executives' total miscalculation, in assuming triple-A budgets and timescales could continue growing exponentially. And by said opportunism - peep how long it took for Microsoft and the like to announce those bumped prices after Nintendo came in with Mario Kart at £75. Anyway, the causes are, in a sense, kind of moot. The result of all this squeezing from near enough all angles of gaming's corporate world is less a pincer manoeuvre on the consumer than a suffocating, immaculately executed full-court press, a full team hurtling with ruthless speed towards the poor unwitting sucker at home on the sofa. Identifying whether gaming costs a fortune now for reasons we can or can't sympathise with does little to change the fact that gaming costs a fortune. And, to be clear, it really does cost a fortune. Things are getting very expensive in the world of video games. £700 for a PS5 Pro! | Image credit: Eurogamer Whenever complaints about video game prices come up there is naturally a bit of pushback - games have always been expensive! What about the 90s! - usually via attempts to draw conclusions from economic data. Normally I'd be all on board with this - numbers can't lie! - but in this case it's a little different. 
Numbers can't lie, but they can, sometimes, be manipulated to prove almost anything you want - or just as often, simply misunderstood to the same ends.Instead, it's worth remembering that economics isn't just a numerical science. It is also a behavioural one - a psychological one. The impact of pricing is as much in the mind as it is on the spreadsheet, hence these very real notions of "consumer confidence" and pricing that continues to end in ".99". And so sometimes with pricing I find it helps to borrow another phrase from sport, alongside that full-court press, in the "eye test". Sports scouts use all kinds of numerical data to analyse prospective players these days, but the best ones still marry that with a bit of old-school viewing in the flesh. If a player looks good on paper and passes the eye test, they're probably the real deal. Likewise, if the impact of buying an video game at full price looks unclear in the data, but to your human eye feels about as whince-inducing as biting into a raw onion like it's an apple, and then rubbing said raw onion all over said eye, it's probably extremely bloody expensive and you should stop trying to be clever. Video games, to me, do feel bloody expensive. If I weren't in the incredibly fortunate position of being able to source or expense most of them for work I am genuinely unsure if I'd be continuing with them as a hobby - at least beyond shifting my patterns, as so many players have over the years, away from premium console and PC games to the forever-tempting, free-to-play time-vampires like Fortnite or League of Legends. Which leads, finally, to the real point here: that there is another cost to rising game and console prices, beyond the one hitting you square in the wallet. How much is GTA 6 going to cost? or more? | Image credit: Rockstar The other cost - perhaps the real cost, when things settle - is the notion of ownership itself. Plenty of physical media collectors, aficionados and diehards will tell you this has been locked in the sights of this industry for a long time, of course. They will point to gaming's sister entertainment industries of music, film and television, and the paradigm shift to streaming in each, as a sign of the inevitability of it all. And they will undoubtedly have a point. But this step change in the cost of gaming will only be an accelerant. Understanding that only takes a quick glance at the strategy of, say, Xbox in recent years. While Nintendo is still largely adhering to the buy-it-outright tradition and Sony is busy shooting off its toes with live service-shaped bullets, Microsoft has, like it or not, positioned itself rather deftly. After jacking up the cost of its flatlining hardware and platform-agnostic games, Xbox, its execs would surely argue, is also now rather counterintuitively the home of value gaming - if only because Microsoft itself is the one hoiking up the cost of your main alternative. Because supplanting the waning old faithfuls in this kind of scenario - trade-ins, short-term rentals - is, you guessed it, Game Pass. You could even argue the consoles are factored in here too. Microsoft, with its "this is an Xbox" campaign and long-stated ambition to reach players in the billions, has made it plain that it doesn't care where you play its games, as long as you're playing them. 
When all physical consoles are jumping up in price, thanks to that rising tide effect of inflation, the platform that lets you spend £15 a month to stream Clair Obscur: Expedition 33, Oblivion Remastered and the latest Doom straight to your TV without even buying one is, at least in theorylooking like quite an attractive proposition. Xbox, for its part, has been chipping away at this idea for a while - we at Eurogamer had opinions about team green's disregard for game ownership as far back as the reveal of the Xbox One, in the ancient times of 2013. Then it was a different method, the once-horrifying face of digital rights management, or DRM, along with regulated digital game sharing and online-only requirements. Here in 2025, with that disdain now platform-agnostic, and where games are being disappeared from people's libraries, platforms like Steam are, by law, forced to remind you that you're not actually buying your games at all, where older games are increasingly only playable via subscriptions to Nintendo, Sony, and now Xbox, and bosses are making wild claims about AI's ability to "preserve" old games by making terrible facsimiles of them, that seems slightly quaint. More directly, Xbox has been talking about this very openly since at least 2021. As Ben Decker, then head of gaming services marketing at Xbox, said to me at the time: "Our goal for Xbox Game Pass really ladders up to our goal at Xbox, to reach the more than 3 billion gamers worldwide… we are building a future with this in mind." Four years on, that future might be now. Jacking up the cost of games and consoles alone won't do anything to grow gaming's userbase, that being the touted panacea still by the industry's top brass. Quite the opposite, obviously. But funneling more and more core players away from owning games, and towards a newly incentivised world where they merely pay a comparatively low monthly fee to access them, might just. How much a difference that will truly make, and the consequences of it, remain up for debate of course. We've seen the impact of streaming on the other entertainment industries in turn, none for the better, but games are a medium of their own. Perhaps there's still a little room for optimism. Against the tide there are still organisations like Does It Play? and the Game History Foundation, or platforms such as itch.io and GOG, that exist precisely because of the growing resistance to that current. Just this week, Lost in Cult launched a new wave of luxurious, always-playable physical editions of acclaimed games, another small act of defiance - though perhaps another sign things are going the way of film and music, where purists splurge on vinyl and Criterion Collection BluRays but the vast majority remain on Netflix and Spotify. And as uncomfortable as it may be to hear for those - including this author! - who wish for this medium to be preserved and cared for like any other great artform, there will be some who argue that a model where more games can be enjoyed by more people, for a lower cost, is worth it. Game Pass often offers great value, but the library is always in a state of flux. Collectors may need to start looking at high-end physical editions. | Image credit: Microsoft There's also another point to bear in mind here. 
Nightmarish as it may be for preservation and consumer rights, against the backdrop of endless layoffs and instability many developers tout the stability of a predefined Game Pass or PS Plus deal over taking a punt in the increasingly crowded, choppy seas of the open market. Bethesda this week has just boasted Doom: The Dark Ages' achievement of becoming the most widely-playedDoom game ever. That despite it reaching only a fraction of peak Steam concurrents in the same period as its predecessor, Doom: Eternal - a sign, barring some surprise shift away from PC gaming to consoles, that people really are beginning to choose playing games on Game Pass over buying them outright. The likes of Remedy and Rebellion tout PS Plus and Game Pass as stabilisers, or even accelerants, for their games launching straight onto the services. And independent studios and publishers of varying sizes pre-empted that when we spoke to them for a piece about this exact this point, more than four years ago - in a sense, we're still waiting for a conclusive answer to a question we first began investigating back in 2021: Is Xbox Game Pass just too good to be true? We've talked, at this point, at great length about how this year would be make-or-break for the triple-A model in particular. About how the likes of Xbox, or Warner Bros., or the many others have lost sight of their purpose - and in the process, their path to sustainability - in the quest for exponential growth. How £700 Pro edition consoles are an argument against Pro editions altogether. And about how, it's becoming clear, the old industry we once knew is no more, with its new form still yet to take shape. There's an argument now, however, that a grim new normal for preservation and ownership may, just as grimly, be exactly what the industry needs to save itself. It would be in line with what we've seen from the wider world of technology and media - and really, the wider world itself. A shift from owning to renting. That old chestnut of all the capital slowly rising, curdling at the top. The public as mere tenants in a house of culture owned by someone, somewhere else. It needn't have to be this way, of course. If this all sounds like a particularly unfavourable trade-in, remember this too: it's one that could almost certainly have been avoided. #video #games039 #soaring #prices #have
    WWW.EUROGAMER.NET
    Video games' soaring prices have a cost beyond your wallet - the concept of ownership itself
    Video games' soaring prices have a cost beyond your wallet - the concept of ownership itself As the industry's big squeeze reaches consumers, a grim bargain emerges. Image credit: Adobe Stock, Microsoft Opinion by Chris Tapsell Deputy Editor Published on May 22, 2025 Earlier this month, Microsoft bumped up the prices of its entire range of Xbox consoles, first-party video games, and most (or in the US, all) of its accessories. It comes a few weeks after Nintendo revealed a £396 Switch 2, with £75 copies of its own first-party fare in Mario Kart World, and a few months after Sony launched the exorbitant £700 PS5 Pro (stand and disc drive not included), a £40 price rise for its all-digital console in the UK, the second of this generation, and news that it's considering even more price rises in the months to come. The suspicion - or depending on where you live, perhaps hope - had been that when Donald Trump's ludicrously flip-flopping, self-defeating tariffs came into play, that the US would bear the brunt of it. The reality is that we're still waiting on the full effects. But it's also clear, already, that this is far from just an American problem. The platform-holders are already spreading the costs, presumably to avoid an outright doubling of prices in one of their largest markets. PS5s in Japan now cost £170 more than they did at launch. That price rise, mind, took place long before the tariffs, as did the £700 PS5 Pro (stand and disc drive not included!), and the creeping costs of subscriptions such as Game Pass and PS Plus. Nor is it immediately clear how that justifies charging $80 for, say, a copy of Borderlands 4, a price which hasn't been confirmed but which has still been justified by the ever graceful Randy Pitchford, a man who seems to stride across the world with one foot perpetually bared and ready to be put, squelching, square in it, and who says true fans will still "find a way" to buy his game. The truth is inflation has been at it here for a while, and that inflation is a funny beast, one which often comes with an awkward mix of genuine unavoidability - tariffs, wars, pandemics - and concealed opportunism. Games are their own case amongst the many, their prices instead impacted more by the cost of labour, which soars not because developers are paid particularly well (I can hear their scoffs from here) but because of the continued, lagging impact of their executives' total miscalculation, in assuming triple-A budgets and timescales could continue growing exponentially. And by said opportunism - peep how long it took for Microsoft and the like to announce those bumped prices after Nintendo came in with Mario Kart at £75. Anyway, the causes are, in a sense, kind of moot. The result of all this squeezing from near enough all angles of gaming's corporate world is less a pincer manoeuvre on the consumer than a suffocating, immaculately executed full-court press, a full team hurtling with ruthless speed towards the poor unwitting sucker at home on the sofa. Identifying whether gaming costs a fortune now for reasons we can or can't sympathise with does little to change the fact that gaming costs a fortune. And, to be clear, it really does cost a fortune. Things are getting very expensive in the world of video games. £700 for a PS5 Pro! | Image credit: Eurogamer Whenever complaints about video game prices come up there is naturally a bit of pushback - games have always been expensive! What about the 90s! - usually via attempts to draw conclusions from economic data. 
Normally I'd be all on board with this - numbers can't lie! - but in this case it's a little different. Numbers can't lie, but they can, sometimes, be manipulated to prove almost anything you want - or just as often, simply misunderstood to the same ends. (Take most back-of-a-cigarette-packet attempts at doing the maths here, and the infinite considerations to bear in mind: Have you adjusted for inflation? How about for cost of living, as if the rising price of everything else may somehow make expensive games more palatable? Or share of disposable average household salary? For exchange rates? Purchasing power parity? Did you use the mean or the median for average income? What about cost-per-frame of performance? How much value do you place on moving from 1080p to 1440p? Does anyone sit close enough to their TV to tell enough of a difference with 4K?! Ahhhhh!) Instead, it's worth remembering that economics isn't just a numerical science. It is also a behavioural one - a psychological one. The impact of pricing is as much in the mind as it is on the spreadsheet, hence these very real notions of "consumer confidence" and pricing that continues to end in ".99". And so sometimes with pricing I find it helps to borrow another phrase from sport, alongside that full-court press, in the "eye test". Sports scouts use all kinds of numerical data to analyse prospective players these days, but the best ones still marry that with a bit of old-school viewing in the flesh. If a player looks good on paper and passes the eye test, they're probably the real deal. Likewise, if the impact of buying an $80 video game at full price looks unclear in the data, but to your human eye feels about as whince-inducing as biting into a raw onion like it's an apple, and then rubbing said raw onion all over said eye, it's probably extremely bloody expensive and you should stop trying to be clever. Video games, to me, do feel bloody expensive. If I weren't in the incredibly fortunate position of being able to source or expense most of them for work I am genuinely unsure if I'd be continuing with them as a hobby - at least beyond shifting my patterns, as so many players have over the years, away from premium console and PC games to the forever-tempting, free-to-play time-vampires like Fortnite or League of Legends. Which leads, finally, to the real point here: that there is another cost to rising game and console prices, beyond the one hitting you square in the wallet. How much is GTA 6 going to cost? $80 or more? | Image credit: Rockstar The other cost - perhaps the real cost, when things settle - is the notion of ownership itself. Plenty of physical media collectors, aficionados and diehards will tell you this has been locked in the sights of this industry for a long time, of course. They will point to gaming's sister entertainment industries of music, film and television, and the paradigm shift to streaming in each, as a sign of the inevitability of it all. And they will undoubtedly have a point. But this step change in the cost of gaming will only be an accelerant. Understanding that only takes a quick glance at the strategy of, say, Xbox in recent years. While Nintendo is still largely adhering to the buy-it-outright tradition and Sony is busy shooting off its toes with live service-shaped bullets, Microsoft has, like it or not, positioned itself rather deftly. 
    After jacking up the cost of its flatlining hardware and its platform-agnostic games, Xbox, its execs would surely argue, is also now - rather counterintuitively - the home of value gaming, if only because Microsoft itself is the one hoiking up the cost of your main alternative. Because supplanting the waning old faithfuls in this kind of scenario - trade-ins, short-term rentals - is, you guessed it, Game Pass.

    You could even argue the consoles are factored in here too. Microsoft, with its "this is an Xbox" campaign and long-stated ambition to reach players in the billions, has made it plain that it doesn't care where you play its games, as long as you're playing them. When all physical consoles are jumping up in price, thanks to that rising-tide effect of inflation, the platform that lets you spend £15 a month to stream Clair Obscur: Expedition 33, Oblivion Remastered and the latest Doom straight to your TV without even buying one is, at least in theory (and not forgetting the BDS call for a boycott of them), looking like quite an attractive proposition.

    Xbox, for its part, has been chipping away at this idea for a while - we at Eurogamer had opinions about team green's disregard for game ownership as far back as the reveal of the Xbox One, in the ancient times of 2013. Then it was a different method: the once-horrifying face of digital rights management, or DRM, along with regulated digital game sharing and online-only requirements. Here in 2025 - with that disdain now platform-agnostic, games being disappeared from people's libraries, platforms like Steam forced by law to remind you that you're not actually buying your games at all, older games increasingly only playable via subscriptions to Nintendo, Sony and now Xbox, and bosses making wild claims about AI's ability to "preserve" old games by making terrible facsimiles of them - that seems slightly quaint.

    More directly, Xbox has been talking about this very openly since at least 2021. As Ben Decker, then head of gaming services marketing at Xbox, said to me at the time: "Our goal for Xbox Game Pass really ladders up to our goal at Xbox, to reach the more than 3 billion gamers worldwide… we are building a future with this in mind." Four years on, that future might be now.

    Jacking up the cost of games and consoles alone won't do anything to grow gaming's userbase, still the touted panacea of the industry's top brass. Quite the opposite, obviously (although the Switch 2 looks set to be massive regardless, and the PS5, with all its price rises, still tracks in line with the price-cut PS4). But funneling more and more core players away from owning games, and towards a newly incentivised world where they merely pay a comparatively low monthly fee to access them, might just.

    How much of a difference that will truly make, and the consequences of it, remain up for debate, of course. We've seen the impact of streaming on the other entertainment industries in turn, none for the better, but games are a medium of their own. Perhaps there's still a little room for optimism. Against the tide there are still organisations like Does It Play? and the Video Game History Foundation, or platforms such as itch.io and GOG (nothing without its flaws, of course), that exist precisely because of the growing resistance to that current.
    Just this week, Lost in Cult launched a new wave of luxurious, always-playable physical editions of acclaimed games - another small act of defiance, though perhaps also another sign things are going the way of film and music, where purists splurge on vinyl and Criterion Collection Blu-rays while the vast majority remain on Netflix and Spotify. And as uncomfortable as it may be to hear for those - including this author! - who wish for this medium to be preserved and cared for like any other great artform, there will be some who argue that a model where more games can be enjoyed by more people, for a lower cost, is worth it.

    Game Pass often offers great value, but the library is always in a state of flux. Collectors may need to start looking at high-end physical editions. | Image credit: Microsoft

    There's also another point to bear in mind here. Nightmarish as it may be for preservation and consumer rights, against the backdrop of endless layoffs and instability many developers tout the stability of a predefined Game Pass or PS Plus deal over taking a punt in the increasingly crowded, choppy seas of the open market. Bethesda this week boasted of Doom: The Dark Ages becoming the most widely-played (note: not fastest-selling) Doom game ever - this despite it reaching only a fraction of the peak Steam concurrents of its predecessor, Doom Eternal, in the same period; a sign, barring some surprise shift away from PC gaming to consoles, that people really are beginning to choose playing games on Game Pass over buying them outright. The likes of Remedy and Rebellion tout PS Plus and Game Pass as stabilisers, or even accelerants, for their games launching straight onto the services. And independent studios and publishers of varying sizes pre-empted all this when we spoke to them for a piece on this exact point more than four years ago - in a sense, we're still waiting for a conclusive answer to a question we first began investigating back in 2021: Is Xbox Game Pass just too good to be true?

    We've talked, at this point, at great length about how this year would be make-or-break for the triple-A model in particular. About how the likes of Xbox, or Warner Bros., or the many others have lost sight of their purpose - and in the process, their path to sustainability - in the quest for exponential growth. How £700 Pro edition consoles are an argument against Pro editions altogether. And about how, it's becoming clear, the old industry we once knew is no more, with its new form still yet to take shape.

    There's an argument now, however, that a grim new normal for preservation and ownership may, just as grimly, be exactly what the industry needs to save itself. It would be in line with what we've seen from the wider world of technology and media - and really, the wider world itself. A shift from owning to renting. That old chestnut of all the capital slowly rising, curdling at the top. The public as mere tenants in a house of culture owned by someone, somewhere else.

    It needn't have to be this way, of course. If this all sounds like a particularly unfavourable trade-in, remember this too: it's one that could almost certainly have been avoided.
  • The Fortnite saga shows Apple at its pettiest–and most vulnerable

    Macworld

    Just when you think Apple’s long-running dispute with Epic Games is finally over, someone manages to find a way to spin it out even longer. There’s always time for one more lawsuit, one more appeal, one more acrimonious tweet. And if two multibillion-dollar corporations can’t find a way to resolve their differences amicably, what hope have the rest of us?

    At the end of April, Judge Yvonne Gonzalez Rogers issued what felt at the time like a conclusively unambiguous ruling demanding that Apple comply instantly with previous measures and adding more on top. The company, it was made plain, will not merely have to allow other payment systems within iOS apps, but refrain from sabotaging them with satirically high fees and off-putting verbiage.

    That should have been that, but Cupertino is still feeling punchy. Epic thought, not unreasonably, that its Fortnite game would now be allowed back on the App Store, having originally been kicked off for the practices which Apple has just been told it must allow. But life is full of surprises. “Apple has blocked our Fortnite submission,” Epic tweeted on May 16, “so we cannot release to the U.S. App Store or to the Epic Games Store for iOS in the European Union.”

    But here’s the kicker: Epic promptly decided to retaliate by taking the game dark on iOS worldwide, including versions delivered through Epic’s own store. That might sound like cutting off your nose to spite your face, but this will hurt Apple too. And users. In fact barely anyone will have a nose by the time this is over.

    On the face of it, it makes little sense to prevent Fortnite from returning to iOS. It’s an immensely popular game with hundreds of millions of players across a wide range of platforms, and being able to play it on iPhone makes the iPhone a more appealing device. Even if Apple made zero revenue from sales and in-app purchases, it would still be worth having the title on iOS purely for the sake of user happiness. Conversely, refusing to allow it generates masses of bad PR and resentment, which is poison to a company that depends so much on its image.

    In this particular case, that zero-revenue idea isn’t entirely inconceivable, because Epic is a big enough company with a reputable enough storefront that it could (assuming the judge’s ruling isn’t watered down on appeal) cut Apple entirely out of the transactional loop. But most games developers don’t have that luxury. They will, at the very least, offer their own and Apple’s payment systems side by side, and I’m convinced that most users would vastly, vastly prefer to buy through the official App Store. Ask yourself this: How easy is it to find jailbreaking instructions online? And yet, how many (or how few) people do you know who actually jailbreak their iPhones?

    While following this saga, it’s occurred to me that I don’t actually want to buy iOS apps from alternative stores. I just want that to be an option because competition is healthy and would encourage Apple to lower fees and generally treat developers better. Having a bunch of different places to go whenever I want to buy something new or update what I’ve got is a dismaying prospect; I don’t want to enter my payment details in a dozen different websites and constantly worry about whether they can be trusted. But other people might. The App Store itself is one of Apple’s all-time great products, delivering such a reassuring, frictionless one-stop-shop experience that software became for the first time an impulse buy. It won’t lose its magic just because there’s another option.

    I get why Apple tried to maintain its control of iOS app installs and purchases; it has shareholders to think about and can’t just wave away swathes of revenue without pushback. But the vigor and uncompromising fierceness with which the company has defended that revenue stream has been alarming, to say the least, and may prove costly too. At multiple points in the saga Apple could have yielded a little and reached a compromise acceptable to all parties: allowed sideloading under strict limits; allowed third-party payment systems with a moderate instead of laughably high revenue cut. Instead it pushed and fought and, as the judge put it, “willfully chose not to comply.” And now it appears to be pursuing petty revenge which will hurt itself just as much as the opposition, for no clear rational reason.

    Saying that someone never knows when they’re beaten sounds like a compliment (“People should know when they’re conquered.” “Would you, Quintus? Would I?”) but often it just means suffering more than you need to for a lost cause. Apple lost this war, and choice was the winner. Now it needs to accept the result and get on with making that outcome work for its users… who are, after all, supposed to be the priority.

    Foundry

    Welcome to our weekly Apple Breakfast column, which includes all the Apple news you missed last week in a handy bite-sized roundup. We call it Apple Breakfast because we think it goes great with a Monday morning cup of coffee or tea, but it’s cool if you want to give it a read during lunch or dinner hours too.

    Trending: Top stories

    Apple is doomed because Apple isn’t doomed.

    Mahmoud Itani reveals 10 hidden Apple Watch features you’ve probably never used–but should.

    If the iPhone 17 Air is anything like the Galaxy S25 Edge, I don’t want it.

    The first thing Jason Cross installs on every new Mac is this little free emoji utility.

    If iPadOS 19 is going to be more like the Mac, it needs these 9 features stat.

    Podcast of the week

    When WWDC rolls around in June, it’ll be two years since Apple announced visionOS and the Apple Vision Pro. In episode 934 of the Macworld Podcast, we examine the state of Apple’s spatial computing platform and what Apple could have in store at WWDC and beyond.

    You can catch every episode of the Macworld Podcast on YouTube, Spotify, Soundcloud, the Podcasts app, or our own site.

    Reviews corner

    iPad Air (M3) review: Only slightly better, but still the best.

    Verbatim TurboMetal SSD review: Stylish portable drive.

    PowerPhotos 3 review: Time-saving Apple Photos tool for power users on the Mac.

    EcoFlow Power Hat review: A sun hat with solar panels.

    The rumor mill

    Apple is reportedly working on the hybrid Mac we all want–and it could arrive in 2028.

    Folding and curved iPhones are both predicted in 2027 ‘product blitz.’

    In fact the iPhone 18’s edgeless curved display seems like a certainty now.

    iOS 19 will reportedly use AI to extend the iPhone’s battery life.

    Software updates, bugs, and problems

    A bizarre iPhone bug is causing some audio messages to fail. Here’s why.

    iOS 18.5 may be a minor update, but it has one major iPhone upgrade.

    Apple’s months-old C1 modem has a serious security flaw.

    And with that, we’re done for this week’s Apple Breakfast. If you’d like to get regular roundups, sign up for our newsletters, including our new email from The Macalope–an irreverent, humorous take on the latest news and rumors from a half-man, half-mythical Mac beast. You can also follow us on Facebook, Threads, Bluesky, or X for discussion of breaking Apple news stories. See you next Monday, and stay Appley.
  • On this day: May 21

    May 21: World Day for Cultural Diversity for Dialogue and Development

    Busoni c. 1897

    1138 – The Crusades: The siege of Shaizar ended, and the Emir of Shaizar became a vassal of the Byzantine Empire.
    1864 – American Civil War: The inconclusive Battle of Spotsylvania Court House in Virginia ended with combined Union and Confederate casualties totaling around 31,000.
    1894 – The Manchester Ship Canal, linking Manchester in North West England to the Irish Sea, officially opened, becoming the world's largest navigation canal at the time.
    1925 – The opera Doktor Faust, unfinished when composer Ferruccio Busoni (pictured) died, was premiered in Dresden.
    2014 – A Taiwanese man carried out a stabbing spree on a Taipei Metro train, killing four people and injuring 24 others.
    Feng Dao (d. 954) · Tommaso Campanella (d. 1639) · Armand Hammer (b. 1898) · Linda Laubenstein (b. 1947)

    More anniversaries:
    May 20
    May 21
    May 22

  • Apple could be in serious trouble over Fortnite rejection

    Macworld

    It would be inaccurate to claim that either Apple or Epic Games has decisively won their acrimonious and long-running legal dispute over the use of external payment links in iPhone apps: courts have sided with both companies at various times and in various aspects of the case. But Epic seems to be getting the better of things, after a judge angrily ruled at the end of April that Apple must allow such links and called its previous response “insubordination.”

    That sounds conclusive, and is a potential financial hammer blow: Apple makes a great deal of money from transactions in iOS apps, and its cut may be about to shrink. But the question remains of what now happens to Fortnite, the game that triggered the dispute back in 2020. Epic thinks it should be allowed back on the App Store, because it was banned for something that must now be allowed, but Apple thinks it was within its rights to ban the game under the rules at the time and won’t even consider a reversal until all litigation is over.

    Whether Apple is wise to behave in this way is debatable. Refusing to allow Fortnite to return hurts iPhone owners as much as it hurts Epic, and it feels like petty retaliation. But whether it’s legally justifiable is a different matter—one which Epic decided to test by asking the judge in the case to force Apple’s hand and arguing that the company is in contempt of that April ruling. And the judge has now responded… rather ominously.

    “The Court is in receipt of Epic Games, Inc.’s Motion to Enforce the Injunction,” writes Judge Yvonne Gonzalez Rogers, in a document shared by Epic CEO Tim Sweeney. “The Court thus issues this Order to Show Cause as to why the motion should not be granted. Briefing […] shall include the legal authority upon which Apple contends that it can ignore this Court’s order having not received a stay from the Ninth Circuit Court of Appeal even though its request was filed twelve days ago on May 7, 2025.”

    Not the most promising start for Apple, which is instructed to explain why it hasn’t complied with the order despite receiving no encouragement from the appeals court. But it gets worse:

    “Obviously, Apple is fully capable of resolving this issue without further briefing or a hearing. However, if the parties do not file a joint notice that this issue is resolved, and this Court’s intervention is required, the Apple official who is personally responsible for ensuring compliance shall personally appear at the hearing hereby set for Tuesday, May 27.”

    It’s not clear who the “official who is personally responsible for ensuring compliance” would be. MacRumors speculates that it could be an executive as high-ranking as Phil Schiller, who has responsibility for the App Store, but Apple may try to get away with someone with a lower profile. But it does seem that individual penalties, rather than or as well as more easily disregarded corporate fines, could be in the cards if the company pushes its luck much further. And it may be worth pointing out that, while obviously an extreme option in this case, contempt of court can be punished with jail time.

    That doesn’t mean that Apple would necessarily lose that May 27 hearing. One of the remedies given in 2021’s original judgment (see page 179, section G) by the same judge was “a declaration that (i) Apple’s termination of the DPLA [Developer Product Licensing Agreement] and the related agreements between Epic Games and Apple was valid, lawful, and enforceable, and (ii) Apple has the contractual right to terminate its DPLA with any or all of Epic Games’ wholly owned subsidiaries, affiliates, and/or other entities under Epic Games’ control at any time and at Apple’s sole discretion.” None of the rulings since then suggest a change in the judge’s position on whether Apple is allowed to boot companies from the App Store as and when it pleases.

    The problem is that Apple is specifically not allowed to “prohibit” the use of external payment links. It can reject apps, or ban developer accounts, at its own discretion. But if it rejects an app or bans a dev for no reason other than its use of such links, does that amount to a de facto prohibition? Again, that’s debatable.

    If Apple can come up with some other reason for Fortnite’s exclusion, it might be okay–and it could be helpful that the company has, in compliance with the ruling, approved updates to other high-profile apps such as Spotify and Patreon which add the links. But if it can’t, the penalties could be severe. The stakes just got a lot higher.
  • Shrink exploit windows, slash MTTP: Why ring deployment is now a must for enterprise defense


    Unpatched systems are a ticking time bomb. Fifty-seven percent of cyberattack victims acknowledge that available patches would have prevented breaches, yet nearly one-third admit failing to act, compounding the risk.
    Ponemon research shows organizations now take an alarming average of 43 days to detect cyberattacks, even after a patch is released, up from 36 days the previous year. According to the Verizon 2024 Data Breach Investigations Report, attackers’ ability to exploit vulnerabilities surged by 180% from 2023 to 2024.
    Chronic firefighting makes manual or partially automated patching overly burdensome, pushing it ever further down teams’ priority lists. That is consistent with an Ivanti study which found that the majority of IT and security professionals consider patching overly complex, cumbersome and time-consuming.
    When it comes to patching, complacency kills
    Attackers aggressively exploit legacy Common Vulnerabilities and Exposures (CVEs), often ten or more years old, and they keep finding new ways to weaponize them: a startling 76% of vulnerabilities leveraged by ransomware were reported between 2010 and 2019. Misalignment between IT and security teams compounds the delays, with 27% lacking cohesive patch strategies and nearly a quarter disagreeing on patch schedules. One of the unexpected benefits of automating patch management is breaking the impasse between IT and security over managing the patch workload.
    “Typically, on average, an enterprise may patch 90% of desktops within two to four weeks, 80% of Windows servers within six weeks and only 25% of Oracle Databases within six months from patch release date”, writes Gartner in their recent report, “We’re not patching our way out of vulnerability exposure.” The report states that “the cold, hard reality is that no one is out patching threat actors at scale in any size organization, geography or industry vertical.”
    Ring deployment: proactive defense at scale
    Every unpatched endpoint or threat surface invites attackers to exploit it. Enterprises are losing the patching race, which motivates attackers even more.
    In the meantime, patching has become exponentially more challenging for security and IT teams to manage manually. Ring deployment first took hold roughly a decade ago in Microsoft-dominated networks; since then, it has proliferated across on-premise and cloud-based patch and risk management systems. Ring deployment provides a phased, automated strategy, shrinking attacker windows and breach risks.
    Ring deployment rolls out patches incrementally through carefully controlled stages, or “rings”:

    Test Ring: Core IT teams quickly validate patch stability.
    Early Adopter Ring: A broader internal group confirms real-world compatibility.
    Production Ring: Enterprise-wide rollout after stability is conclusively proven.
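
    To make the mechanics concrete, here is a minimal sketch of the general pattern - not Ivanti's (or any vendor's) actual implementation; the ring sizes, thresholds and deploy callback are all hypothetical - in which a patch is promoted from ring to ring only when the previous ring's observed success rate clears its threshold:

```python
# A sketch of ring-based patch promotion, assuming a simple per-ring
# success-rate gate. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class Ring:
    name: str
    devices: int
    success_threshold: float  # fraction of devices that must patch cleanly

RINGS = [
    Ring("test", 50, 0.98),            # core IT validates stability
    Ring("early_adopter", 500, 0.97),  # broader group confirms compatibility
    Ring("production", 100_000, 0.95), # enterprise-wide rollout
]

def promote(patch_id: str, deploy) -> bool:
    """Roll a patch outward ring by ring; halt at the first ring that
    misses its threshold so a bad patch never reaches production."""
    for ring in RINGS:
        success_rate = deploy(patch_id, ring)  # observed success fraction
        if success_rate < ring.success_threshold:
            print(f"{patch_id}: halted in {ring.name} ({success_rate:.1%})")
            return False
        print(f"{patch_id}: {ring.name} passed ({success_rate:.1%}), promoting")
    return True

# Example run with a stub standing in for the real rollout machinery:
promote("PATCH-0001", lambda pid, ring: 0.99)
```

    The point of the gating is blast-radius control: a faulty patch burns out against 50 test machines instead of 100,000 production endpoints.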

    Ivanti’s recent release of ring deployment is designed to give security teams greater control over when patches are deployed, to which systems, and how each sequence of updates is managed. By catching patching issues early, the goal is to minimize risk and reduce or eliminate disruptions.
    Gartner’s ring deployment strategy escalates patches from internal IT outward, providing continuous validation and dramatically reducing deployment risk. Source: Gartner, “Modernize Windows and Third-Party Application Patching,” p. 6.
    Ring deployment crushes MTTP, ends reactive patching chaos
    Relying on outdated vulnerability ratings to lead patch management strategies only increases the risk of a breach as enterprises race to keep up with growing patch backlogs. That’s often when patching becomes cybersecurity’s endless nightmare, with attackers looking to capitalize on the many legacy CVEs that remain unprotected.
    Gartner’s take in their recent report, “Modernize Windows and Third-Party Application Patching,” makes the point brutally clear, showing how traditional patching methods routinely fail to keep pace. In contrast, enterprises embracing ring deployment are getting measurable results. Gartner’s research finds ring deployment achieves a “99% patch success within 24 hours for up to 100,000 PCs,” leaving traditional methods far behind.
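    For readers who track the metric, Mean-Time-to-Patch is simply the average gap between a fix being released and it landing on endpoints. A toy calculation like the one below - with invented timestamps, purely for illustration - shows why compressing the rollout stages moves the number so sharply:

```python
# Mean-Time-to-Patch (MTTP): average days from patch release to deployment.
# Timestamps are invented; real ones would come from your patch tooling.
from datetime import date

deployments = [  # (released, deployed) per endpoint/patch pair
    (date(2025, 4, 1), date(2025, 4, 3)),
    (date(2025, 4, 1), date(2025, 4, 20)),
    (date(2025, 4, 8), date(2025, 5, 30)),
]

mttp_days = sum((deployed - released).days
                for released, deployed in deployments) / len(deployments)
print(f"MTTP: {mttp_days:.1f} days")  # ~24.3 days for this toy sample
```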
    During an interview with VentureBeat, Tony Miller, Ivanti’s VP of enterprise services, emphasized that “Ivanti Neurons for Patch Management and implementing Ring Deployment is an important part of our Customer Zero journey.” He said the company uses many of its own products, which allows for a quick feedback loop and gives developers insight into customers’ pain points.
    Miller added: “We’ve tested out Ring Deployment internally with a limited group, and we are in the process of rolling it out organization-wide. In our test group, we have benefited from deploying patches based on real-world risk, and ensuring that updates don’t interrupt employee productivity–a significant challenge for any IT organization.”
    VentureBeat also spoke with Jesse Miller, SVP and director of IT at Southstar Bank, about leveraging Ivanti’s dynamic Vulnerability Risk Rating, an AI-driven system continuously recalibrated with real-time threat intelligence, live exploit activity, and current attack data.
    Miller stated clearly: “This is an important change for us and the entire industry. Judging a patch based on its CVSS now is like working in a vacuum. When judging how impactful something can be, you have to take everything from current events, your industry, your environment and more into the equation. Ultimately, we are just making wiser decisions as we are not disregarding CVSS scoring; we are simply adding to it.”
    Miller also highlighted his team’s prioritization strategy: “We have been able to focus on prioritizing Zero-Day and Priority patches to get out first, as well as anything being exploited live in the wild. Using patch prioritization helps us eliminate our biggest risk first so that we can reduce our attack surface as quickly as possible.”
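    To make that “adding to CVSS” idea concrete, here is a minimal sketch of the general technique - a static severity score weighted by live threat signals. The weights and signals below are invented for illustration and are not Ivanti’s actual Vulnerability Risk Rating model:

```python
# A sketch of risk-based patch prioritization: start from a static CVSS
# base score, then weight it by live threat signals. Multipliers are
# invented for illustration, not taken from any vendor's scoring model.

def dynamic_risk(cvss: float, exploited_in_wild: bool,
                 ransomware_linked: bool, internet_facing: bool) -> float:
    score = cvss
    if exploited_in_wild:
        score *= 1.5   # active exploitation trumps theoretical severity
    if ransomware_linked:
        score *= 1.3
    if internet_facing:
        score *= 1.2
    return min(score, 10.0)  # stay on a CVSS-like 0-10 scale

backlog = [  # hypothetical CVE identifiers
    ("CVE-2014-XXXX", dynamic_risk(6.5, True, True, True)),     # old, abused
    ("CVE-2025-YYYY", dynamic_risk(9.8, False, False, False)),  # critical, quiet
]
backlog.sort(key=lambda item: item[1], reverse=True)
print(backlog)  # the actively exploited legacy CVE outranks the quiet 9.8
```

    Sorting the backlog by the adjusted score, rather than the raw CVSS value, is what lets an actively exploited ten-year-old CVE jump ahead of a quieter, nominally more critical one - exactly the prioritization Miller describes.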
    By combining ring deployment and dynamic VRR technology, Ivanti Neurons provides enterprises with structured visual orchestration of incremental patch rollouts. This approach sharply reduces Mean-Time-to-Patch, accelerating patches from targeted testing through full deployment and significantly decreasing the exposure windows that attackers exploit.
    Caption: The Ivanti Neurons interface visually manages deployment rings, success thresholds and patching progress, streamlining operational clarity. Source: Ivanti Neurons
    Comparing Ivanti Neurons, Microsoft Autopatch, Tanium and ServiceNow: Key strengths and gaps
    When selecting enterprise patch management solutions, clear differences emerge among leading providers, including Microsoft Autopatch, Tanium, ServiceNow and Ivanti Neurons.
    Microsoft Autopatch relies on ring deployment but is restricted to Windows environments and Microsoft 365 applications. Ivanti Neurons expands on this concept by covering a broader spectrum, including Windows, macOS, Linux and various third-party applications. This enables enterprise-wide patch management for organizations with large-scale, diverse infrastructure.
    Tanium stands out for its robust endpoint visibility and detailed reporting features, but its infrastructure requirements typically align better with resource-intensive enterprises. Meanwhile, ServiceNow’s strength lies in workflow automation and IT service management integrations. Executing actual patches often demands significant additional customization or third-party integrations.
    Ivanti Neurons aims to differentiate by integrating dynamic risk assessments, phased ring deployments and automated workflows within a single platform. It directly addresses common enterprise challenges in patch management, including visibility gaps, operational complexity and uncertainty about vulnerability prioritization with real-time risk assessments and intuitive visual dashboards.
    Caption: Ivanti Neurons provides real-time patch status, vulnerability assessments, and risk exposure metrics, ensuring continuous visibility. Source: Ivanti Neurons
    Transforming patch management into a strategic advantage
    Patching alone cannot eliminate vulnerability exposure. Gartner’s analysts continue to stress the necessity of integrating compensating controls, including endpoint protection platforms, multifactor authentication, and network segmentation to reinforce security beyond basic patching.
    Combining ring deployment with integrated compensating controls that are part of a broader zero-trust framework ensures security, allows IT teams to shrink exposure windows, and better manage cyber risks.
    Ivanti’s approach to ring deployment incorporates real-time risk assessments, automated remediation workflows, and built-in threat management, directly aligning patch management with broader business resilience strategies. The design decision to make it part of Neurons for Patch Management delivers the scale enterprises need to improve risk management’s real-time visibility.  
    Bottom line: Integrating ring deployment with compensating controls and prioritization tools transforms patch management from a reactive burden to a strategic advantage.

    #shrink #exploit #windows #slash #mttp
    Shrink exploit windows, slash MTTP: Why ring deployment is now a must for enterprise defense
    Join our daily and weekly newsletters for the latest updates and exclusive content on industry-leading AI coverage. Learn More Unpatched systems are a ticking time bomb. Fifty-seven percent of cyberattack victims acknowledge that available patches would have prevented breaches, yet nearly one-third admit failing to act, compounding the risk. Ponemon research shows organizations now take an alarming average of 43 days to detect cyberattacks, even after a patch is released, up from 36 days the previous year. According to the Verizon 2024 Data Breach Investigations Report, attackers’ ability to exploit vulnerabilities surged by 180% from 2023 to 2024. Chronic firefighting makes manual or partially automated patching overly burdensome, further pushing patching down teams’ priority lists. Relying on manual or partially automated patching systems is considered too time-consuming, further reducing patching to the bottom of a team’s action item list. This is consistent with an Ivanti study that found that the majorityof IT and security professionals think patching is overly complex, cumbersome and time-consuming. When it comes to patching, complacency kills Attackers aggressively exploit legacy Common Vulnerabilities and Exposures, often ten or more years old. A sure sign of how effective attackers’ tradecraft is becoming at targeting legacy CVEs is their success with vulnerabilities in some cases, 10-plus years old. A sure sign that attackers are finding new ways to weaponize old vulnerabilities is reflected in the startling stat that 76% of vulnerabilities leveraged by ransomware were reported between 2010 and 2019. The misalignment between IT and security teams compounds delays, with 27% lacking cohesive patch strategies and nearly a quarter disagreeing on patch schedules. One of the unexpected benefits of automating patch management is breaking the impasse between IT and security when it comes to managing the patch workload.    “Typically, on average, an enterprise may patch 90% of desktops within two to four weeks, 80% of Windows servers within six weeks and only 25% of Oracle Databases within six months from patch release date”, writes Gartner in their recent report, “We’re not patching our way out of vulnerability exposure.” The report states that “the cold, hard reality is that no one is out patching threat actors at scale in any size organization, geography or industry vertical.” Ring deployment: proactive defense at scale Every unpatched endpoint or threat surface invites attackers to exploit it. Enterprises are losing the patching race, which motivates attackers even more. In the meantime, patching has become exponentially more challenging for security and IT teams to manage manually. Approximately a decade ago, ring deployment began to rely on Microsoft-dominated networks. Since then, ring deployments have proliferated across on-premise and cloud-based patch and risk management systems. Ring deployment provides a phased, automated strategy, shrinking attacker windows and breach risks. Ring deployment rolls out patches incrementally through carefully controlled stages or “rings:” Test Ring: Core IT teams quickly validate patch stability. Early Adopter Ring: A broader internal group confirms real-world compatibility. Production Ring: Enterprise-wide rollout after stability is conclusively proven. 
    Ivanti’s recent release of ring deployment is designed to give security teams greater control over when patches are deployed, to which systems, and how each sequence of updates is managed. By catching patching issues early, the goal is to minimize risk and reduce or eliminate disruptions. Gartner’s ring deployment strategy escalates patches from internal IT outward, providing continuous validation and dramatically reducing deployment risk. Source: Gartner, “Modernize Windows and Third-Party Application Patching,” p. 6.

    Ring deployment crushes MTTP, ends reactive patching chaos

    Relying on outdated vulnerability ratings to lead patch management strategies only increases the risk of a breach as enterprises race to keep up with growing patch backlogs. That’s often when patching becomes cybersecurity’s endless nightmare, with attackers looking to capitalize on the many legacy CVEs that remain unprotected. Gartner’s recent report “Modernize Windows and Third-Party Application Patching” makes the point brutally clear, showing how traditional patching methods routinely fail to keep pace. In contrast, enterprises embracing ring deployment are getting measurable results: Gartner finds ring deployment achieves a “99% patch success within 24 hours for up to 100,000 PCs,” leaving traditional methods far behind.

    During an interview with VentureBeat, Tony Miller, Ivanti’s VP of enterprise services, emphasized that “Ivanti Neurons for Patch Management and implementing Ring Deployment is an important part of our Customer Zero journey.” He said the company uses many of its own products, which allows for a quick feedback loop and gives developers insight into customers’ pain points. Miller added: “We’ve tested out Ring Deployment internally with a limited group, and we are in the process of rolling it out organization-wide. In our test group, we have benefited from deploying patches based on real-world risk, and ensuring that updates don’t interrupt employee productivity, a significant challenge for any IT organization.”

    VentureBeat also spoke with Jesse Miller, SVP and director of IT at Southstar Bank, about leveraging Ivanti’s dynamic Vulnerability Risk Rating (VRR), an AI-driven system continuously recalibrated with real-time threat intelligence, live exploit activity and current attack data. Miller stated clearly: “This is an important change for us and the entire industry. Judging a patch based on its CVSS now is like working in a vacuum. When judging how impactful something can be, you have to take everything from current events, your industry, your environment and more into the equation. Ultimately, we are just making wiser decisions as we are not disregarding CVSS scoring; we are simply adding to it.” Miller also highlighted his team’s prioritization strategy: “We have been able to focus on prioritizing Zero-Day and Priority patches to get out first, as well as anything being exploited live in the wild. Using patch prioritization helps us eliminate our biggest risk first so that we can reduce our attack surface as quickly as possible.”

    By combining ring deployment and dynamic VRR technology, Ivanti Neurons gives enterprises structured visual orchestration of incremental patch rollouts. This approach sharply reduces mean time to patch (MTTP), accelerating patches from targeted testing through full deployment and significantly decreasing the exposure windows that attackers exploit.
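    Jesse Miller’s “not disregarding CVSS scoring; we are simply adding to it” comment suggests a scoring shape that is easy to illustrate, as is the MTTP metric this section keeps invoking. The sketch below is deliberately a toy: the signal names and weights are invented for illustration, since the article says nothing about how Ivanti’s proprietary VRR model is actually weighted.

```python
from datetime import datetime, timedelta

def toy_risk_rating(cvss_base: float,
                    exploited_in_wild: bool,
                    exploit_kit_available: bool,
                    ransomware_linked: bool) -> float:
    """Start from the CVSS base score, then add weighted live-threat signals.

    The weights are made up for illustration; a real dynamic rating is
    recalibrated continuously from threat intelligence and exploit activity.
    """
    score = cvss_base
    if exploited_in_wild:
        score += 2.0   # active exploitation outranks a high static score
    if exploit_kit_available:
        score += 1.0
    if ransomware_linked:
        score += 1.5   # most ransomware CVEs are old, per the 2010-2019 stat
    return min(score, 10.0)

def mean_time_to_patch(events: list[tuple[datetime, datetime]]) -> timedelta:
    """MTTP: mean gap between a patch's release and its completed deployment."""
    gaps = [deployed - released for released, deployed in events]
    return sum(gaps, timedelta()) / len(gaps)

# A CVSS 7.5 flaw being exploited in the wild outranks an untouched 9.0:
assert toy_risk_rating(7.5, True, True, False) > toy_risk_rating(9.0, False, False, False)
```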
    Caption: The Ivanti Neurons interface visually manages deployment rings, success thresholds and patching progress, streamlining operational clarity. Source: Ivanti Neurons

    Comparing Ivanti Neurons, Microsoft Autopatch, Tanium and ServiceNow: key strengths and gaps

    When selecting enterprise patch management solutions, clear differences emerge among leading providers, including Microsoft Autopatch, Tanium, ServiceNow and Ivanti Neurons. Microsoft Autopatch relies on ring deployment but is restricted to Windows environments, including Microsoft 365 applications. Ivanti Neurons expands on the concept by covering a broader spectrum, including Windows, macOS, Linux and various third-party applications, enabling enterprise-wide patch management for organizations with large-scale, diverse infrastructure. Tanium stands out for its robust endpoint visibility and detailed reporting, but its infrastructure requirements typically suit resource-rich enterprises. ServiceNow’s strength lies in workflow automation and IT service management integrations, though executing actual patches often demands significant additional customization or third-party integrations. Ivanti Neurons aims to differentiate itself by integrating dynamic risk assessments, phased ring deployments and automated workflows in a single platform, directly addressing common enterprise challenges in patch management, including visibility gaps, operational complexity and uncertainty about vulnerability prioritization, with real-time risk assessments and intuitive visual dashboards.

    Caption: Ivanti Neurons provides real-time patch status, vulnerability assessments and risk exposure metrics, ensuring continuous visibility. Source: Ivanti Neurons

    Transforming patch management into a strategic advantage

    Patching alone cannot eliminate vulnerability exposure. Gartner’s analysts continue to stress the necessity of integrating compensating controls, including endpoint protection platforms (EPP), multifactor authentication and network segmentation, to reinforce security beyond basic patching. Combining ring deployment with compensating controls as part of a broader zero-trust framework shrinks exposure windows and helps IT teams manage cyber risk. Ivanti’s approach incorporates real-time risk assessments, automated remediation workflows and built-in threat management, directly aligning patch management with broader business resilience strategies, and making it part of Neurons for Patch Management delivers the scale enterprises need for real-time risk visibility.

    Bottom line: Integrating ring deployment with compensating controls and prioritization tools transforms patch management from a reactive burden into a strategic advantage.
  • Access to experimental medical treatments is expanding across the US

    A couple of weeks ago I was in Washington, DC, for a gathering of scientists, policymakers, and longevity enthusiasts. They had come together to discuss ways to speed along the development of drugs and other treatments that might extend the human lifespan.

    One approach that came up was to simply make experimental drugs more easily accessible. Let people try drugs that might help them live longer, the argument went. Some groups have been pushing bills to do just that in Montana, a state whose constitution explicitly values personal liberty.

    A couple of years ago, a longevity lobbying group helped develop a bill that expanded on the state’s existing Right to Try law, which allowed seriously ill people to apply for access to experimental drugs (that is, drugs that have not been approved by drug regulators). The expansion, which was passed in 2023, opened access for people who are not seriously ill.

    Over the last few months, the group has been pushing further—for a new bill that sets out exactly how clinics can sell experimental, unproven treatments in the state to anyone who wants them. At the end of the second day of the event, the man next to me looked at his phone. “It just passed,” he told me. (The lobbying group has since announced that the state’s governor, Greg Gianforte, has signed the bill into law, but when I called his office, Gianforte’s staff told me they could not legally tell me whether or not he has.)

    The passing of the bill could make Montana something of a US hub for experimental treatments. But it represents a wider trend: the creep of Right to Try across the US. And a potentially dangerous departure from evidence-based medicine.

    In the US, drugs must be tested in human volunteers before they can be approved and sold. Early-stage clinical trials are small and check for safety. Later trials test both the safety and efficacy of a new drug.

    The system is designed to keep people safe and to prevent manufacturers from selling ineffective or dangerous products. It’s meant to protect us from snake oil.

    But people who are seriously ill and who have exhausted all other treatment options are often desperate to try experimental drugs. They might see it as a last hope. Sometimes they can volunteer for clinical trials, but time, distance, and eligibility can rule out that option.

    Since the 1980s, seriously or terminally ill people who cannot take part in a trial for some reason have been able to apply for access to experimental treatments through a “compassionate use” program run by the US Food and Drug Administration (FDA). The FDA authorizes almost all of the compassionate use requests it receives (although manufacturers don’t always agree to provide their drug, for various reasons).

    But that wasn’t enough for the Goldwater Institute, a libertarian organization that in 2014 drafted a model Right to Try law for people who are terminally ill. Versions of this draft have since been passed into law in 41 US states, and the US has had a federal Right to Try law since 2018. These laws generally allow people who are seriously ill to apply for access to drugs that have only been through the very first stages of clinical trials, provided they give informed consent.

    Some have argued that these laws have been driven by a dislike of both drug regulation and the FDA. After all, they are designed to achieve the same result as the compassionate use program. The only difference is that they bypass the FDA.

    Either way, it’s worth noting just how early-stage these treatments are. A drug that has been through phase I trials might have been tested in just 20 healthy people. Yes, these trials are designed to test the safety of a drug, but they are never conclusive. At that point in a drug’s development, no one can know how a sick person—who is likely to be taking other medicines—will react to it.

    Now these Right to Try laws are being expanded even more. The Montana bill, which goes the furthest, will enable people who are not seriously ill to access unproven treatments, and other states have been making moves in the same direction.

    Just this week, Georgia’s governor signed into law the Hope for Georgia Patients Act, which allows people with life-threatening illnesses to access personalized treatments, those that are “unique to and produced exclusively for an individual patient based on his or her own genetic profile.” Similar laws, known as “Right to Try 2.0,” have been passed in other states, too, including Arizona, Mississippi, and North Carolina.

    And last year, Utah passed a law that allows health care providers (including chiropractors, podiatrists, midwives, and naturopaths) to deliver unapproved placental stem cell therapies. These treatments involve cells collected from placentas, which are thought to hold promise for tissue regeneration. But they haven’t been through human trials. They can cost tens of thousands of dollars, and their effects are unknown. Utah’s law was described as a “pretty blatant broadbrush challenge to the FDA’s authority” by an attorney who specializes in FDA law. And it’s one that could put patients at risk.

    Laws like these spark a lot of very sensitive debates. Some argue that it’s a question of medical autonomy, and that people should have the right to choose what they put in their own bodies.

    And many argue there’s a cost-benefit calculation to be made. A seriously ill person potentially has more to gain and less to lose from trying an experimental drug, compared to someone who is in good health.

    But everyone needs to be protected from ineffective drugs. Most ethicists think it’s unethical to sell a treatment when you have no idea if it will work, and that argument has been supported by numerous US court decisions over the years. 

    There could be a financial incentive for doctors to recommend an experimental drug, especially when they are granted protections by law. (Right to Try laws tend to protect prescribing doctors from disciplinary action and litigation should something go wrong.)

    On top of all this, many ethicists are also concerned that the FDA’s drug approval process itself has been on a downward slide over the last decade or so. An increasing number of drug approvals are fast-tracked based on weak evidence, they argue.

    Scientists and ethicists on both sides of the debate are now waiting to see what unfolds under the new US administration.  

    In the meantime, a quote from Diana Zuckerman, president of the nonprofit National Center for Health Research, comes to mind: “Sometimes hope helps people do better,” she told me a couple of years ago. “But in medicine, isn’t it better to have hope based on evidence rather than hope based on hype?”

    This article first appeared in The Checkup, MIT Technology Review’s weekly biotech newsletter. To receive it in your inbox every Thursday, and read articles like this first, sign up here.
  • ASRock Motherboards Show Fluctuating SoC Voltage, Reaching 1.27V; A Possible Risk Factor for Ryzen 9000 CPU Damage

    Bryan from Tech Yes City tested Ryzen 7000/9000 processors on a few motherboards, revealing a difference in how ASRock motherboards supply SoC voltage in contrast to others.
    YouTuber Tech Yes City Demonstrates the Difference Between Motherboards from ASRock and Other Vendors; Analysis Suggests Both the CPU and the ASRock Board Contribute to the Physical Damage
    If you have been following the reports of dead Ryzen 9000 CPUs, particularly the Ryzen 7 9800X3D, you will know that nothing conclusive has been found about what is causing these failures. This is perhaps one of the first investigations to shed some light on the issue, and it might help users understand what is actually killing these CPUs.
    A few days ago, the popular tech YouTuber Tech Yes City reported his first dead Ryzen 9 9950X CPU, on an ASRock X870 Steel Legend motherboard. This isn't the first Ryzen 9000 CPU to die on an ASRock motherboard; there are nearly 200 such reports, most of them on Reddit.

    Bryan tried to find out why this happens more often on ASRock motherboards than on boards from other vendors, and per his investigation, there is a difference in how ASRock motherboards handle the SoC voltage request from the CPU. SoC voltage is what the SoC section of the CPU needs to operate, and in most cases it is static. It may fluctuate occasionally, but on ASRock motherboards such as the X870E Taichi Lite with a Ryzen 7 9800X3D installed, the SoC voltage kept fluctuating the whole time.

    Even though the fluctuation isn't that significant, the upper limit is somewhat higher than what is considered the safe maximum. The SoC voltage in both cases (Ryzen 7 7700 and Ryzen 7 9800X3D installed) exceeds 1.250V and comes close to 1.270V. That is higher than what motherboards from other vendors supply to the CPU's SoC, which mostly stays near 1.20V, except on the ASUS X870E Crosshair Hero, which by default adds another 50 mV for extra stability.

    Even so, the Crosshair Hero's SoC voltage doesn't budge and remains constant the whole time, while the ASRock motherboard's keeps fluctuating, which may result in permanent CPU damage, as we have seen before. Keep in mind, though, that it is the CPU that "dictates" how much SoC voltage is needed, so the deaths appear to be the result of how both the CPU and the motherboard handle the SoC request.
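    The measurements behind these observations are straightforward to reproduce if you log SoC voltage on your own system. Below is a minimal sketch assuming you have exported sensor samples to a CSV with a vsoc column; the column name and the 1.25 V ceiling are assumptions, the latter being the commonly cited safe limit for Ryzen SoC voltage rather than a figure from the video.

```python
import csv
import statistics

SOC_VMAX = 1.25  # commonly cited safe ceiling for Ryzen SoC voltage (assumption)

def analyze_soc_log(path: str) -> None:
    """Summarize logged SoC-voltage samples and flag ASRock-like behavior."""
    with open(path, newline="") as f:
        samples = [float(row["vsoc"]) for row in csv.DictReader(f)]
    low, peak = min(samples), max(samples)
    print(f"min {low:.3f} V, max {peak:.3f} V, swing {peak - low:.3f} V, "
          f"stdev {statistics.stdev(samples):.4f} V")
    if peak > SOC_VMAX:
        print(f"WARNING: peaks above {SOC_VMAX:.2f} V "
              f"(the video measured ~1.27 V on the ASRock boards)")
    # A near-zero swing matches the static behavior seen on other vendors' boards;
    # constant fluctuation plus >1.25 V peaks matches the ASRock pattern described above.
```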

    This certainly needs deeper investigation, as it isn't necessarily the only cause of CPU damage. Until then, ASRock might need to fix this through a BIOS update; in the meantime, if you want to mitigate the risk yourself, you will have to "enable" the Uncore Voltage setting in the BIOS.
