• IBM Plans Large-Scale Fault-Tolerant Quantum Computer by 2029

    By John P. Mello Jr.
    June 11, 2025 5:00 AM PT

    IBM unveiled its plan to build IBM Quantum Starling, shown in this rendering. Starling is expected to be the first large-scale, fault-tolerant quantum system. (Image Credit: IBM)

    IBM revealed Tuesday its roadmap for bringing a large-scale, fault-tolerant quantum computer, IBM Quantum Starling, online by 2029, which is significantly earlier than many technologists thought possible.
    The company predicts that when its new Starling computer is up and running, it will be capable of performing 20,000 times more operations than today’s quantum computers — a computational state so vast it would require the memory of more than a quindecillion (10⁴⁸) of the world’s most powerful supercomputers to represent.
    “IBM is charting the next frontier in quantum computing,” Big Blue CEO Arvind Krishna said in a statement. “Our expertise across mathematics, physics, and engineering is paving the way for a large-scale, fault-tolerant quantum computer — one that will solve real-world challenges and unlock immense possibilities for business.”
    IBM’s plan to deliver a fault-tolerant quantum system by 2029 is ambitious but not implausible, especially given the rapid pace of its quantum roadmap and past milestones, observed Ensar Seker, CISO at SOCRadar, a threat intelligence company in Newark, Del.
    “They’ve consistently met or exceeded their qubit scaling goals, and their emphasis on modularity and error correction indicates they’re tackling the right challenges,” he told TechNewsWorld. “However, moving from thousands to millions of physical qubits with sufficient fidelity remains a steep climb.”
    A qubit is the fundamental unit of information in quantum computing, capable of representing a zero, a one, or both simultaneously due to quantum superposition. In practice, fault-tolerant quantum computers use clusters of physical qubits working together to form a logical qubit — a more stable unit designed to store quantum information and correct errors in real time.
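    The superposition idea can be sketched numerically. Below is a minimal, purely illustrative NumPy snippet (not drawn from any IBM code) that represents a single qubit as a two-component complex state vector whose squared amplitudes give the measurement probabilities:

```python
import numpy as np

# A qubit state is a normalized 2-component complex vector (alpha, beta):
# measuring yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
ket0 = np.array([1.0, 0.0], dtype=complex)   # the classical "0" state
ket1 = np.array([0.0, 1.0], dtype=complex)   # the classical "1" state
plus = (ket0 + ket1) / np.sqrt(2)            # equal superposition of 0 and 1

def measure_probs(state):
    """Probabilities of observing 0 or 1 when the qubit is measured."""
    return np.abs(state) ** 2

print(measure_probs(plus))  # [0.5 0.5] — half the time 0, half the time 1
```

    A classical bit is always one of the two basis states; the superposed state above is what lets n qubits jointly represent amplitudes over 2ⁿ outcomes, which is where the quindecillion comparison comes from.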
    Realistic Roadmap
    Luke Yang, an equity analyst with Morningstar Research Services in Chicago, believes IBM’s roadmap is realistic. “The exact scale and error correction performance might still change between now and 2029, but overall, the goal is reasonable,” he told TechNewsWorld.
    “Given its reliability and professionalism, IBM’s bold claim should be taken seriously,” said Enrique Solano, co-CEO and co-founder of Kipu Quantum, a quantum algorithm company with offices in Berlin and Karlsruhe, Germany.
    “Of course, it may also fail, especially when considering the unpredictability of hardware complexities involved,” he told TechNewsWorld, “but companies like IBM exist for such challenges, and we should all be positively impressed by its current achievements and promised technological roadmap.”
    Tim Hollebeek, vice president of industry standards at DigiCert, a global digital security company, added: “IBM is a leader in this area, and not normally a company that hypes their news. This is a fast-moving industry, and success is certainly possible.”
    “IBM is attempting to do something that no one has ever done before and will almost certainly run into challenges,” he told TechNewsWorld, “but at this point, it is largely an engineering scaling exercise, not a research project.”
    “IBM has demonstrated consistent progress, has committed $30 billion over five years to quantum computing, and the timeline is within the realm of technical feasibility,” noted John Young, COO of Quantum eMotion, a developer of quantum random number generator technology, in Saint-Laurent, Quebec, Canada.
    “That said,” he told TechNewsWorld, “fault-tolerant in a practical, industrial sense is a very high bar.”
    Solving the Quantum Error Correction Puzzle
    To make a quantum computer fault-tolerant, errors need to be corrected so large workloads can be run without faults. In a quantum computer, errors are reduced by clustering physical qubits to form logical qubits, which have lower error rates than the underlying physical qubits.
    “Error correction is a challenge,” Young said. “Logical qubits require thousands of physical qubits to function reliably. That’s a massive scaling issue.”
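    The trade of many noisy physical units for one more reliable logical unit has a simple classical analogue: the repetition code. The sketch below is a toy illustration only — a classical stand-in, not a real quantum code such as the surface code or the qLDPC codes IBM favors. It encodes one bit as three copies, flips each copy with some probability, and recovers the bit by majority vote; the resulting logical error rate is lower than the physical one whenever physical errors are rare.

```python
import random

def encode(bit, n=3):
    """Encode one logical bit as n identical physical copies (repetition code)."""
    return [bit] * n

def apply_noise(copies, p):
    """Flip each physical copy independently with probability p."""
    return [b ^ 1 if random.random() < p else b for b in copies]

def decode(copies):
    """Majority vote recovers the logical bit if fewer than half the copies flipped."""
    return int(sum(copies) > len(copies) / 2)

random.seed(0)
p, trials = 0.1, 10_000
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
# Expected logical error rate: 3*p^2*(1-p) + p^3 ≈ 0.028, vs. 0.1 per physical copy
print(failures / trials)
```

    With only three copies the protection is modest; practical fault-tolerant schemes stack far more redundancy per logical qubit, which is exactly the scaling burden Young describes.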
    IBM explained in its announcement that creating increasing numbers of logical qubits capable of executing quantum circuits with as few physical qubits as possible is critical to quantum computing at scale. Until today, a clear path to building such a fault-tolerant system without unrealistic engineering overhead has not been published.

    Alternative and previously gold-standard error-correcting codes present fundamental engineering challenges, IBM continued. To scale, they would require an unfeasible number of physical qubits to create enough logical qubits to perform complex operations — necessitating impractical amounts of infrastructure and control electronics. This renders them unlikely to be implemented beyond small-scale experiments and devices.
    In two research papers released with its roadmap, IBM detailed how it will overcome the challenges of building the large-scale, fault-tolerant architecture needed for a quantum computer.
    One paper outlines the use of quantum low-density parity check (qLDPC) codes to reduce physical qubit overhead. The other describes methods for decoding errors in real time using conventional computing.
    According to IBM, a practical fault-tolerant quantum architecture must:

    Suppress enough errors for useful algorithms to succeed
    Prepare and measure logical qubits during computation
    Apply universal instructions to logical qubits
    Decode measurements from logical qubits in real time and guide subsequent operations
    Scale modularly across hundreds or thousands of logical qubits
    Be efficient enough to run meaningful algorithms using realistic energy and infrastructure resources
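    The real-time decoding requirement is essentially a classical control loop. The sketch below is a purely illustrative stand-in (all names are invented and the "decoder" is trivial): each error-correction cycle, classical electronics read an error syndrome, decode it, and feed the corrections back before the next operation — the feed-forward pattern IBM's second paper addresses with conventional computing.

```python
import random

random.seed(1)

def read_syndrome(n_checks=4, p=0.05):
    """Pretend syndrome measurement: each parity check fires with probability p."""
    return [int(random.random() < p) for _ in range(n_checks)]

def decode_syndrome(syndrome):
    """Trivial stand-in decoder: emit one correction per fired check."""
    return [i for i, fired in enumerate(syndrome) if fired]

corrections = 0
for cycle in range(1_000):                  # one iteration per error-correction cycle
    syndrome = read_syndrome()              # measure
    corrections += len(decode_syndrome(syndrome))  # decode and apply before next cycle

print(corrections)
```

    The hard part in a real machine is latency: the classical decoder must keep pace with the quantum hardware's cycle time, or corrections arrive too late to guide subsequent operations.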

    Aside from the technological challenges that quantum computer makers are facing, there may also be some market challenges. “Locating suitable use cases for quantum computers could be the biggest challenge,” Morningstar’s Yang maintained.
    “Only certain computing workloads, such as random circuit sampling (RCS), can fully unleash the computing power of quantum computers and show their advantage over the traditional supercomputers we have now,” he said. “However, workloads like RCS are not very commercially useful, and we believe commercial relevance is one of the key factors that determine the total market size for quantum computers.”
    Q-Day Approaching Faster Than Expected
    For years now, organizations have been told they need to prepare for “Q-Day” — the day a quantum computer will be able to crack all the encryption they use to keep their data secure. This IBM announcement suggests the window for action to protect data may be closing faster than many anticipated.
    “This absolutely adds urgency and credibility to the security expert guidance on post-quantum encryption being factored into their planning now,” said Dave Krauthamer, field CTO of QuSecure, maker of quantum-safe security solutions, in San Mateo, Calif.
    “IBM’s move to create a large-scale fault-tolerant quantum computer by 2029 is indicative of the timeline collapsing,” he told TechNewsWorld. “A fault-tolerant quantum computer of this magnitude could be well on the path to crack asymmetric ciphers sooner than anyone thinks.”

    “Security leaders need to take everything connected to post-quantum encryption as a serious measure and work it into their security plans now — not later,” he said.
    Roger Grimes, a defense evangelist with KnowBe4, a security awareness training provider in Clearwater, Fla., pointed out that IBM is just the latest in a surge of quantum companies announcing quickly forthcoming computational breakthroughs within a few years.
    “It leads to the question of whether the U.S. government’s original PQC [post-quantum cryptography] preparation date of 2030 is still a safe date,” he told TechNewsWorld.
    “It’s starting to feel a lot more risky for any company to wait until 2030 to be prepared against quantum attacks. It also flies in the face of the latest cybersecurity EO [Executive Order] that relaxed PQC preparation rules as compared to Biden’s last EO PQC standard order, which told U.S. agencies to transition to PQC ASAP.”
    “Most US companies are doing zero to prepare for Q-Day attacks,” he declared. “The latest executive order seems to tell U.S. agencies — and indirectly, all U.S. businesses — that they have more time to prepare. It’s going to cause even more agencies and businesses to be less prepared during a time when it seems multiple quantum computing companies are making significant progress.”
    “It definitely feels that something is going to give soon,” he said, “and if I were a betting man, and I am, I would bet that most U.S. companies are going to be unprepared for Q-Day on the day Q-Day becomes a reality.”

    John P. Mello Jr. has been an ECT News Network reporter since 2003. His areas of focus include cybersecurity, IT issues, privacy, e-commerce, social media, artificial intelligence, big data and consumer electronics. He has written and edited for numerous publications, including the Boston Business Journal, the Boston Phoenix, Megapixel.Net and Government Security News. Email John.

  • From Controversy to Comeback: The State of Star Wars Battlefront 2 in 2025

    Released to the wrong kind of fanfare back in November 2017, EA DICE’s sequel to their rebooted mass-arena warfare series set in the Disney-owned space opera universe courted controversy to a near-comical degree before a blaster had even been fired. Overloaded with microtransactions and predatory loot box practices, Star Wars Battlefront II had a bad start. The biggest offence to early players on EA Access was that the franchise’s signature heroes – Luke Skywalker, Darth Vader, Darth Maul, Obi-Wan Kenobi, Boba Fett, and so on – were hidden behind paywalls or, as was becoming increasingly commonplace at the time, only obtainable after unfathomably long hours spent accumulating whatever skill points or in-game currency was required to unlock them. Worse still, this over-abundance of loot boxes wasn’t present in the open beta conducted a few weeks earlier in October, so its emergence – as bad as it was – stung all the more because it felt deceitful.

    The comedy element in this pre-release debacle emerged during EA’s robust defence of its choices in a Reddit post – a post which became the most downvoted the website had ever seen and netted the not-yet-released title its first accolade: an unwanted Guinness World Record for, you guessed it, the most downvoted post in history.

    The heads in EA’s boardroom didn’t take kindly to this, but instead of doubling down, they opted to listen to player concerns and perform a partial U-turn: first by dramatically reducing the cost of the franchise’s heroes, then, the day before the game’s official release, by disabling microtransactions. We describe this as a partial U-turn because the microtransactions were re-enabled several months after release, though the only items for purchase were cosmetics – emotes, victory poses, that sort of thing. Certainly not the pay-to-win practices EA originally planned.

    Arguably, these changes weren’t solely the result of player discontent. The loot boxes, in particular, carried the very serious implication that they constituted a form of gambling. So strong was the furore surrounding their inclusion that governments throughout Europe and North America began to investigate – not just their presence in Star Wars Battlefront II but in video games as a whole. After all, children play games, and given Star Wars’ fanbase, there was undoubtedly a substantial player base awaiting Battlefront II’s release that was under eighteen. The comedic element in EA’s initial response is an undertone if anything. The loot box controversy clouding Star Wars Battlefront II grew into a seismic event for the industry. In several jurisdictions, paying for loot boxes which may or may not contain the items you want was officially declared a form of gambling, and the practice couldn’t continue as it was.

    The industry isn’t cleansed of the practice altogether, of course, but the direction it was heading in 2017 was certainly for the worse. In a way, it’s fortunate that this issue reared its head so prominently when it did, and, arguably, it’s because this was a Star Wars game that the uproar was so strong. After all, this is a beloved franchise with a then forty-year legacy. Its fans are passionate, and they’ll voice their discontent more loudly than players of EA’s annual sports titles.

    Despite EA making wholesale changes to Star Wars Battlefront II’s pay-to-win progression, its release was still cloaked in negativity. However, there was a decent game underneath all the furore, and EA DICE only improved upon it in the subsequent years through post-release content, which steadily emerged until support was abandoned in 2020.

    Changes to progression which came in March 2018 transformed the game into the one that’s playable today. To answer the question heading this feature: Star Wars Battlefront II is fixed, and it was the change to linear progression which did it. Now, troopers earn in-game skill points by playing, not by paying – completing objectives, blasting opponents, traditional levelling up.

    Enhancing the capability of the game’s characters are Star Cards, which became unlockable through experience once they were removed from loot boxes. They govern progression for each of the game’s classes, heroes, AI reinforcements, and vehicles, with skill points capable of upgrading a Star Card to its next tier or being put towards crafting new ones. The mechanic returned from Battlefront II’s predecessor, of course, but it was adjusted slightly to incorporate abilities and boosts, and these cards are crucial to gaining an edge during large-scale battles. Boost cards enhance your unit’s pre-existing abilities, whereas ability cards unique to each trooper class can be swapped in and out. The latter is a rewarding endeavour for anyone who wishes to pursue a specific class of trooper – stealthier specialists, more destructive heavies, tougher assault troopers, et cetera.

    If Star Wars Battlefront II is indeed ‘fixed’, then an extra question we land upon now is this: is Star Wars Battlefront II still worth playing some eight years after release and approaching five years since any form of update? Well, if you’re playing on PC, annoyingly, the answer might be no, as hackers have currently spoiled the experience with game-ruining cheats and reports of harassment towards honest players. It’s a shame, as on console Star Wars Battlefront II is still a wonderfully cinematic, chaotic multiplayer experience.

    Make no mistake, concurrent player counts are nowhere near their past peaks, but in 2025 Star Wars Battlefront II’s online play is alive and well. Matchmaking can take a few minutes, more so if you opt to select a specific multiplayer mode instead of choosing quick match, but performance – on PlayStation, at least – is smooth and responsive.

    Graphically too, for a game that’s eight years old, Star Wars Battlefront II still looks sublime, presenting an array of striking scenery. It’d be nice to see the game remastered for current-gen hardware, but given EA DICE’s decision to stop supporting the title and move on to other things – and to sadly drop production of a third Battlefront title – the last-gen version will have to do. It still looks and plays great, the latter of which is most important.

    Multiplayer is still the way to go, though, as it always was. The game’s single-player campaign, following Iden Versio as she commands the elite Imperial special forces unit Inferno Squad, promises an inverted perspective on the Empire but never truly achieves it during its run-of-the-mill missions. There are hints that Versio is empathetic – a perspective we’ve never really seen in any Star Wars game – but the idea is never truly explored. No, EA DICE clearly concentrated much of their effort on online play, its numerous modes all having something good to offer, and the game’s offline arcade modes.

    Worth pointing out too is that the game’s cast of heroes is still overpowered. Once you earn enough battle points to take to the field as a lightsabre-wielding Jedi or Sith, it’s ultra-satisfying to scythe through troopers. For anyone else caught in a lightsabre’s proximity, however, it can be devastating. Still, those panicky moments when you round a corner and spot Darth Vader stomping towards you are quite amusing. The best moments in Star Wars Battlefront II, too, are when heroes face off against each other. Seeing Luke and Maul going at it through a shower of blaster fire and thermal detonator explosions is ultimate fan fiction material – there’s perhaps no other Star Wars game that can replicate those unique moments.

    Note: The views expressed in this article are those of the author and do not necessarily represent the views of, and should not be attributed to, GamingBolt as an organization.
    From Controversy to Comeback: The State of Star Wars Battlefront 2 in 2025