• Future-Proofing Business Continuity: BCDR Trends and Challenges for 2025
    thehackernews.com
Mar 13, 2025 | The Hacker News | IT Resilience / Disaster Recovery

As IT environments grow more complex, IT professionals are facing unprecedented pressure to secure business-critical data. With hybrid work the new standard and cloud adoption on the rise, data is increasingly distributed across different environments, providers and locations, expanding the attack surface for emerging cyberthreats. While the need for a strong data protection strategy has become more critical than ever, organizations find themselves caught in a difficult balancing act: they are struggling to manage the rising costs and complexities of business continuity and disaster recovery (BCDR) while ensuring that their business-critical data remains secure and recoverable.

To help IT teams and managed service providers (MSPs) understand how their peers are navigating these challenges, the State of Backup and Recovery Report 2025 gathered insights from more than 3,000 IT professionals, security experts and administrators worldwide. The report reveals how businesses are tackling today's biggest data protection challenges, the strategies they're adopting and the critical gaps that could leave them vulnerable to data loss and downtime.

So, where do organizations stand? The survey indicates that confidence in backup systems is declining, cloud adoption is outpacing data protection strategies and recovery expectations often don't match reality. In this article, we'll explore the key findings from the report to help IT teams and MSPs stay prepared for what comes next. For full insights and actionable strategies, you can download the complete report and see how your organization compares.

The backup paradox: Essential yet increasingly unreliable

Data backup and recovery should be a safety net for businesses, but for many, it has become a source of frustration, complexity and risk.
The numbers tell a clear story: backup inefficiencies are rising, IT teams are overburdened and security vulnerabilities remain widespread. Let's dive into the key findings.

Trend #1: Data loss is no longer a question of "if" but "when."

Nine in 10 organizations experienced operational downtime in the past 12 months.

Trend #2: Confidence in backup systems is declining.

Trust in backup solutions is slipping, leaving many businesses questioning whether they can reliably recover from data loss.
- Only 40% of IT teams feel confident in their backup systems.
- About 30% worry their backup strategy is inadequate, raising concerns about data security and recoverability.
- More than half of organizations plan to switch backup providers, citing cost, inefficiency and limited disaster recovery capabilities as major pain points.

Trend #3: Backup management is a time-consuming burden.

Managing backups isn't just complex; it's draining IT resources. As data volumes grow, IT teams are spending more time than ever maintaining backup systems, testing recovery processes and troubleshooting failures.
- IT teams now spend 10+ hours per week managing backups, adding to operational strain.
- The share of businesses spending more than three hours weekly on backups jumped from 5% in 2022 to 23% in 2024, a significant rise in time and effort.
- Around 35% of organizations wouldn't even know if backups were skipped or missed, highlighting critical gaps in monitoring and testing.

Trend #4: Security gaps are leaving backups exposed.

Backup systems are supposed to be the last line of defense against cyberthreats. However, many contain serious security flaws that put data at risk.
- About 25% of workloads lack policies that limit unauthorized access to backups, leaving them vulnerable to malicious attacks.
- Protection for credentials varies widely across businesses. Only 33% of businesses use dedicated password managers.
Others rely on less secure methods, like document storage platforms or browser-based password tools, introducing potential vulnerabilities.

The recovery gap: Why businesses can't bounce back fast enough

Backing up data is one thing; recovering it quickly and reliably is another. IT teams face significant hurdles in ensuring fast, seamless recovery when disaster strikes.

Trend #1: Quick and reliable data recovery remains a major challenge in data protection.

The top concerns IT teams cite with respect to data protection are costs, compliance requirements and the actual process of recovering data. Because IT teams spend hours managing and troubleshooting backup issues, little time is left for testing and validating recovery processes, increasing the risk of failure when it matters most.

Trend #2: Backup and DR testing gaps leave businesses vulnerable.

A backup solution is only as good as its ability to restore data, yet testing remains inconsistent across organizations.
- Only 15% of businesses conduct daily backup tests, meaning most operate with a level of risk that could jeopardize recovery in a crisis.
- Disaster recovery (DR) testing goes beyond verifying backups: it involves assessing recovery locations, timelines and effectiveness.
While around 20% of businesses conduct DR tests weekly and another 23% test monthly, the rest either test irregularly or not at all, leaving them unprepared for real-world recovery scenarios.

Trend #3: Most businesses overestimate their recovery readiness.

The lack of frequent testing is evident in actual recovery times.
- While close to 60% of businesses believe they can recover in a day, only 35% actually do.
- Alarmingly, more than 10% of businesses don't even know how long it would take to recover their business-critical SaaS data, if they could recover it at all.
- Among businesses using public cloud services like Azure, almost 90% rely on native data protection tools, yet more than 60% of them lack true DR capabilities.

The cloud dilemma: Embracing growth without sacrificing protection

The cloud is now the backbone of modern IT, powering everything from infrastructure to collaboration. Businesses are rapidly adopting cloud and SaaS solutions to enhance flexibility and scalability, but many are overlooking a critical factor: data protection.

Trend #1: Cloud adoption continues to surge.

The shift to cloud-hosted workloads is only growing stronger, driven by the need for agility and resilience.
- More than 50% of workloads are now hosted in the cloud, and that number is expected to reach 61% within two years.
- Most organizations now leverage hybrid and multicloud strategies to increase flexibility and avoid reliance on a single provider. However, gaps in cloud and SaaS data protection remain, putting critical business information at risk.
Notably, SaaS platforms now serve as the backbone of daily business operations, but without the right backup strategies, this data remains vulnerable.

Trend #2: Small and midsize businesses (SMBs) prefer Google Workspace, while enterprises favor Microsoft 365.
- Microsoft 365 dominates the market, with over 50% of organizations relying on it for collaboration and productivity.
- Google Workspace (35%) remains a top choice as well, particularly among SMBs.
- Microsoft 365 Entra ID (31%) and Dynamics (30%) show that businesses are also increasingly adopting specialized Microsoft products.
- Salesforce (25%) rounds out the top five.

Trend #3: Cost, workload compatibility, vendor lock-in and security concerns are the biggest barriers to cloud migration.

While cloud adoption continues to accelerate, businesses still face major hurdles in ensuring a seamless transition and securing their data.

The lessons learned: What IT leaders must prioritize now

The State of Backup and Recovery Report 2025 reveals that critical security gaps remain in securing on-premises, cloud, endpoint and SaaS data. There is a growing disconnect between backup investments and actual recovery confidence, with IT teams unsure whether they can restore data when it matters most. Without a more resilient approach to data protection, businesses risk prolonged downtime, drastic financial losses and irreversible data breaches.

Have you considered how much an outage could cost your organization per minute? According to the IT Outages: 2024 Costs and Containment Report, the average cost of unplanned downtime is $14,056 per minute per organization. Let's take a closer look at the breakdown of downtime costs across different business sizes.

Business continuity depends on faster, more resilient recovery. However, many organizations aren't as prepared as they think. To minimize downtime and financial losses, IT leaders must rethink their approach to BCDR.
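To put the report's per-minute downtime figure in perspective, here is a quick back-of-the-envelope calculation. The $14,056 figure comes from the report cited above; the outage durations are illustrative, not from the report:

```python
# Rough downtime-cost estimate based on the reported average of
# $14,056 per minute of unplanned downtime.
COST_PER_MINUTE = 14_056  # USD, per the IT Outages: 2024 Costs and Containment Report

def downtime_cost(minutes: float) -> float:
    """Return the estimated cost in USD of an outage of the given length."""
    return minutes * COST_PER_MINUTE

for minutes in (5, 60, 8 * 60):  # a brief blip, an hour, a full workday
    print(f"{minutes:>4} min -> ${downtime_cost(minutes):,.0f}")
```

Even a one-hour outage at the reported average runs to roughly $843,000, which is why recovery time objectives are usually measured in minutes, not days.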
A modern BCDR strategy goes beyond basic backup, incorporating multilayered security, automation and hybrid cloud solutions to strengthen resilience and ensure business continuity against today's sophisticated cyberthreats.

Protection alone isn't enough, though. Without regular testing, organizations are left guessing whether their recovery plans will hold up in a real crisis. More frequent backup and disaster recovery testing ensures that recovery objectives are met when it matters most. Automation plays a key role here: by automating testing, IT teams can continuously verify their ability to restore data within required timeframes, all without disrupting the production environment. This removes the manual burden and provides real insight into recovery readiness.

At the same time, stronger security controls are essential to protect backup environments from unauthorized access. Almost 94% of ransomware victims have their backups targeted by attackers, often leaving them no choice but to pay the ransom to regain access. On that front, improving credential management and enforcing stricter access controls can help prevent malicious actors from reaching backup infrastructure.

Final thoughts: The future of BCDR starts now

The IT landscape is changing, and with it, the risks are escalating. As businesses push further into the cloud and rely more on SaaS applications, their backup and disaster recovery strategies must evolve just as quickly. Cyberthreats are more advanced, downtime is more expensive and organizations can no longer afford to treat backup as an afterthought. To keep pace with this new reality, businesses must reassess their approach and strengthen their defenses against the growing threats that could bring operations to a halt.

For IT teams and MSPs, the insights from the State of Backup and Recovery Report 2025 provide a clear roadmap to assess vulnerabilities and improve resilience before disaster strikes.
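The automated restore verification described above can be sketched in a few lines. This is a minimal illustration only, not any vendor's workflow: the file names are invented, the "restore" is a plain file copy, and a real DR test would also time the restore against the recovery time objective (RTO).

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Content hash used to prove a restored file matches the original."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def verify_restore(source: Path, backup: Path, restore_dir: Path) -> bool:
    """Restore `backup` into `restore_dir` and confirm it matches `source`.

    Only integrity is checked here; a real DR test would also measure
    elapsed time against the RTO.
    """
    restored = restore_dir / source.name
    shutil.copy2(backup, restored)  # stand-in for the actual restore step
    return sha256(restored) == sha256(source)

# Demo against throwaway files so the check never touches production data.
with tempfile.TemporaryDirectory() as tmp:
    tmp = Path(tmp)
    src = tmp / "payroll.db"
    src.write_bytes(b"critical business data")
    bak = tmp / "payroll.db.bak"
    shutil.copy2(src, bak)  # "take a backup"
    (tmp / "restore").mkdir()
    assert verify_restore(src, bak, tmp / "restore"), "restore check failed"
    print("restore verified")
```

Because the check runs against a scratch directory, a scheduler (cron, a CI job) can run it continuously without disrupting production, which is the point the article makes about automated testing.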
Download the full report now to benchmark your strategy, uncover critical gaps, and build a stronger, more reliable BCDR plan for the future.
  • AI Hallucinations Can Prove Costly
    www.informationweek.com
Samuel Greengard, Contributing Reporter | March 13, 2025 | 5 Min Read

Image: David Kashakhi via Alamy Stock

Large language models (LLMs) and generative AI are fundamentally changing the way businesses operate -- and how they manage and use information. They're ushering in efficiency gains and qualitative improvements that would have been unimaginable only a few years ago.

But all this progress comes with a caveat. Generative AI models sometimes hallucinate: they fabricate facts, deliver inaccurate assertions and misrepresent reality. The resulting errors can lead to flawed assessments, poor decision-making, automation errors and ill will among partners, customers and employees.

"Large language models are fundamentally pattern recognition and pattern generation engines," points out Van L. Baker, research vice president at Gartner. "They have zero understanding of the content they produce."

Adds Mark Blankenship, director of risk at Willis A&E: "Nobody is going to establish guardrails for you. It's critical that humans verify content from an AI system. A lack of oversight can lead to breakdowns with real-world repercussions."

False Promises

Already, 92% of Fortune 500 companies use ChatGPT. As GenAI tools become embedded across business operations -- from chatbots and research tools to content generation engines -- the risks associated with the technology multiply.

There are several reasons why hallucinations occur, including mathematical errors, outdated knowledge or training data and an inability of models to reason symbolically, explains Chris Callison-Burch, a professor of computer and information science at the University of Pennsylvania.
For instance, a model might treat satirical content as factual or misinterpret a word that carries different meanings in different contexts.

Regardless of the root cause, AI hallucinations can lead to financial harm, legal problems, regulatory sanctions, and damage to trust and reputation that ripples out to partners and customers. In 2023, a New York City lawyer using ChatGPT filed a lawsuit that contained egregious errors, including fabricated legal citations and cases. The judge later sanctioned the attorney and imposed a $5,000 fine. In 2024, Air Canada lost a lawsuit when it failed to honor the price its chatbot quoted to a customer. The case resulted in minor damages and bad publicity.

At the center of the problem is the fact that LLMs and GenAI models are autoregressive, meaning they arrange words and pixels logically with no inherent understanding of what they are creating. AI hallucinations, most associated with GenAI, differ from traditional software bugs and human errors because they generate false yet plausible information rather than failing in predictable ways, says Jenn Kosar, US AI assurance leader at PwC.

The problem can be especially glaring in widely used public models like ChatGPT, Gemini and Copilot. "The largest models have been trained on publicly available text from the Internet," Baker says. As a result, some of the information ingested into the model is incorrect or biased. The errors become numeric arrays that represent words in the vector database, and the model pulls words that seem to make sense in the specific context.

Internal LLM models are at risk of hallucinations as well. AI-generated errors in trading models or risk assessments can lead to misinterpretation of market trends, inaccurate predictions, inefficient resource allocation or a failure to account for rare but impactful events, Kosar explains.
These errors can disrupt inventory forecasting and demand planning by producing unrealistic predictions, misinterpreting trends, or generating false supply constraints, she notes.

Smarter AI

Although there's no simple fix for AI hallucinations, experts say that business and IT leaders can take steps to keep the risks in check. The way to avoid problems is to implement safeguards around model validation, real-time monitoring, human oversight and stress testing for anomalies, Kosar says.

Training models with only relevant and accurate data is crucial. In some cases, it's wise to plug in only domain-specific data and construct a more specialized GenAI system, Kosar says. In some cases, a small language model (SLM) can pay dividends. For example, AI that's fine-tuned on tax policies and company data "will handle a wide range of tax-related questions on your organization more accurately," she explains.

Identifying vulnerable situations is also paramount. This includes areas where AI is more likely to trigger problems or fail outright. Kosar suggests reviewing and analyzing processes and workflows that intersect with AI. For instance, "a customer service chatbot might deliver incorrect answers if someone asks about technical details of a product that was not part of its training data. Recognizing these weak spots helps prevent hallucinations," she says.

Specific guardrails are also essential, Baker says. This includes establishing rules and limitations for AI systems and conducting audits using AI-augmented testing tools. It also centers on fact-checking and failsafe mechanisms such as retrieval-augmented generation (RAG), which combs the Internet or trusted databases for additional information. Including humans in the loop and providing citations that verify the accuracy of a statement or claim can also help.

Finally, users must understand the limits of AI, and an organization must set expectations accordingly.
Teaching people how to refine their prompts can help them get better results and avoid some hallucination risks, Kosar explains. In addition, she suggests that organizations include feedback tools so that users can flag mistakes and unusual AI responses. This information can help teams improve an AI model as well as the delivery mechanism, such as a chatbot.

Truth and Consequences

Equally important is tracking the rapidly evolving LLM and GenAI spaces and understanding performance results across different models. At present, nearly two dozen major LLMs exist, including ChatGPT, Gemini, Copilot, LLaMA, Claude, Mistral, Grok, and DeepSeek. Hundreds of smaller niche programs have also flooded the app marketplace. Regardless of the approach an organization takes, "in early stages of adoption, greater human oversight may make sense while teams are upskilling and understanding risks," Kosar says.

Fortunately, organizations are becoming savvier about how and where they use AI, and many are constructing more robust frameworks that reduce the frequency and severity of hallucinations. At the same time, vendor software and open-source projects are maturing. Concludes Blankenship: "AI can create risks and mitigate risks. It's up to organizations to design frameworks that use it safely and effectively."

About the Author

Samuel Greengard, Contributing Reporter

Samuel Greengard writes about business, technology, and cybersecurity for numerous magazines and websites. He is author of the books "The Internet of Things" and "Virtual Reality" (MIT Press).
  • Compliance in the Age of AI
    www.informationweek.com
Raghav K.A., Global Head of Engineering, IOT and Blockchain, Infosys | March 13, 2025 | 4 Min Read

Image: Andriy Popov via Alamy Stock

According to a 2024 survey, 97% of US business leaders whose companies had invested in AI confirmed positive returns. A third of those with existing investments plan to top that off with US $10 million or more this year.

While AI adoption is on a roll, public trust in the technology is declining rapidly amid rising threats such as phishing, deepfakes and ransomware. A global online survey of trust and credibility found that people's trust in AI organizations fell eight percentage points between 2019 and 2024. In the United States, the fall was precipitous -- from 50% to 35% -- signaling US consumers' concerns around AI.

Regulators have responded to the growing perils of digitization by evolving compliance mandates to govern the use of data and digital technologies. For example, from 2023 to 2025, different administrations added the G7 AI Principles, the EU AI Act, new OECD AI Guidelines and, in the US, an Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence to the list of AI regulations. The US also has a separate law, the US IoT Cybersecurity Improvement Act of 2020, to address the security of specific types of IoT devices.

As products and services turn increasingly digital, industry standards are changing to align with the transformation. Think HIPAA, PCI DSS, ISO 27001 and the US National Institute of Standards and Technology (NIST) framework, which extended its scope of guidance from critical infrastructure to organizations of all sizes in 2024.

These entities are working toward essential goals, such as ensuring safety, protecting fundamental rights and promoting ethical development and use of digital technologies. However, amid a growing sprawl of regulations across sectors, it is becoming challenging for enterprises to remain compliant.
Large organizations must continually perform compliance checks to meet the requirements of mandates, at significant cost. The task becomes harder when checks involve departments operating in silos.

At the same time, businesses must adopt technologies to innovate and stay relevant. By aligning technology and regulatory objectives, they can ensure that innovation and compliance do not work at cross-purposes. In addition, they should take a systematic approach to compliance by doing the following:

- Reassessing existing compliance practices: Regular review of compliance measures, including data governance policies, access and security protocols and breach response mechanisms, can help organizations identify gaps and vulnerabilities, prioritize areas of maximum risk and proactively strengthen compliance processes.

- Adopting robust information security: As data and data regulations proliferate, a solid information security management framework becomes essential for ensuring data security and privacy in line with regulations such as GDPR, COPPA, HIPAA, SEC/FINRA and so on. Besides recommending policies, controls and best practices for mitigating various information security risks, a framework facilitates continuous improvement by guiding enterprises to periodically examine and update controls, thereby fostering a security culture.

- Laying down data policies and procedures: Procedures and policies enforce compliance with evolving regulations by detailing the rules and responsibilities for collecting, storing, accessing or disposing of data. Involving stakeholders from different functions in policy formulation builds a compliance mindset among employees.

- Implementing comprehensive data protection: Data protection measures, including data governance, mitigate digital transformation risks and improve compliance. While data governance stipulates the guidelines for handling data, data management covers the tools and steps required to implement governance across the enterprise.
A privacy-by-design approach helps embed data privacy in systems from the start, rather than bolting it on later, which is less effective.

- Performing periodic internal data audits: Regular audits of data policies, practices and assets help organizations better understand their data and how it's being used, as well as align data management practices with compliance expectations. Advantages include increased customer trust, more efficient data management, improved data quality and a stronger security standing.

- Taking a compliance-first approach: Enterprises have adopted mobile-first, cloud-first, secure-first and AI-first approaches for their enterprise architecture and business functions. The same thinking should be extended with a compliance-first approach, and frameworks governing enterprise IT architecture should include compliance checklists.

The explosion in generative AI has brought ethical implications to the forefront, stressing the need for transparency, traceability, accountability, fairness and privacy in AI development. Responsible AI (RAI) combines technology and governance to help organizations pursue their AI ambitions without compromising customer interest or stakeholder trust. RAI emphasizes fairness in AI models to prevent the perpetuation of bias and demands accountability from organizations for AI usage. It addresses concerns around AI's lack of transparency by providing insights into data inputs, algorithmic models and decision-making criteria. It also improves explainability and reproducibility, allowing organizations to use AI confidently and safeguard data privacy rights. However, organizations should always keep a human in the loop on top of RAI governance to ensure complete compliance and trust.

Read more about: Regulation

About the Author

Raghav K.A., Global Head of Engineering, IOT and Blockchain, Infosys

Raghav K.A.
is SVP and global head of engineering, blockchain & sustainability services at Infosys, a global leader in next-generation digital services and consulting. At Infosys, Raghav is responsible for overseeing and growing client engagements in core product development and next-generation engineering technologies, including digital thread, generative design and AI/generative AI, across all industry verticals. He is an advisor to CTOs and CDOs in defining and implementing product strategy and digital transformation initiatives across the product value stream.
• Fosters' Whitechapel office tower finally approved after eight years
    www.bdonline.co.uk
But council warned conservation area may need to be reviewed if controversial 17-storey scheme is built

Foster & Partners' latest designs for the scheme on Commercial Street

A conservation area in Whitechapel may have to be reviewed following a decision by councillors to approve a Foster & Partners office tower which breaches local development policy, planning officers have warned.

Councillors at Tower Hamlets voted unanimously to approve Fosters' long-delayed plans for the 17-storey building at 2-6 Commercial Street yesterday evening at the scheme's fourth planning committee hearing. Planning officers had previously recommended the plans for refusal on two occasions for being located outside of a tall building zone and within the Whitechapel High Street conservation area. The 41,000sq m proposals will now be sent to the mayor of London for stage two approval.

It marks the furthest progress yet for a scheme which has been on the books for eight years, with plans designed by Fosters for a 20-storey tower on the site first submitted in 2017. But its approval now means that the site may have to be removed from the surrounding conservation area, at a minimum, in a future review if the scheme is built, planning officers said.

The site is located outside of a designated tall building zone

This is because "the modern townscape that would emerge would be at odds with the special character that the Conservation Area seeks to preserve or enhance", the officers' report said. Officers also warned the scheme's location outside of a designated tall building zone "has the potential to undermine the consistent and proper application of the Development Plan going forward, contrary to good spatial planning".

Last month's decision to effectively back the scheme was criticised by campaign group Save Britain's Heritage, which described it as "a blow to the local community who have consistently and strongly pushed back against these controversial plans which are oversized and unnecessarily destructive".

Conservation officer for the group Lydia Franklin said: "Conservation areas are created to protect our historic environment and guide development in a particular context. A tall building in this location would make a mockery of these protections and erode this area's unique character."

Fosters' first proposals for the site were withdrawn in 2020 and replaced with a 14-storey redesign, which was recommended for approval by planning officers but rejected the following year after amassing more than 200 objections.

A 14-storey version of the scheme which was rejected in 2021

Although the latest version of the scheme, submitted in February last year, was three storeys taller and had been recommended for refusal, councillors voted to defer the decision at a planning committee meeting in December to allow a site visit. It was again recommended for refusal in January, but councillors voted to back the scheme, pushing a final decision to a fourth committee hearing, which took place yesterday evening.

Councillors have argued that the scheme would be an effective use of the site, which is currently a car park, and would reduce anti-social behaviour in the area, including alleged drug dealing activity.
  • Northern Bureau for Architecture to build city-centre pavilion in Middlesbrough
    www.architectsjournal.co.uk
The pavilion will be constructed in a courtyard between the interlocking, L-shaped plans of the 19th-century Middlesbrough Central Library and the 21st-century Middlesbrough Institute for Modern Art (MIMA).

The 46m² pavilion, for Teesside University and MIMA, will comprise three parts: an enclosed structure, a semi-covered canopy and an open frame. The building will house space for workshops, formal events and gardening for local groups, as well as extra room to extend MIMA art exhibitions into a more public setting.

Northern Bureau for Architecture said it developed the pavilion design over a very short timescale to meet funding deadlines, delivering RIBA Stages 1-4 in 12 weeks after winning an open tender for the project. Construction will start on site on 31 March and is expected to complete in mid-May.

The pavilion is one of a series of connected cultural projects being delivered in Middlesbrough city centre for MIMA, in partnership with Middlesbrough Council, Middlesbrough Library, The Auxiliary, Platform Arts and Navigator North.
Source: Northern Bureau for Architecture
Northern Bureau for Architecture's garden pavilion for MIMA in Middlesbrough

The projects are all being funded through the Cultural Development Fund, a Department for Digital, Culture, Media and Sport (DCMS) fund administered by Arts Council England.

Northern Bureau for Architecture says the MIMA garden pavilion will become "a significant cultural addition to Middlesbrough's city centre", offering "grades of threshold" into the garden and supporting different activities throughout the seasons.

The pavilion's enclosed element will consist of a simple rectangular form, opening out to the semi-covered element: a steel-framed cylindrical drum structure inspired by silos, cooling towers and steel gasometers, with a clear roof to see the tree canopy above.

The architect says the open-framed element will consist of a second cylindrical drum structure at the opposite end of the garden, which "invites unplanned appropriation and play". The structures will be surrounded by a circular track supporting an enclosing curtain, to offer flexibility in response to function and weather.

A material palette of galvanised steel, black metal cladding, Cor-ten and red brickwork borrows directly from the existing buildings, and is organised to establish a tonal gradient from the cool monotonality of MIMA to the rich, warm reds of Central Library. Northern Bureau for Architecture said the new interventions aim to establish "threads of connection" between the two bold but contrasting buildings that frame the garden, as well as recognising the rich layering of collective activity that has taken place in the garden over the years.

The practice, which was founded in 2022 and is based in County Durham, featured in a 2023 AJ article describing its practice ethos of 'testing, experimentation and careful building assembly'.

Project data
Location: Middlesbrough
Local authority: Tees Valley Combined Authority
Type of project: Cultural
Client: Teesside University + Middlesbrough Institute of Modern Art (MIMA)
Architect: Northern Bureau for Architecture
Landscape architect: Southern Green
Structural engineer: Billinghurst George & Partners
M&E consultant: NEECO
Main contractor: AWG Engineering
Funding: Arts Council England
Contract duration: 12 weeks
Gross internal floor area: 46m²
Form of contract and/or procurement: JCT Minor Works with Contractor's Design
  • Best Internet Providers in Everett, Washington
    www.cnet.com
    Whether you're a new resident or have lived there for years, it's time to expand your broadband horizons in Everett. Our CNET experts have picked the best home internet providers in this Washington city for you.
  • iPhone 16E vs. iPhone SE (2022): How Apple Upgraded Its Budget Phone
    www.cnet.com
    A higher price tag also brings a handful of new features and capabilities.
  • The Next Flu Pandemic Could Be Worse Than Covid If We Don't Heed History
    www.scientificamerican.com
    March 13, 2025 · 9 min read

The Next Flu Pandemic Could Be Worse Than Covid If We Don't Heed History

COVID and the 1918 flu pandemic gave us playbooks on how to prepare for the next pandemic. But we aren't using them.

By Arijit Chakravarty, Lyne Filiatrault & T. Ryan Gregory, edited by Madhusree Mukerjee

Dead birds are collected along the coast in the Vadso municipality of Finnmark in Norway following a major outbreak of bird flu on July 20, 2023. Oyvind Zahl Arntzen/NTB/AFP via Getty Images

The world is divided by war. Influenza outbreaks smolder in livestock herds and bird flocks for years. The public is deeply skeptical of the value of medical interventions. Public health agencies offer misleading advice and are focused only on keeping the public calm. There is a shortage of qualified medical professionals, with no end in sight.

No, this isn't 2025; it's 1918. In the pivotal book The Great Influenza, historian John Barry lays out the conditions that primed the population of the U.S. that year for one of the worst plagues in history and acted like so much dry tinder just waiting for a spark. That spark exploded into the conflagration of the 1918 influenza pandemic, which killed an estimated 50 million people worldwide and left many others disabled.

A little more than a century later, now is perhaps as good a time as any to ask the question: How prepared are we for another influenza pandemic? On the surface, this is an easy question to answer. Modern medicine and public health have advanced far beyond 1918. Whereas the scientists of that era struggled to identify the germ that caused the pandemic, we live in a time of genomic sequencing and global infectious disease surveillance, of mRNA vaccine technology and antiviral medications.
Our governments have pandemic preparedness plans, stockpiles of vaccines and drugs, and, having dealt with the COVID pandemic, experience with contact tracing and isolation.

Other conditions, however, are eerily similar to those of 1918. Geopolitical crises crowd public health concerns off the front page of newspapers. A dangerous influenza strain, in this case the H5N1 avian flu virus, has recently been circulating freely within poultry flocks, spreading widely in livestock herds in the U.S. and causing infections in farm workers. False lessons drawn from the COVID pandemic have driven public skepticism of medical information to all-time highs. Public health agencies sometimes offer contradictory and falsely soothing messages, further eroding their credibility. And after five years of COVID, hospital systems are stretched thin, and burnout and staffing shortages have thinned the ranks of the doctors and nurses who will be on the front line of the next pandemic. Making matters worse, the Trump administration's interventions over the past two months have gravely weakened surveillance of and control over the virus's spread.

The global response to the COVID pandemic offers little solace. In late 2019, as SARS-CoV-2, the virus that causes COVID, gripped China, infectious disease surveillance failed across much of the rest of the globe. Western governments faltered right out of the gate at limiting the spread of the virus; contact tracing detected fewer than 2 percent of all COVID cases in the U.S., for example. The pandemic response plan was ignored, and molecular tests were too few and too late. There were not enough high-quality masks, and antiviral drugs for COVID had not yet been developed.
The plan was to flatten the curve, but in practice, hospitals ran out of beds, intensive care units ran out of oxygen and morgues ran out of space. While lives were saved by social distancing and eventually vaccines, millions also died needlessly across the globe. They were victims of poor pandemic policy and a sluggish public health response, as well as misinformation and disinformation about vaccines and other health measures.

But that was, and still is, a different pandemic, one caused by a coronavirus rather than influenza, with a far lower death rate for acute cases and a somewhat different set of challenges. In contrast, when pandemic influenza hit in 1918, it killed 3 to 5 percent of the world's population, and around half of those deaths were in young and healthy people. A pandemic similar in scale today would leave 200 to 400 million dead.

Revisiting the Deadly 1918 Pandemic

It's hard to imagine now, but the 1918 influenza was far worse than the flu we know. Although many affected people experienced a severe bout of seasonal flu (fever, chills, body aches and headaches, followed by recovery), some fared a lot worse. As Barry puts it, these people came with "an extraordinary array of symptoms, symptoms either previously unknown entirely in influenza or experienced with previously unknown intensity." Those symptoms included agonizing joint pain, burning pain above the diaphragm, subcutaneous emphysema (which occurs when pockets of air accumulate just beneath the skin), ruptured eardrums, kidney failure and severe nosebleeds.

Scientists today think that 1918 influenza mainly killed in one of three ways: through cytokine storms, acute respiratory distress syndrome (ARDS) or secondary bacterial pneumonia. (If these terms sound familiar, it's because COVID killed in much the same way in an immunologically naive population in 2020.)
Cytokine storms occur during an extreme immune response in which too many immune signals in the form of cytokine proteins are released in a short period of time and cause severe tissue damage. When a cytokine storm hits the lungs, the result is ARDS, a virtual scorching of lung tissue, according to Barry. The lungs of people with ARDS fill with fluid, which gums up oxygen transfer to blood vessels and eventually causes organs throughout the body to fail. Some survivors of the 1918 influenza were left with profoundly weakened immune systems and fell victim to bacterial pneumonia in the weeks that followed infection. Any of these three conditions can land a person in the intensive care unit today and carries a high risk of death. Similar effects were seen in hospitalized patients during the 2003 SARS outbreak; one of us (Filiatrault) was the emergency physician on duty when the first case of SARS in Vancouver was detected.

Unfortunately, we gain very little protection against pandemic influenza from our past infections and vaccinations against seasonal flu. A hallmark of pandemic influenza viruses is that they are just different enough from preceding strains that they evade the body's immune defenses almost entirely. Influenza has a segmented genome, which increases the chance that its genetic material, or RNA, is shuffled into new forms through recombination when two different influenza viruses infect the same host. And back-and-forth transmission between humans and animals lets the virus mix and match parts of its genome.
The recombination route is a pretty efficient way to get to a brand-new influenza virus, and it's what led to the 1918, 1957, 1968 and 2009 influenza pandemics. At this point, if a seasonal flu virus particle, which has evolved to spread efficiently between people, recombines with an H5N1 avian flu virus (which has historically killed about half of those it has infected, although this number has varied widely from year to year), the resulting virus could be at once fairly deadly and highly transmissible.

The U.S. is currently in the midst of the worst seasonal flu outbreak in more than a decade, and in addition, H5N1 has been ravaging poultry flocks. If the risk of recombination between human and avian influenza from poultry weren't enough, there's also a threat from cows: two versions of the virus are circulating in dairy herds as well, and one of the strains is particularly concerning. So far, the mortality rate for H5N1 this year seems to be low, but that low mortality rate is far from guaranteed in a dynamic situation such as this.

Imagining a Flu Pandemic Today

What would happen if a flu strain capable of causing a pandemic hit today? Let's walk through a scenario in which an outbreak is spreading quickly through, say, New York City, and see how things would go. First, we know that contact tracing and surveillance would likely fail in such a situation; they failed during the early days of COVID, and they have failed already for some cases of H5N1. In some human cases of H5N1, the source of transmission is unknown, and there is evidence of asymptomatic transmission between people. And the Trump administration's recent actions have created dangerous new vulnerabilities.

The White House's plan to lower egg prices emphasizes alternatives to culling infected flocks, but that could increase transmission among hens and to humans working with those flocks. Bird flu surveillance efforts have been undermined by turmoil at the U.S.
Department of Agriculture, where 25 percent of staff members in an office responsible for coordinating the bird flu response were fired last month (though at least some termination letters were later rescinded), and at the Epidemic Intelligence Service of the Centers for Disease Control and Prevention (CDC), which faced similar cuts (later reversed) around the same time. The shutdown of the U.S. Agency for International Development, which supported the control of infectious disease worldwide, and the directive to limit communication between the CDC and the World Health Organization have only made things worse at a critical time.

Vaccines are unlikely to completely deflect the course of an oncoming pandemic either. Although H5N1 vaccines exist in the nation's stockpile, they are not yet approved by the Food and Drug Administration. Additionally, they would cover only a fraction of the world's population, and their efficacy against a brand-new mutant strain of H5N1 is unknown. While mRNA vaccines have been talked up, it would take months to ramp up production of them; they would arrive too late to nip an incipient pandemic in the bud. Falling rates of vaccine coverage for seasonal flu suggest that vaccine uptake would be an uphill battle, made harder by changes to the CDC's communication policy. Although the U.S. government has contracts in place for a pandemic influenza vaccine, those contracts are now being reconsidered.

The situation is worse for antivirals. There is good reason to question the efficacy of the one flu drug that is stockpiled, oseltamivir (Tamiflu), even against seasonal flu, and how well it would work against a pandemic strain is completely unknown. Cuts to medical research and a shift away from infectious disease research could hardly come at a worse time.

The one thing that we do know works against influenza is masking. Increased masking and physical distancing during 2020 led to the first known extinction of a seasonal flu strain.
Once again, though, there may not be enough masks to go around, and not everyone will use them. It doesn't help that public health guidance for influenza emphasizes handwashing and vaccination and mentions masking only as an additional preventive strategy, if at all. To date, there is little evidence that handwashing reduces the spread of influenza, which transmits through the air.

When it comes to the ways in which we can expect pandemic influenza to kill (ARDS, cytokine storms and secondary bacterial pneumonia), medicine doesn't handle these deadly threats that much better than it did a century ago. Malnutrition, a predisposing factor for death in 1918, remains a concern worldwide and in the U.S.

This past experience, along with the fact that infectious disease surveillance for H5N1 has failed already and the time lag between infections and hospitalizations, during which a virus can spread, suggests that if an outbreak of H5N1 were to erupt in New York City tomorrow, it would be spreading in the community by the time the first cases showed up in hospitals. Without contact tracing in place, not much could be done to keep the outbreak from growing exponentially. With medical infrastructure stretched thin, hospitals would overflow eventually, and refrigerated trucks would be back. Commercial flights would scatter the virus across the globe far more efficiently than the troop ships and trains of 1918. Within a week, the sparks of the next pandemic would land all around the planet. And as we watched the world go up in flames, all that would be left for public health agencies to offer would be soothing reassurances that it is not yet time to panic.

While it is never time to panic, it is never too soon to prepare. Sadly, it is relatively clear to anyone paying attention that, plans on paper notwithstanding, we are every bit as vulnerable to pandemic influenza now as we were a century ago. Those who forget history, the saying goes, are condemned to repeat it.
At this point, the next influenza pandemic is not a question of if but when. And the when gets closer with every new H5N1 infection in humans. There are practical steps that can be taken to make us safer: public health officials can implement better infectious disease surveillance and biosecurity, share public health information in a more transparent way, make sure stockpiles of masks and other safety equipment are replenished, plan for surges in hospital utilization, and update guidelines on airborne spread and the importance of masking in preventing transmission. Rather than downplaying potential risks, public health officials should focus on mitigating them now.
  • Mummies From Ancient Egypt Smell Surprisingly Nice, Scientists Say
    www.scientificamerican.com
    March 13, 2025 · 4 min read

What Sniffing Mummies Taught Scientists about Ancient Society

Mummy aroma may provide insight into social class and historical period, according to a team of trained mummy sniffers

By Gayoung Lee, edited by Andrea Thompson

A selection of the mummified bodies in the exhibition area of the Egyptian Museum in Cairo. Emma Paolin

If you were asked to describe the scent emanating from an ancient Egyptian mummy like you'd discuss a high-end perfume or the bouquet of a fine wine, you might mention fragrance notes of old linen, pine resin and citrus oils, with just a whiff of pest repellent.

These vivid comparisons stem from a new laboratory analysis of nine mummies from various social classes and historical periods. Researchers from Slovenia, England, Poland and Egypt collaborated with the Egyptian Museum in Cairo to identify more than 50 unique compounds from air samples taken around each mummy. The samples were chemically analyzed and then presented to specially trained human sniffers, who were asked to describe them in descriptive, sensory language such as "sour" or "spicy." The team's findings, recently published in the Journal of the American Chemical Society, demonstrate how the study of smell can help to enrich our understanding of cultural artifacts such as mummies in a noninvasive way that includes local scientists.

The researchers vetted each mummy candidate carefully to obtain a wide range of smells, says study co-author Abdelrazek Elnaggar, professor of cultural heritage studies at Ain Shams University in Egypt. To collect smell samples, they inserted small tubes around each mummy (being careful not to touch the fragile remains) to siphon off gas molecules that the remains were still emitting. The team used gas chromatography and mass spectrometry to identify the chemical compounds in the samples.

Emma Paolin, a PhD researcher at the University of Ljubljana, smelling at the olfactory port of a gas chromatograph-mass spectrometer instrument. The analyst describes the smell in terms of quality, intensity and hedonic tone. Andrej Kriz for University of Ljubljana, Faculty of Chemistry and Chemical Technology

A group of people, mostly museum employees, was tasked with describing the smells. Elnaggar and colleagues trained the participants to identify particular materials used in mummification that could be associated with specific smells. They also learned how to distinguish between smells coming from mummies and those that might be caused by museum infrastructure or conservation treatments.

The resulting scent profiles were complex, but the mummies were most often described by the trained sniffers as "sweet," "woody" and "spicy." The researchers hope that the method could be used on a larger set of mummies to better understand the varied mummification practices that occurred in ancient Egypt. Different historical methods of mummification represent different materials used in mummification and also different quality of materials, Elnaggar explains. For example, the earliest Egyptian mummies, dating back to roughly 5000 B.C.E., formed when the remains of deceased individuals buried in hot, dry sand naturally mummified. Artificial mummification techniques began around 2700 B.C.E. and were most sophisticated during the New Kingdom, which started around 1500 B.C.E. During that time, bodies received thorough treatment with a variety of oils and resins. The oldest mummies in the study were from the New Kingdom, but the researchers found that mummies from the even later Late Period (around 660 to 330 B.C.E.)
did show some olfactory similarities to one another.

Additionally, it's possible that some differences in smell may be caused by variations in mummification practices for individuals of different social classes. Across time, individuals of high social status would be mummified with better-smelling or more intensely smelling natural extracts, says study co-author Matija Strlič, an analytical chemist at the University of Ljubljana in Slovenia. For instance, he explains, the bodies of pharaohs and other elites were treated with fresh natural salts and resins derived from expensive herbs, whereas salts and other materials were reused multiple times for the bodies of people from poorer classes. In the study, the most well-preserved mummy was in a coffin with a gilded mask, and even though it was one of the oldest, it had a wide variety of odor compounds, often found in higher concentrations than in the other mummies.

Using local conservators was a key part of this study, Elnaggar says, as they have a stake as caretakers of Egyptian cultural heritage and are exposed to the smell of artifacts in their work. In many ways, he says, this makes them well prepared to describe mummy scents for both researchers and casual museum visitors.

Smell is very closely linked to the area in our brain (the amygdala and hippocampus) that is responsible for processing memory and emotions, says Barbara Huber, an archaeochemist at the Max Planck Institute of Geoanthropology in Jena, Germany, who was not involved in the new study but curated a 2023 exhibit on mummy scents in Denmark. Very often, she notes, you see glass blocking the objects inside, and stories about the past are missing. Thus, in some ways, commonly used museum display methods betray how critical smell can be to our understanding of historical narratives, Huber says, especially for an incredibly aromatic experience such as mummification.
In order to truly experience cultural heritage, we need to involve all our senses, because the smells and sounds of heritage are inherent in getting a full experience of the past, Strlič says.

Can we expect to grab a bottle of mummy perfume from the museum shop soon? The researchers say this might not be off the table. Everyone would like to smell like ancient Egyptians, Elnaggar jokes: sweet, woody and spicy. What the team would like to do now is to share the experience with museum visitors, so they can enjoy it in an exhibition and even take it home.
  • Sony quietly expands PS Portal cloud streaming with PlayStation classics
    www.eurogamer.net
    Sony quietly expands PS Portal cloud streaming with PlayStation classics

Feature remains in beta. Image credit: PlayStation

News by Ed Nightingale, Deputy News Editor. Published on March 13, 2025

Sony has quietly expanded the cloud streaming feature on its PS Portal handheld to include a number of games from its Classics Catalogue. The streaming feature launched back in November last year and allows PS Plus Premium subscribers to stream games directly to the handheld without the need for a PS5 console. Since then, Sony has been quietly adding to the service, which now includes almost every PS1 and PSP game from the Classics Catalogue in the streaming library (thanks, TrueTrophies).

At its launch, the beta allowed players to access "select PS5 games", with over 120 games included. Now players can dive into the likes of Ape Escape, Tekken, Legend of Dragoon and more. As it remains in beta, the cloud streaming service is only available in 30 countries, though that does include the UK. As per the PlayStation Blog, the service can be accessed in: Austria, Belgium, Bulgaria, Canada, Croatia, Czech Republic, Denmark, Finland, France, Germany, Greece, Hungary, Ireland, Italy, Japan, Luxembourg, Malta, Netherlands, Norway, Poland, Portugal, Republic of Cyprus, Romania, Slovakia, Slovenia, Spain, Sweden, Switzerland, United Kingdom, and United States.

It's unclear when the service will arrive in full globally, but for now it's clear Sony is keen to add to the catalogue of games, even if it's not announcing these additions in its regular PS Plus updates. If you own a PS Portal, instructions on accessing the beta can be found on the PlayStation Support website.

The PS Portal was released in November 2023; it's impressive hardware but does require a strong internet connection to function well. It quickly sold out after release, though scalpers took advantage of demand. The device is also included in Sony's new PlayStation rental service, priced at 6.50 per month.
Separately, you can rent PS5 consoles each month, or a PS VR2 too.Yesterday, Sony announced its PS Plus Premium and Extra lineup for this month, including the excellent Prince of Persia: The Lost Crown and a number of classic Armored Core games from FromSoftware.