• THEHACKERNEWS.COM
    ResolverRAT Campaign Targets Healthcare, Pharma via Phishing and DLL Side-Loading
    Apr 14, 2025Ravie LakshmananMalware / Cybercrime Cybersecurity researchers have discovered a new, sophisticated remote access trojan called ResolverRAT that has been observed in attacks targeting healthcare and pharmaceutical sectors. "The threat actor leverages fear-based lures delivered via phishing emails, designed to pressure recipients into clicking a malicious link," Morphisec Labs researcher Nadav Lorber said in a report shared with The Hacker News. "Once accessed, the link directs the user to download and open a file that triggers the ResolverRAT execution chain." The activity, observed as recently as March 10, 2025, shares infrastructure and delivery mechanism overlap with phishing campaigns that have delivered information stealer malware such as Lumma and Rhadamanthys, as documented by Cisco Talos and Check Point last year. A notable aspect of the campaign is the use of localized phishing lures, with the emails crafted in the languages predominantly spoken in the targeted countries. This includes Hindi, Italian, Czech, Turkish, Portuguese, and Indonesian, indicating the threat actor's attempts to cast a wide net through region-specific targeting and maximize infection rates. The textual content in the email messages employs themes related to legal investigations or copyright violations that seek to induce a false sense of urgency and increase the likelihood of user interaction. The infection chain is characterized by the use of the DLL side-loading technique to initiate the process. The first stage is an in-memory loader that decrypts and executes the main payload while also incorporating a bevy of tricks to fly under the radar. Not only does the ResolverRAT payload use encryption and compression, but it also exists only in memory once it's decoded. 
    "The ResolverRAT's initialization sequence reveals a sophisticated, multi-stage bootstrapping process engineered for stealth and resilience," Lorber said, adding that it "implements multiple redundant persistence methods" via the Windows Registry and the file system, installing itself in different locations as a fallback mechanism.

    Once launched, the malware performs bespoke certificate-based authentication before establishing contact with a command-and-control (C2) server, bypassing the machine's root certificate authorities. It also implements an IP rotation system to connect to an alternate C2 server if the primary one becomes unavailable or is taken down. Furthermore, ResolverRAT is fitted with capabilities to sidestep detection through certificate pinning, source code obfuscation, and irregular beaconing patterns to the C2 server.

    "This advanced C2 infrastructure demonstrates the advanced capabilities of the threat actor, combining secure communications, fallback mechanisms, and evasion techniques designed to maintain persistent access while evading detection by security monitoring systems," Morphisec said.

    The ultimate goal of the malware is to process commands issued by the C2 server and exfiltrate the responses, breaking data larger than 1 MB into 16 KB chunks to minimize the chances of detection.

    The campaign has yet to be attributed to a specific group or country, although the similarities in lure themes and the use of DLL side-loading with previously observed phishing attacks allude to a possible connection. "The alignment [...] indicates a possible overlap in threat actor infrastructure or operational playbooks, potentially pointing to a shared affiliate model or coordinated activity among related threat groups," the company said.
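    The 16 KB chunking described above is a generic data-handling pattern rather than anything exotic. As a minimal, hypothetical sketch (function name, constants, and structure are illustrative only, not taken from Morphisec's analysis of the malware), splitting a payload larger than 1 MB into fixed 16 KB pieces looks like this:

```python
# Hypothetical illustration of the fixed-size chunking described in the
# report: data over 1 MB is broken into 16 KB pieces. Names and structure
# are illustrative only, not recovered from the malware itself.

CHUNK_SIZE = 16 * 1024        # 16 KB
THRESHOLD = 1024 * 1024       # 1 MB

def split_into_chunks(data: bytes) -> list[bytes]:
    """Return the data whole if it is small, else as a list of 16 KB chunks."""
    if len(data) <= THRESHOLD:
        return [data]
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]

# 1 MB + 1 byte splits into 64 full 16 KB chunks plus a 1-byte remainder.
chunks = split_into_chunks(b"\x00" * (THRESHOLD + 1))
print(len(chunks))
```

    The point of the technique is that many small transfers blend in with ordinary traffic far more easily than one large upload, which complicates size-based detection.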
    The development comes as CYFIRMA detailed another remote access trojan, codenamed Neptune RAT, that uses a modular, plugin-based approach to steal information, maintain persistence on the host, demand a $500 ransom, and even overwrite the Master Boot Record (MBR) to disrupt the normal functioning of Windows. It is being propagated freely via GitHub, Telegram, and YouTube. That said, the GitHub profile associated with the malware, called the MasonGroup (aka FREEMASONRY), is no longer accessible.

    "Neptune RAT incorporates advanced anti-analysis techniques and persistence methods to maintain its presence on the victim's system for extended periods and comes packed with dangerous features," the company noted in an analysis published last week. It includes a "crypto clipper, password stealer with capabilities to exfiltrate over 270+ different applications' credentials, ransomware capabilities, and live desktop monitoring, making it an extremely serious threat."
  • WWW.INFORMATIONWEEK.COM
    Trends in Neuromorphic Computing CIOs Should Know
    John Edwards, Technology Journalist & Author | April 14, 2025 | 5 Min Read | Image: Science Photo Library via Alamy Stock Photo

    Neuromorphic computing is the term applied to computer elements that emulate the way the human brain and nervous system function. Proponents believe the approach will take artificial intelligence to new heights while reducing computing platform energy requirements. "Unlike traditional computing, which incorporates separate memory and processors, neuromorphic systems rely on parallel networks of artificial neurons and synapses, similar to biological neural networks," observes Nigel Gibbons, director and senior advisor at consulting firm NCC Group, in an online interview.

    Potential Applications

    The current neuromorphic computing application landscape is largely research-based, says Doug Saylors, a partner and cybersecurity co-lead with technology research and advisory firm ISG. "It's being used in multiple areas for pattern and anomaly detection, including cybersecurity, healthcare, edge AI, and defense applications," he explains via email.

    Potential applications will generally fall into the same areas as artificial intelligence or robotics, says Derek Gobin, a researcher in the AI division of Carnegie Mellon University's Software Engineering Institute. "The ideal is you could apply neuromorphic intelligence systems anywhere you would need or want a human brain," he notes in an online interview.

    "Most current research is focused on edge-computing applications in places where traditional AI systems would be difficult to deploy," Gobin observes. "Many neuromorphic techniques also intrinsically incorporate temporal aspects, similar to how the human brain operates in continuous time, as opposed to the discrete input-output cycles that artificial neural networks utilize." He believes this attribute could eventually lead to the development of time-series-focused applications, such as audio processing and computer vision-based control systems.
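    The temporal, spike-based behavior Gobin describes can be illustrated with a toy leaky integrate-and-fire (LIF) neuron, the basic unit of many spiking neural networks. This is a minimal sketch with made-up constants, not a model of any particular neuromorphic chip:

```python
# Toy leaky integrate-and-fire (LIF) neuron: the membrane potential leaks
# over time, accumulates input, and emits a discrete spike when it crosses
# a threshold -- unlike the stateless input-output pass of a conventional
# artificial neural network. All constants here are illustrative.

def simulate_lif(input_current, threshold=1.0, leak=0.9):
    """Return a 0/1 spike train for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * leak + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)   # fire a spike
            potential = 0.0    # reset after firing
        else:
            spikes.append(0)
    return spikes

# A weak but steady input accumulates until the neuron fires periodically.
print(simulate_lif([0.3] * 10))
```

    Because the neuron's state decays between inputs, *when* a signal arrives matters as much as its size, which is the temporal dimension that makes spiking models a natural fit for audio processing and event-based vision.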
    Current Development

    As with quantum computing research, there are multiple approaches to both neuromorphic hardware and algorithm development, Saylors says. The best-known platforms, he states, are BrainScaleS and SpiNNaker. Other players include GrAI Matter Labs and BrainChip.

    Neuromorphic strategies are a very active area of research, Gobin says. "There are a lot of exciting findings happening every day, and you can see them starting to take shape in various public and commercial projects." He reports that both Intel and IBM are developing neuromorphic hardware for deploying neural models with extreme efficiency. "There are also quite a few startups and government proposals looking at bringing neuromorphic capabilities to the forefront, particularly for extreme environments, such as space, and places where current machine learning techniques have fallen short of expectations, such as autonomous driving."

    Next Steps

    Over the short term, neuromorphic computing will likely be focused on adding AI capabilities to specialty edge devices in healthcare and defense applications, Saylors says. "AI-enabled chips for sensory use cases are a leading research area for brain/spinal trauma, remote sensors, and AI-enabled platforms in aerospace and defense," he notes.

    An important next step for neuromorphic computing will be maturing a technology that has already proven successful in academic settings, particularly when it comes to scaling, Gobin says. "As we're beginning to see a plateau in performance from GPUs, there's interest in neuromorphic hardware that can better run artificial intelligence models -- some companies have already begun developing and prototyping chips for this purpose." Another promising use case is event-based camera technology, which shows promise as a practical and effective medium for satellite and other computer vision applications, Gobin says. "However, we have yet to see any of these technologies get wide-scale deployment," he observes.
    "While research is still very active with exciting developments, the next step for the neuromorphic community is really proving that this tech can live up to the hype and be a real competitor to the traditional hardware and generative AI models that are currently dominating the market."

    Looking Ahead

    Given the technology's cost and complexity, coupled with the lack of skilled resources, it's likely to take another seven to 10 years before widespread usage of complex neuromorphic computing occurs, Saylors says. "However, recent research in combining neuromorphic computing with GenAI and emerging quantum computing capabilities could accelerate this by a year or two in biomedical and defense applications."

    Mainstream adoption hinges on hardware maturity, cost reduction, and robust software, Gibbons says. "We may see initial regular usage within the next five to 10 years in specialized low-power applications," he predicts. "Some of this will be dictated by the maturation of quantum computing." Gibbons believes that neuromorphic computing's next phase will focus on scaling integrated chips, refining spiking neural network algorithms, and commercializing low-power systems for applications in robotics, edge AI, and real-time decision-making.

    Gibbons notes that neuromorphic computing may soon play an important role in advancing cybersecurity. The technology promises to offer improved anomaly detection and secure authentication, thanks to event-driven intelligence, he explains. Yet novel hardware vulnerabilities, unknown exploit vectors, and data confidentiality remain critical concerns that may hamper widespread adoption.

    About the Author: John Edwards, Technology Journalist & Author. John Edwards is a veteran business technology journalist. His work has appeared in The New York Times, The Washington Post, and numerous business and technology publications, including Computerworld, CFO Magazine, IBM Data Management Magazine, RFID Journal, and Electronic Design.
    He has also written columns for The Economist's Business Intelligence Unit and PricewaterhouseCoopers' Communications Direct. John has authored several books on business technology topics. His work began appearing online as early as 1983. Throughout the 1980s and 90s, he wrote daily news and feature articles for both the CompuServe and Prodigy online services. His "Behind the Screens" commentaries made him the world's first known professional blogger.
  • SCREENCRUSH.COM
    ‘The Last of Us’: Full Season 1 Recap
    Joel, Ellie, Tommy, the Fireflies, FEDRA. Who are all these people? The Last of Us is back for Season 2, and it has been a while since we’ve seen any clickers or Cordyceps monsters — or any of the human characters or their ongoing storylines, for that matter. Whether you feel like you need a refresher before you watch Season 2, or you didn’t watch Season 1 and would rather dive right in with the start of Season 2, our latest Last of Us video will help.

    It recaps the events of all nine Last of Us Season 1 episodes in just under 15 minutes, telling you everything you need to know about this world and its people so you can watch Season 2 fully prepared for all the gnarly zombie carnage as well as the clever symbolism. Watch our full video recap below.

    If you liked that video recapping everything you need to know from The Last of Us Season 1, check out more of our videos below, including one on all the Easter eggs in Season 2 Episode 1 of The Last of Us, one explaining why The Last of Us is such a great television show, and one on why the controversial The Last of Us Part II is actually a great video game. Plus, there are tons more videos over at ScreenCrush’s YouTube channel. Be sure to subscribe to catch all our future episodes. New episodes of The Last of Us premiere weekly on Sundays on HBO and Max.
  • WEWORKREMOTELY.COM
    Clipp.com: Senior Full-Stack Software Engineer
    Senior Full-Stack Software Engineer. Posted 3 hours ago.

    Clipp.com, the consumer-facing side of ValpakClipp, sells great local deals to consumers around the U.S., helping local businesses get new customers and families save money. At Clipp.com, we're looking for a talented Senior Software Engineer to help us build amazing things together.

    What You'll Do

    You'll lead software development projects, creating solutions that are robust, scalable, and easy to maintain. As a key technical voice on our team, you'll get to design systems that make a real impact.

    Your Day-to-Day

    - Design and implement thoughtful solutions to interesting problems
    - Share your knowledge by mentoring team members in your areas of expertise
    - Break down complex technical challenges into manageable pieces
    - Make smart, cost-effective decisions about our architecture
    - Write quality code that you'll be proud to ship to production

    If you enjoy collaborative problem-solving and building software that matters, we'd love to chat with you about joining our team!

    Essential Duties and Responsibilities

    - Design, build, and maintain products and features supporting Clipp.com.
    - Partner with the organization’s technical leaders to plan for growth of platform infrastructure.
    - Partner with other teams to use and provide feedback on internal APIs.
    - Translate user needs and business goals from stakeholders into technical terms in order to design solutions.
    - Implement business logic and write automated tests to deliver designs to production.
    - Adjust team processes, listening to feedback and guiding the team through the changes.
    - Maintain very deep knowledge of relevant Clipp.com systems.

    Education/Experience

    - Bachelor's degree in computer science or another engineering discipline; Master’s degree a plus.
    - Five years of progressively more complex experience delivering high-quality code centered around user needs to production.
    - Production experience with relational database query optimization, serverless functions, designing for cloud-native architecture, automated software testing, containers, and client-side performance tuning.

    Requirements and Qualifications

    - High level of fluency in Ruby and JavaScript.
    - Interest in optimizing the reliability, latency, and UX of user-facing applications.
    - Problem-solving skills, determination, and a growth mindset.
    - Excellent collaboration and communication skills.
    - Good at mental math.

    Apply now at Clipp.com.
  • WWW.TECHNOLOGYREVIEW.COM
    DOGE’s tech takeover threatens the safety and stability of our critical data
    Tech buzzwords are clanging through the halls of Washington, DC. The Trump administration has promised to “leverage blockchain technology” to reorganize the US Agency for International Development, and Elon Musk’s DOGE has already unleashed an internal chatbot to automate agency tasks—with bigger plans on the horizon to take over for laid-off employees. The executive order that created DOGE in the first place claims the agency intends to “modernize Federal technology and software.” But jamming hyped-up tech into government workflows isn’t a formula for efficiency. Successful, safe civic tech requires a human-centered approach that understands and respects the needs of citizens. Unfortunately, this administration laid off all the federal workers with the know-how for that—seasoned design and technology professionals, many of whom left careers in the private sector to serve their government and compatriots. What’s going on now is not unconventional swashbuckling—it’s wild incompetence. Musk may have run plenty of tech companies, but building technology for government is an entirely different beast. If this administration doesn’t change its approach soon, American citizens are going to suffer far more than they probably realize. Many may wince remembering the rollout of Healthcare.gov under the Obama administration. Following passage of the Affordable Care Act, Healthcare.gov launched in October of 2013 to facilitate the anticipated wave of insurance signups. But enormous demand famously took down the website two hours after launch. On that first day, only six people were able to complete the registration process. In the wake of the mess, the administration formed the US Digital Service (USDS) and 18F, the digital services office of the General Services Administration. 
These agencies—the ones now dismantled at the hands of DOGE—pulled experienced technologists from industry to improve critical infrastructure across the federal government, including the Social Security Administration and Veterans Affairs.  Over the last decade, USDS and 18F have worked to build safe, accessible, and secure infrastructure for the people of the United States. DirectFile, the free digital tax filing system that the IRS launched last year, emerged from years of careful research, design, and engineering and a thoughtful, multi-staged release. As a result, 90% of people who used DirectFile and responded to a survey said their experience was excellent or above average, and 86% reported that DirectFile increased their trust in the IRS. Recently, Sam Corcos, a DOGE engineer, told IRS employees he plans to kill the program. When 21 experienced technologists quit their jobs at USDS in January after their colleagues were let go, they weren’t objecting on political grounds. Rather, they preferred to quit rather than “compromise core government services” under DOGE, whose orders are incompatible with USDS’s original mission. As DOGE bulldozes through technological systems, firewalls between government agencies are collapsing and the floodgates are open for data-sharing disasters that will affect everyone. For example, the decision to give Immigration and Customs Enforcement access to IRS data and to databases of unaccompanied minors creates immediate dangers for immigrants, regardless of their legal status. And it threatens everyone else, albeit perhaps less imminently, as every American’s Social Security number, tax returns, benefits, and health-care records are agglomerated into one massive, poorly secured data pool.  That’s not just speculation. We’ve already seen how data breaches at companies like Equifax can expose the sensitive information of hundreds of millions of people. 
Now imagine those same risks with all your government data, managed by a small crew of DOGE workers without a hint of institutional knowledge between them.  Making data sets speak to each other is one of the most difficult technological challenges out there. Anyone who has ever had to migrate from one CRM system to another knows how easy it is to lose data in the process. Centralization of data is on the administration’s agenda—and will more than likely involve the help of contracting tech companies. Giants like Palantir have built entire business models around integrating government data for surveillance, and they stand to profit enormously from DOGE’s dismantling of privacy protections. This is the playbook: Gut public infrastructure, pay private companies millions to rebuild it, and then grant those companies unprecedented access to our data.  DOGE is also coming for COBOL, a programming language that the entire infrastructure of the Social Security Administration is built on. According to reporting by Wired, DOGE plans to rebuild that system from the ground up in mere months—even though the SSA itself estimated that a project like that would take five years. The difference in those timelines isn’t due to efficiency or ingenuity; it’s the audacity of naïveté and negligence. If something goes wrong, more than 65 million people in the US currently receiving Social Security benefits will feel it where it hurts. Any delay in a Social Security payment can mean the difference between paying rent and facing eviction, affording medication or food and going without.  There are so many alarms to ring about the actions of this administration, but the damage to essential technical infrastructure may be one of the effects with the longest tails. Once these systems are gutted and these firewalls are down, it could take years or even decades to put the pieces back together from a technical standpoint. 
And since the administration has laid off the in-house experts who did the important and meticulous work of truly modernizing government technology, who will be around to clean up the mess?   Last month, an 83-year-old pastor in hospice care summoned her strength to sue this administration over its gutting of the Consumer Financial Protection Bureau, and we can follow her example. Former federal tech workers have both the knowledge and the legal standing to challenge these reckless tech initiatives. And everyday Americans who rely on government services, which is all of us, have a stake in this fight. Support the lawyers challenging DOGE’s tech takeover, document and report any failures you encounter in government systems, and demand that your representatives hold hearings on what’s happening to our digital infrastructure. It may soon be too late. Steven Renderos is the executive director of Media Justice. Correction: Due to a CMS error, this article was originally published with an incorrect byline. Steven Renderos is the author.
  • WWW.BDONLINE.CO.UK
    Green light for Morris & Co’s plans to turn Aldgate office into resi
    Scheme being developed by HUB and Bridges Fund Management

    [Image: How the revamped building will look]

    The City of London has approved Morris & Co’s plans for an office-to-residential conversion scheme in Aldgate. Developed by HUB and Bridges Fund Management, the Assemblies scheme at 150 Minories, which was approved under delegated powers, will provide 277 co-living homes and shared amenity spaces on the ground floor for both residents and the wider public. These include a pocket park, health hub, co-working space and café. The scheme will also see the existing building repurposed with a rooftop and rear extension.

    Others working on the plans include QS Circle, landscape architect Macgregor Smith and structural engineer London Structures Lab. Last year, the pair won planning approval for Cornerstone, a 174-home co-living scheme and office conversion on Beech Street in the City, next door to the Barbican. HUB and Bridges bought the 150 Minories site two years ago for £39m.

    [Image: View of the proposals at street level]
  • WWW.ARCHITECTSJOURNAL.CO.UK
    New Rogaland Theatre and Stavanger Museum
    The two-stage competition – organised by Stavanger Municipality – invites architects to step forward for an opportunity to create a new home for the city’s main theatre and museum, which occupy neighbouring sites close to Stavanger Harbour. The project will retain parts of the historic theatre – which was founded in 1883 and enlarged several times throughout the 20th century – and the nearby museum, while creating a new building between the two facilities in Kannikhøyden.

    According to the brief: ‘Stavanger City Council and Rogaland County Council have adopted a joint development of Rogaland Theatre and Stavanger Museum on the current plots. The oldest parts of the theatre building and the historic museum building will be preserved, and a new building will be established next to the historic facilities. In this connection, the client needs to have an architectural solution proposal prepared. The solution proposal prepared in the planning and design competition will form the basis for the project in the further work.’

    Stavanger is the third largest city in Norway. The latest competition comes four years after an international contest was held for a new ‘modern and future-orientated’ addition to Norway’s landmark Anno Museum in Hamar. Last year, the National Association of Norwegian Architects launched an international contest for a new visitor centre at the former home of the artist Nikolai Astrup.

    The latest project will create a new, enlarged home for the theatre, which has long outgrown its facilities and has been exploring regeneration options for the past 15 years. Judges will include Stavanger Museum chief executive Siri Aavitsland; Glenn André Kaada, director of the Rogaland Theatre; Ole Ueland, county mayor of Rogaland County Municipality; three yet-to-be-named architects; and a landscape architect. Six shortlisted teams will be invited to draw up concepts during the competition’s second round.
    An overall winner will be announced on 10 November.

    Competition details

    Project title: New Rogaland Theatre and Stavanger Museum
    Client: Stavanger Municipality
    Contract value: TBC
    First round deadline: 19 May 2025
    Restrictions: The competition begins with a qualification phase. Only those suppliers who meet the qualification requirements and are invited to participate in the competition will be given the opportunity to submit their solution proposals in the planning and design competition.
    More information: https://ted.europa.eu/en/notice/-/detail/236637-2025
  • WWW.CNET.COM
    Nvidia Says It's Making Chips in Arizona, Supercomputers in Texas
    Uncertainty around tariffs has put a new focus on where the chips behind AI are made.
  • WWW.SCIENTIFICAMERICAN.COM
    Does Your Language’s Grammar Change How You Think?
    April 14, 2025 | 2 min read

    Language Differences Control Your Brain’s Sentence-Prediction Habits

    The brain’s response to information depends on language’s grammatical structure. By Gayoung Lee, edited by Sarah Lewin Frasier.

    Understanding a simple-looking sentence such as “I read this article yesterday” actually requires some sophisticated conceptual computation: a subject (“I”) performed an action (“read”) on an object (“article”) at a specific time (“yesterday”). But the human brain routinely does this work nearly instantaneously based on the language’s grammatical rules, says linguist Andrea E. Martin of the Max Planck Institute for Psycholinguistics in the Netherlands. And Martin’s team has now found that the human brain accommodates fundamental grammatical differences across languages by adjusting how it processes each sentence.

    For a recent study in PLOS Biology, the researchers observed variations in Dutch-speaking participants’ brain waves while they listened to a Dutch-language audiobook. To visualize these changes, the scientists used a metric quantifying how many new “predictions” the brain makes of words that could come next in a sentence. This framework was then tested against three different parsing strategies, or linguistic models that illustrate how the brain builds information over time.

    Previous English-based studies with a similar setup concluded in favor of a model where listeners “wait and see” how each phrase in a sentence will end before interpreting it. But the Dutch speakers in Martin’s study leaned strongly toward a highly predictive model; participants tended to preemptively finish each phrase in their head before it was complete. (A third model, in which listeners wait to hear all the phrases in a sentence before interpreting any part of it, is seldom used in either language.)
    In Dutch language structure, verbs come near the end of a sentence rather than immediately after the subject as they do in English, explains study lead author Cas W. Coopmans, a postdoctoral researcher in New York University’s department of psychology. For instance, “‘I ate a cookie with chocolate’ in Dutch would be ‘I the cookie with chocolate ate.’ You would have to wait very long for the verb to come,” Coopmans says. “And that’s probably unrealistically late; you’re probably much more predictive in processing” the sentence.

    Neither parsing strategy is necessarily “better or worse” than the other, Coopmans adds. “It just happens to be suited to the language [people] are processing. So we seem to be quite flexible in that you might process one language differently from another simply because they have different properties.”

    The findings support the need for scientists to incorporate more diversity when crafting linguistic models, says Jixing Li, a linguist at City University Hong Kong, who was not involved in the new study. Her own work has illustrated how different brain regions activate when processing English or Chinese sentences because of their differing linguistic properties. If these studies are done only in neurotypical English-speaking adults, she says, crucial differences in processing will be missed. Li contends that this limitation defeats the purpose of the models, which are meant to provide a realistic picture of human language-based thinking.

    Diversifying subjects in studies of how the brain processes language “is going to help us capture how the brain is [understanding] the structured meaning of language, and the social utility of language, in many different ways,” Martin says. “There’s so much yet to be understood in the brain.”
  • WWW.EUROGAMER.NET
    Mario Kart World player discovers unannounced Thunderbird-like vehicle
    More people have managed to take Mario Kart World out for a test drive, and new discoveries are now popping up online. Read more