
FBI Warns iPhone And Android Users: Hang Up Now, Use This Code
www.forbes.com
Hang up and use a secret code to combat AI smartphone attacks, FBI says. NurPhoto via Getty Images

Update, March 21, 2025: This story, originally published March 20, has been updated with further details regarding the AI attacks facing Gmail users and the FBI secret code warning issued in response, as well as expert opinion from cybersecurity professionals.

There has been no shortage of AI-powered security threat warnings in recent weeks, from code that can compromise your Chrome password manager credentials to critical AI attacks costing hackers as little as $5 to create. But it's the deepfake attacks hitting smartphone users, despite the best efforts of the likes of Google to defend against them, that are of most concern. Indeed, these ongoing attacks are so convincing that security experts, including the FBI, have warned the public to hang up now and create a secret code by way of protection. Here's what you need to know and do.

FBI And Security Experts Warn Of AI-Powered Smartphone Attacks

Although you might immediately think of face-swapping videos when it comes to deepfake attacks, that is far from the complete threat picture. If you want to see how good you are at spotting a deepfake face, there's a quick test you can take, but be warned: it's much more challenging than you think. Voice fakes, driven by AI, really started gaining public attention after I wrote a viral article in 2024 concerning a security expert who almost got fooled, with potentially very costly consequences.

Adrianus Warmenhoven, a cybersecurity expert at NordVPN, has now added to the voices warning iPhone and Android users about the threat. "Phone scammers increasingly use voice cloning tools for their fraudulent activities because this kind of software has become more affordable and effective over time," Warmenhoven told me. A common approach, and one seen in ongoing attacks, is to use this deepfake audio to approach family members of the individual being impersonated, Warmenhoven said, and extort money by simulating an emergency.

Referencing an October 2024 report from Truecaller and The Harris Poll, America Under Attack: The Shifting Landscape of Spam and Scam Calls in America, Warmenhoven pointed to the startling statistic that, in the U.S. alone and across the previous 12 months, the total number of phone scam victims exceeded 50 million, with losses estimated at $452 per victim. "As deepfakes dramatically change the landscape of scam phone calls," Warmenhoven warned, "it is crucial to ensure that everyone in the family understands what voice cloning is, how it works, and how it could be used in scams, such as impersonating a family member to request money or personal information."

"Deepfakes will become unrecognizable," warned Siggi Stefnisson, cyber safety chief technical officer at trust-based security platform Gen, whose brands include Norton and Avast. "AI will become sophisticated enough that even experts may not be able to tell what's authentic."

The FBI Deepfake Smartphone Audio Attack Mitigation Advice

As I reported Dec. 7, 2024, the Federal Bureau of Investigation has also been warning the public about such attacks. Indeed, the FBI went as far as to issue public service alert number I-120324-PSA addressing this very subject.
Both the FBI and Warmenhoven recommend the same mitigation, as brutal and startling as it sounds: hang up, and create a secret code known only to your close family and friends.

Warmenhoven also advised that people should be cautious about the content of their social media postings. "Social media is the largest publicly available resource of voice samples for cybercriminals," Warmenhoven warned. "This means that everyone should be wary of what they post in terms of how it could be used to negatively impact their security through the rise of deepfakes, voice cloning, and other scams enabled by AI tools."

To mitigate the risk of these sophisticated and increasingly dangerous AI attacks against iPhone and Android users, the FBI said that people should hang up the phone immediately if they get a call claiming to be from a family member or close friend asking for money in such a fashion, and verify the identity of the caller through direct means themselves. The FBI also advised that all members of the public create a secret word or phrase, known only to you and your close contacts, and use it to identify any caller claiming to be someone in trouble, no matter how convincing they sound. And convincing they will be: the deepfake call will be based on public audio clips, from social media videos for example, fed through AI tooling to produce, in effect, that person saying anything that is typed in.
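For readers who think in code, the FBI's secret-word advice boils down to a shared-secret check: the caller must prove knowledge of something that never appears in public audio or on social media, so no voice clone can supply it. Here is a minimal Python sketch of that idea, with a hypothetical passphrase and function name used purely for illustration:

import hmac

# Hypothetical pre-agreed family passphrase. In real life this is
# memorized and spoken, never stored or posted anywhere online.
FAMILY_SECRET = "hypothetical-passphrase"

def caller_is_verified(spoken_phrase: str) -> bool:
    """Return True only if the caller produces the exact shared secret.

    hmac.compare_digest does a constant-time comparison; for a spoken
    check the real protection is simply that the secret is private,
    so a voice clone trained on public clips cannot know it.
    """
    return hmac.compare_digest(spoken_phrase.strip().lower(), FAMILY_SECRET)

# A convincing-sounding "relative" calls asking for money:
print(caller_is_verified("hypothetical-passphrase"))  # True: proceed
print(caller_is_verified("please, it's urgent"))      # False: hang up

The point of the sketch is the design, not the code: verification relies on information the attacker cannot harvest, which is exactly why the FBI pairs the secret word with the instruction to hang up and call back through a number you already trust.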