Consumer Reports finds popular voice cloning tools lack safeguards
techcrunch.com
In Brief
Posted: 7:28 AM PDT · March 10, 2025
Image Credits: Pexels

Several popular voice cloning tools on the market don't have meaningful safeguards to prevent fraud or abuse, according to a new study from Consumer Reports.

Consumer Reports probed voice cloning products from six companies (Descript, ElevenLabs, Lovo, PlayHT, Resemble AI, and Speechify) for mechanisms that might make it more difficult for malicious users to clone someone's voice without their permission. The publication found that only two, Descript and Resemble AI, took steps to combat misuse. Others required only that users check a box confirming that they had the legal right to clone a voice or make a similar self-attestation.

Grace Gedye, policy analyst at Consumer Reports, said that AI voice cloning tools have the potential to supercharge impersonation scams if adequate safety measures aren't put in place.

"Our assessment shows that there are basic steps companies can take to make it harder to clone someone's voice without their knowledge, but some companies aren't taking them," Gedye said in a statement.