ME.PCMAG.COM
Deepfake 'Revenge Porn' Bill Clears Congress: What You Need to Know
Congress this week had a rare moment of bipartisanship, with the House passing the deepfake-focused Take It Down Act by a 409-2 vote. The bill criminalizes "revenge porn," or the non-consensual publishing of sexually explicit content, including images generated using AI. Social media platforms must take down such media within 48 hours of notice. The Senate unanimously approved the Take It Down Act in February; it now heads to President Trump's desk. He's expected to sign the bill, which also has the support of First Lady Melania Trump via her "Be Best" anti-cyberbullying campaign.

Non-consensual intimate images (NCII) are a growing problem for younger users. One of the inspirations behind the bill is a 14-year-old girl from Texas whose male classmate created a deepfake nude image of her and posted it on social media. Though the image was eventually taken down, no law required its immediate removal.

Multiple AI tools can convert ordinary photographs or videos into deepfake pornography. Apple removed three such apps from the App Store last year, and in August San Francisco sued 16 AI-powered websites that help "undress" women. According to NBC News, at least 15% of high school students know of deepfake images depicting someone they know.

"If you're a victim of revenge or AI-generated explicit imagery, your life changes forever. Most likely, you've been targeted by someone you know, and you're likely struggling to have that material removed from the internet," says bill sponsor Sen. Ted Cruz (R-TX). "The Take It Down Act empowers victims across the entire United States. It makes it a felony for these deviants to publish any non-consensual intimate images."

Critics argue the law could be misused, mainly because its enforcement lies with the Federal Trade Commission.
"This is an alarming expansion of the FTC's enforcement authority, especially under an administration that has openly expressed hostility to nonprofit organizations that do not serve its political interests," writes the Cyber Civil Rights Initiative (CCRI), an organization dedicated to combating image-based sexual abuse.

"Platforms that feel confident that they are unlikely to be targeted by the FTC (for example, platforms that are closely aligned with the current administration) may feel emboldened to simply ignore reports of NDII," CCRI adds.

The organization also found a loophole in the law "that would seemingly allow a person to disclose intimate images without consent so long as that person also appears in the image."

The bill could also be used for broader censorship, according to the Electronic Frontier Foundation (EFF). Platforms must view flagged content to remove it, whether it is publicly available or circulated through direct messages (DMs). As a result, they "may respond by abandoning encryption entirely (for messages) in order to be able to monitor content—turning private conversations into surveilled spaces," says India McKinney, EFF's director of federal affairs.