

Trump to sign law forcing platforms to remove revenge porn in 48 hours

About time

Take It Down Act’s 48-hour timeline may be both too fast and too slow.

Ashley Belanger – May 19, 2025 2:42 pm

Melania Trump smiles at Survivor of Non-Consensual Intimate Imagery, Elliston Berry (L) during a roundtable discussion on the Take It Down Act. Credit: Kayla Bartkowski / Staff | Getty Images News


After dragging its feet for years, America is finally taking its first big step toward shielding victims of non-consensual intimate imagery (NCII)—also known as revenge porn—from constantly being retraumatized online.
On Monday afternoon, Donald Trump is scheduled to sign the Take It Down Act into law. That means that within one year, every online platform will be required to remove both actual NCII and fake nudes generated by artificial intelligence within 48 hours of victims' reports or face steep penalties.
Supporters have touted the 48-hour timeline as remarkably fast, empowering victims to promptly stop revenge porn from spreading widely online. The law's passing comes at a time when AI-generated revenge porn is increasingly harming a wider pool of victims—including some who may have never shared a compromising photo, like dozens of kids in middle and high schools nationwide. Acknowledging the substantial harm to kids already, the law includes steeper penalties for NCII targeting minor victims, a threat lawmakers hope will help minors get harmful images removed "as soon as possible."
Critics have attacked the 48-hour timeline, though, warning that rushed platforms will likely over-remove and censor a broader range of content online. Trump has claimed that he would use the law to censor content he doesn't like, and the law does not explicitly exempt encrypted messages. So if platforms break encryption to stave off the liability threat, critics fear, that could pave the way for officials to censor private messages. For these reasons, legal challenges are expected.
Whether or not the Take It Down Act survives a constitutional challenge, victim testimony suggests the 48-hour timeline may fall short for other reasons.

Likely wearisome for victims, the law won't be widely enforced for about a year, while any revenge porn already online continues spreading. Perhaps most frustrating, once the law kicks in, victims will still need to police their own images. And the 48-hour window leaves time for content to be downloaded and reposted, leaving victims vulnerable on any unmonitored platforms.
Some victims are already tired of fighting this fight. Last July, when Google started downranking deepfake porn apps to make AI-generated NCII less discoverable, one deepfake victim, Sabrina Javellana, told The New York Times that she spent months reporting harmful content on various platforms online. And that didn’t stop the fake images from spreading. Joe Morelle, a Democratic US representative who has talked to victims of deepfake porn and sponsored laws to help them, agreed that "these images live forever."
"It just never ends," Javellana said. "I just have to accept it."
Andrea Powell—director of Alecto AI, an app founded by a revenge porn survivor that helps victims remove NCII online—warned on a 2024 panel that Ars attended that requiring victims to track down "their own imagery [and submit] multiple claims across different platforms [increases] their sense of isolation, shame, and fear."
While the Take It Down Act seems flawed, passing a federal law imposing penalties for allowing deepfake porn posts could serve as a deterrent for bad actors or possibly spark a culture shift by making it clear that posting AI-generated NCII is harmful.
Victims have long suggested that consistency is key to keeping revenge porn offline, and the Take It Down Act certainly offers that, creating a moderately delayed delete button on every major platform.
Although the Take It Down Act will surely make it easier than ever to report NCII, whether the law effectively reduces the spread of NCII online remains unknown and will likely hinge on whether the 48-hour timeline overcomes these criticisms.

Ashley Belanger
Senior Policy Reporter


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.
