

Congress Passed a Sweeping Free-Speech Crackdown—and No One’s Talking About It
The TAKE IT DOWN Act passed with bipartisan support and glowing coverage. Experts warn that it threatens the very users it claims to protect.

By

Nitish Pahwa


May 22, 2025, 2:03 PM

Donald and Melania Trump during the signing of the TAKE IT DOWN Act at the White House on Monday.
Jim Watson/AFP via Getty Images

Had you scanned any of the latest headlines around the TAKE IT DOWN Act, legislation that President Donald Trump signed into law Monday, you would have come away with a deeply mistaken impression of the bill and its true purpose.
The surface-level pitch is that this is a necessary law for addressing nonconsensual intimate images—known more widely as revenge porn. Obfuscating its intent with a classic congressional acronym (Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks), the TAKE IT DOWN Act purports to help scrub the internet of exploitative, nonconsensual sexual media, whether real or digitally mocked up, at a time when artificial intelligence tools and automated image generators have supercharged its spread. Enforcement is delegated to the Federal Trade Commission, which will give online communities that specialize primarily in user-generated content (e.g., social media, message boards) a heads-up and a 48-hour takedown deadline whenever an appropriate example is reported. These platforms have also been directed to set up on-site reporting systems by May 2026. Penalties for violations include prison sentences of two to three years and steep monetary fines.

Public reception has been rapturous. CNN is gushing that “victims of explicit deepfakes will now be able to take legal action against people who create them.” A few local Fox affiliates are taking the government at its word that TAKE IT DOWN is designed to target revenge porn. Other outlets, like the BBC and USA Today, led off by noting first lady Melania Trump’s appearance at the bill signing.
Yet these headlines and pieces ignore TAKE IT DOWN’s serious potential for abuse. (Jezebel and Wired were perhaps the only publications to point out in both a headline and subhead that the law merely “claims to offer victims greater protections” and that “free speech advocates warn it could be weaponized to fuel censorship.”) Rarer still, with the exception of sites like the Verge, has there been any acknowledgment of Trump’s own stated motivation for passing the act, as he’d underlined in a joint address to Congress in March: “I’m going to use that bill for myself too, if you don’t mind, because nobody gets treated worse than I do online, nobody.”
Sure, it’s typical for this president to make such serious matters about himself. But Trump’s blathering about having it “worse” than revenge-porn survivors, and his quip about “using that bill for myself,” is not a fluke. For a while now, activists who specialize in free speech, digital privacy, and even stopping child sexual abuse have attempted to warn that the bill will not do what it purports to do.
Late last month, after TAKE IT DOWN had passed both the House and Senate, the Electronic Frontier Foundation wrote that the bill’s legislative mechanism “lacks critical safeguards against frivolous or bad-faith takedown requests.” For one, the 48-hour takedown deadline means that digital platforms (especially smaller, less-resourced websites) will be forced to use automated filters that often flag legal content—because there won’t be “enough time to verify whether the speech is actually illegal.” The EFF also warns that TAKE IT DOWN requires monitoring that could reach into even encrypted messages between users. If this legislation has the effect of granting law enforcement a means of bypassing encrypted communications, we may as well bid farewell to the very concept of digital privacy.
A February letter addressed to the Senate from a wide range of free-expression nonprofits—including Fight for the Future and the Authors Guild—also raised concerns over TAKE IT DOWN’s implications for content moderation and encryption. The groups noted that although the bill makes allowances for legal porn and newsworthy content, “those exceptions are not included in the bill’s takedown system.” They added that private tools like direct messages and cloud storage aren’t protected either, which could leave them open to invasive monitoring with little justification. The Center for Democracy and Technology, a signatory to the letter, later noted in a follow-up statement that the powers granted to the FTC in enforcing such a vague law could lead to politically motivated attacks, undermining progress in tackling actual nonconsensual imagery.
Techdirt’s Mike Masnick wrote last month that TAKE IT DOWN is “so badly designed that the people it’s meant to help oppose it,” pointing to public statements from the advocacy group Cyber Civil Rights Initiative, “whose entire existence is based on representing the interests of victims” of nonconsensual intimate imagery. CCRI has long criticized the bill’s takedown provisions and ultimately concluded that the nonprofit “cannot support legislation that risks endangering the very communities it is dedicated to protecting, including LGBTQIA+ individuals, people of color, and other vulnerable groups.” (In a separate statement, the CCRI highlighted other oddities within the bill, like a loophole allowing for nonconsensual sexual media to be posted if the uploader happens to appear in the image, and the explicit inclusion of forums that specialize in “audio files,” despite otherwise focusing on visual materials.) “The concerns are not theoretical,” Masnick continued. “The bill’s vague standards combined with harsh criminal penalties create a perfect storm for censorship and abuse.”


Let’s be clear: No one here is at all opposed to sound legislation that tackles the inescapable, undeniable problem of nonconsensual sexual material. All 50 states, along with the District of Columbia, have enacted laws criminalizing exploitative sexual photos and videos to varying degrees. TAKE IT DOWN extends such coverage to deepfake revenge porn, a change that makes the bill a necessary complement to these state laws—but its text is shockingly narrow on the digital front, criminalizing only A.I. imagery that’s deemed to be “indistinguishable from an authentic visual depiction.” This just leads to more vague language that hardly addresses the underlying issue.
The CCRI has spent a full decade fighting for laws to address the crisis of nonconsensual sexual imagery, even drafting model legislation—parts of which did make it into TAKE IT DOWN. On Bluesky, CCRI President Mary Anne Franks called this fact “bittersweet,” proclaiming that the long-overdue criminalization of exploitative sexual imagery is undermined by the final law’s “lack of adequate safeguards against false reports.” A few House Democrats looked to the group’s proposed fixes and attempted to pass amendments that would have added such safeguards, only to be obstructed by their Republican colleagues.
This should worry everyone. These groups made concerted efforts to inform Congress of the issues with TAKE IT DOWN and to propose solutions, only to be all but ignored. As Masnick wrote in another Techdirt post, the United States already has enough of a problem with the infamous Digital Millennium Copyright Act, the only other American law with a notice-and-takedown measure like TAKE IT DOWN’s, albeit designed to prevent the unauthorized spread of copyrighted works. Just ask any creatives or platform operators who’ve had to deal with abusive flurries of bad-faith DMCA takedown requests—even though the law includes a clause meant to protect against such weaponization. There’s no reason to believe that TAKE IT DOWN won’t be similarly exploited to go after sex workers and LGBTQ+ users, as well as anyone who posts an image or animation that another user simply doesn’t like and decides to report. It’s not dissimilar to other pieces of proposed legislation, like the Kids Online Safety Act, that purport to protect young netizens via wishy-washy terms that could criminalize all sorts of free expression.


Here’s a hypothetical: A satirical cartoonist comes up with an illustration of Trump as a baby and publishes it on a niche social media platform that they use to showcase their art. A Trump supporter finds this cartoon and decides to report it as abusive pornography, leading to a takedown notice on the cartoonist’s website. The artist and the platform do not comply, and a pissed-off Trump brings the full force of the law against this creator. The process of discovery leads prosecutors to break into the artist’s encrypted communications, revealing drafts of the drawing that the cartoonist had shared with friends. All of this gets the illustrator punished with a brief prison sentence and steep fine, fully sabotaging their bank account and career; the social media platform they used is left bankrupt and shutters. The artist is forced to migrate to another site, whose administrators see what happened to their former home and decide to censor political works. All the while, an underage user finds that their likeness has been used to generate a sexually explicit deepfake that has been spread all over Discord—yet their case is found to have no merit because the deepfake in question is not considered “indistinguishable from an authentic visual depiction,” despite all the Discord-based abusers recognizing exactly whom that deepfake is meant to represent.
It’s a hypothetical—but not an unimaginable one. It’s a danger that too few Americans understand, thanks to congressional ignorance and the media’s credulous reporting on TAKE IT DOWN. The result is a law that’s supposedly meant to protect the vulnerable but ends up shielding the powerful—and punishing the very people it promised to help.
