Trump Promises To Abuse Take It Down Act For Censorship, Just As We Warned

from the take-what-down,-mr.-president? dept

Thu, Mar 6th 2025 09:21am - Mike Masnick

During his address to Congress this week, Donald Trump endorsed the Take It Down Act while openly declaring his plans to abuse it:

"And I'm going to use that bill for myself too, if you don't mind, because nobody gets treated worse than I do online, nobody."

(You might think a sitting president openly declaring his intent to abuse a content moderation law would be big news. The media, apparently swamped with other Trump outbursts, didn't even seem to notice.)

This is, of course, exactly what we (and many others) warned about in December when discussing the Take It Down Act. The bill aims to address a legitimate problem, non-consensual intimate imagery, but does so with a censorship mechanism so obviously prone to abuse that the president couldn't even wait until it passed to announce his plans to misuse it.

And Congress laughed. Literally.

Let's talk about non-consensual intimate imagery (NCII) for a minute. (People used to call it "revenge porn," but that's a terrible name: it's not porn, it's abuse.) The tech industry, after a fairly slow start, has more recently been reasonably good at trying to address this problem. You've got NCMEC's Take It Down system helping kids get abusive images removed. You've got StopNCII.org doing clever things with hashes that let platforms identify and remove bad content without anyone having to look at it. These aren't perfect solutions, but they show what happens when smart people try to solve hard problems thoughtfully.
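For what it's worth, the basic idea behind that kind of hash matching is simple enough to sketch. The snippet below is only a toy illustration under my own assumptions, not StopNCII's actual system or API: real deployments typically use perceptual hashes (PDQ, for example) so that resized or re-encoded copies still match, while this sketch uses a plain SHA-256 for clarity, and every function name here is made up.

```python
import hashlib

# Hypothetical illustration of hash-list matching (not StopNCII's actual API).
# The person's own device computes a fingerprint locally and shares only the
# hash; the image never leaves the device and no human reviewer sees it.

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest for an image. Real systems use perceptual hashes
    so altered copies still match; a plain SHA-256 only catches exact
    duplicates, but it shows the idea."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hashes submitted by victims, shared across participating platforms.
blocked_hashes: set[str] = set()

def register_ncii(image_bytes: bytes) -> None:
    """Run on the victim's own device: only the hash is uploaded."""
    blocked_hashes.add(fingerprint(image_bytes))

def should_block(uploaded_bytes: bytes) -> bool:
    """Run by a platform at upload time: fingerprints get compared, not people."""
    return fingerprint(uploaded_bytes) in blocked_hashes
```

The design choice the article is pointing at is that matching happens on fingerprints, so platforms can cooperate on removal without anyone re-viewing the abusive image itself.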
But Congress (specifically Senators Ted Cruz and Amy Klobuchar) looked at all this work and said nah, let's just make websites legally liable if they don't take down anything someone claims is NCII within 48 hours. It's the "nerd harder or we fine you" approach to tech regulation.

You can't just write a law that says "take down the bad stuff." I mean, you can, but it will be a disaster. You have to think about how people might abuse it. The DMCA's notice-and-takedown system for copyright at least tried to include some safeguards: there's a counternotice process, and there are (theoretical) penalties for false notices. But TAKE IT DOWN? Nothing. Zero. Nada.

We already see thousands of bogus DMCA notices attempting to remove content with no basis in the law, even with those safeguards in place. What do you think will happen with a law that has no safeguards at all? (Spoiler alert: the president just told us exactly what will happen.)

Even given the seriousness of the topic, and the president's support, you might think that Congress would care about the fact that the bill almost certainly violates the First Amendment, and thus would stand a high likelihood of being tossed out as unconstitutional. CDT tried to warn them, explaining that forcing websites to take down content without any court review creates some thorny constitutional problems. (Who knew that requiring private companies to censor speech based on unverified complaints might raise First Amendment concerns? Well, everyone who's ever taken a constitutional law class, but apparently not Congress.)

Congress could have fixed those problems. But chose not to. As CDT put it:

"As currently drafted, however, the TAKE IT DOWN Act raises complex questions implicating the First Amendment that must be addressed before final passage. As a general matter, a government mandate for a platform to take down constitutionally protected speech after receiving notice would be subject to close First Amendment scrutiny. The question is whether a narrowly drawn mandate focused on NDII with appropriate protections could pass muster. Although some NDII falls within a category of speech outside of First Amendment protection, such as obscenity or defamation, at least some NDII that would be subject to the Act's takedown provisions, even though unquestionably harmful, is likely protected by the First Amendment. For example, unlike the proposed Act's criminal provisions, the takedown provision would apply to NDII even when it was a matter of public concern. Moreover, the takedown obligation would apply to all reported content upon receipt of notice, before any court has adjudicated whether the reported image constitutes NDII or violates federal law, let alone whether and how the First Amendment may apply. Legally requiring such take-down without a court order implicates the First Amendment."

Even if you think the concerns about fake takedown notices are overblown, shouldn't you want to make sure that the law would pass First Amendment scrutiny when it goes to court? It seems important.

Unfortunately, it does not appear that Congress paid attention. The Senate recently passed the Act via unanimous consent, and it's now headed to the House with strong support. Earlier this week, Melania Trump endorsed the bill, and Donald Trump briefly mentioned it during his address to Congress, and, as mentioned above, he explicitly revealed his plans to abuse it:

"And Elliston Berry, who became a victim of an illicit deepfake image produced by a peer. With Ellison's help, the Senate just passed the Take It Down Act and this is so important. Thank you very much, John. John Thune. Thank you. Stand up, John. [Applause] Thank you, John. Thank you all very much. Thank you and thank you to John Thune and the Senate. Great job. To criminalize the publication of such images online is terrible, terrible thing. And once it passes the House, I look forward to signing that bill into law. Thank you. And I'm going to use that bill for myself too, if you don't mind, because nobody gets treated worse than I do online, nobody."

There it is: a sitting president openly declaring his intent to abuse a content moderation law to remove speech he doesn't like. This isn't speculation or paranoia about potential misuse; it's an explicit promise, made in front of both houses of Congress, as well as multiple Supreme Court Justices, of his intent to weaponize the law against protected speech.

So here we are. Civil liberties groups have been jumping up and down and waving their arms about how this bill needs basic safeguards against abuse. The media, apparently suffering from Trump-crazy-statement fatigue, has mostly yawned. Congress, eager to show they're doing something about online abuse, doesn't seem interested in the details.

And why would they be? The bill is framed as protecting people from having compromising imagery posted online. Who could be against that? It's like being against puppies or ice cream.

But here's the thing: when someone tells you they plan to abuse a law, maybe listen? When that someone is the President of the United States, and he's saying it in front of Congress and multiple Supreme Court Justices, maybe pay extra attention?

The good folks at EFF have set up an action alert asking people to contact their representatives about the bill.
But realistically, the bill has a strong likelihood of becoming law at this point.

Look, I can already hear the counterargument: NCII is so harmful that we need strong measures, even if there's some collateral damage to free speech. And yes, NCII is genuinely harmful. But here's the problem: a law designed with giant, exploitable holes doesn't actually solve the problem. If it becomes primarily a tool for the powerful to suppress criticism (as Trump just promised), victims of actual NCII will be left with a discredited law that courts may eventually strike down entirely. The real goal should be a targeted, constitutional solution, not a censorship free-for-all that the president openly plans to weaponize against his critics. That serves no one except those who want to silence opposition.

We've spent the last two decades watching the DMCA's takedown system be abused to silence legitimate speech, even with its (admittedly weak) safeguards. Now we're about to create a similar system with no safeguards at all, precisely when the president has announced, to laughter and applause, his plans to weaponize it against critics.

Congress is building a censorship machine and handing the controls to someone who just promised to abuse it. That's not fighting abuse; that's enabling it.