Adults told her to move on. Instead, teen won fight to criminalize deepfakes.
"Victory" Adults told her to move on. Instead, teen won fight to criminalize deepfakes. Here's how one teen plans to fix schools failing kids affected by nudify apps. Ashley Belanger Apr 4, 2025 2:19 pm | 3 Credit: Akiko Aoki | Moment Credit: Akiko Aoki | Moment Story textSizeSmallStandardLargeWidth *StandardWideLinksStandardOrange* Subscribers only Learn moreWhen Francesca Mani was 14 years old, boys at her New Jersey high school used nudify apps to target her and other girls. At the time, adults did not seem to take the harassment seriously, telling her to move on after she demanded more severe consequences than just a single boy's one or two-day suspension.Mani refused to take adults' advice, going over their heads to lawmakers who were more sensitive to her demands. And now, she's won her fight to criminalize deepfakes. On Wednesday, New Jersey Governor Phil Murphy signed a law that he said would help victims "take a stand against deceptive and dangerous deepfakes" by making it a crime to create or share fake AI nudes of minors or non-consenting adultsas well as deepfakes seeking to meddle with elections or damage any individuals' or corporations' reputations.Under the law, victims targeted by nudify apps like Mani can sue bad actors, collecting up to $1,000 per harmful image created either knowingly or recklessly. New Jersey hopes these "more severe consequences" will deter kids and adults from creating harmful images, as well as emphasize to schoolswhose lax response to fake nudes has been heavily criticizedthat AI-generated nude images depicting minors are illegal and must be taken seriously and reported to police. 
The law imposes a maximum fine of $30,000 on anyone creating or sharing deepfakes for malicious purposes, as well as possible punitive damages if a victim can prove that images were created in willful defiance of the law.

Ars could not reach Mani for comment, but she celebrated the win in the governor's press release, saying, "This victory belongs to every woman and teenager told nothing could be done, that it was impossible, and to just move on. It's proof that with the right support, we can create change together."

On LinkedIn, her mother, Dorota Mani, who has been working with the governor's office on a commission to protect kids from online harms, thanked lawmakers like Murphy and former New Jersey Assemblyman Herb Conaway, who sponsored the law, for "standing with us."

"When used maliciously, deepfake technology can dismantle lives, distort reality, and exploit the most vulnerable among us," Conaway said. "I'm proud to have sponsored this legislation when I was still in the Assembly, as it will help us keep pace with advancing technology. This is about drawing a clear line between innovation and harm. It's time we take a firm stand to protect individuals from digital deception, ensuring that AI serves to empower our communities."

Doing nothing is no longer an option for schools, teen says

Around the country, as cases like Mani's continue to pop up, experts suspect that shame prevents most victims from coming forward to flag abuses, and that the problem is much more widespread than media reports suggest.

Encode Justice has a tracker monitoring reported cases involving minors, which also allows victims to anonymously report harms around the US. But the true extent of the harm currently remains unknown, as cops warn of a flood of AI child sex images obscuring investigations into real-world child abuse.

Confronting this shadowy threat to kids everywhere, Mani was named one of TIME's most influential people in AI last year for her advocacy fighting deepfakes.
She's not only pressured lawmakers to take strong action to protect vulnerable people, but she's also pushed for change at tech companies and in schools nationwide.

"When that happened to me and my classmates, we had zero protection whatsoever," Mani told TIME, and neither did other girls around the world who had been targeted and reached out to thank her for fighting for them. "There were so many girls from different states, different countries. And we all had three things in common: the lack of AI school policies, the lack of laws, and the disregard of consent."

Yiota Souras, chief legal officer at the National Center for Missing and Exploited Children, told CBS News last year that protecting teens starts with laws that criminalize sharing fake nudes and provide civil remedies, just as New Jersey's law does. That way, "schools would have protocols," she said, and "investigators and law enforcement would have roadmaps on how to investigate" and "what charges to bring."

Clarity is urgently needed in schools, advocates say. At Mani's school, the boys who shared the photos had their names shielded and were pulled out of class individually to be interrogated, but victims like Mani had no privacy whatsoever. Their names were blared over the school's loudspeaker system as boys mocked their tears in the hallway. To this day, it's unclear who exactly shared, and possibly still has copies of, the images, which experts say could haunt Mani throughout her life. And the school's inadequate response was a major reason why Mani decided to take a stand, as she seemingly viewed the school as a vehicle furthering her harassment.

"I realized I should stop crying and be mad, because this is unacceptable," Mani told CBS News.

Mani pushed for New Jersey's new law and claimed the win, but she thinks change must start at schools, where the harassment begins.
In her school district, the "harassment, intimidation and bullying" policy was updated to incorporate AI harms, but she thinks schools should go even further. Working with Encode Justice, she is helping to push a plan to fix schools failing kids targeted by nudify apps.

"My goal is to protect women and children, and we first need to start with AI school policies, because this is where most of the targeting is happening," Mani told TIME.

Encode Justice did not respond to Ars' request for comment. But its plan noted a common pattern in schools throughout the US. Students learn about nudify apps through ads on social media (Instagram reportedly drives 90 percent of traffic to one such app), where they can also usually find innocuous photos of classmates to screenshot. Within seconds, the apps can nudify the screenshotted images, which Mani told CBS News then spread "rapid fire" by text message and DMs, often over school networks.

To end the abuse, schools need to be prepared, Encode Justice said, especially since "their initial response can sometimes exacerbate the situation."

At Mani's school, for example, leadership was criticized for announcing the victims' names over the loudspeaker, which Encode Justice said never should have happened. Another misstep came at a California middle school, which delayed action for four months until parents went to police, Encode Justice said. In Texas, a school failed to stop images from spreading for eight months while a victim pleaded for help from administrators and police who failed to intervene. The longer the delays, the more victims are likely to be targeted.
In Pennsylvania, a single ninth grader targeted 46 girls before anyone stepped in.

Students deserve better, Mani feels, and Encode Justice's plan recommends that all schools create action plans to stop failing students and respond promptly to stop image sharing. That starts with updating policies to ban deepfake sexual imagery, then clearly communicating to students "the seriousness of the issue and the severity of the consequences." Consequences should include identifying all perpetrators and issuing suspensions or expulsions on top of any legal consequences students face, Encode Justice suggested. The group also recommends establishing "written procedures to discreetly inform relevant authorities about incidents and to support victims at the start of an investigation on deepfake sexual abuse." And, critically, all teachers must be trained on these new policies.

"Doing nothing is no longer an option," Mani said.

Ashley Belanger, Senior Policy Reporter

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.