
"It's not actually you": Teens cope while adults debate harms of fake nudes
Seeking adults in the room

Most kids know that deepfake nudes are harmful, Thorn survey says.

Ashley Belanger, Mar 3, 2025 9:16 am

Teens increasingly traumatized by deepfake nudes clearly understand that the AI-generated images are harmful. And apparently so do many of their tormentors, who typically use free or cheap "nudify" apps or web tools to "undress" innocuous pics of victims, then pass the fake nudes around school or to people they know online. A surprising recent Thorn survey suggests there's growing consensus among young people under 20 that making and sharing fake nudes is obviously abusive.

That's a little bit of "good news" on the deepfake nudes front, Thorn's director of research, Melissa Stroebel, told Ars. Last year, The New York Times declared that teens are now confronting an "epidemic" of fake nudes in middle and high schools that are completely unprepared to support, or in some cases even acknowledge, victims.

At least one 2023 survey found that deepfake pornography has already become normalized adult entertainment, with 74 percent of 1,522 US male deepfake porn users reporting they "don't feel guilty" about viewing it. And so far, adults' disagreement over the social norms around deepfake porn has sown chaos in schools. In Pennsylvania, parents even sought to prosecute the head of a private school that had to shut down after the administrator allegedly ignored, for months, reports of a single student creating fake nudes that eventually targeted nearly 50 female students.

But as their friends and classmates are harmed, many teens are apparently starting to understand that using AI to generate non-consensual nudes is not OK. On Monday, Thorn shared results of a survey of 1,200 young people ages 13 to 20, which found that 84 percent of young people "overwhelmingly recognize" deepfake nudes as abuse that harms the victim depicted.

Stroebel told Ars that for young people, making deepfake nudes is not yet normalized. But Thorn's survey, which found that 10 percent of surveyed young people personally know a victim, suggests that "more likely than not, it is happening in almost every school system around the country," and that likely "every town has had at least one person targeted in some way, shape or form," she said.

"Even if it is one person creating it, one person targeted, the likelihood is that it is spreading throughout the community," Stroebel told Ars, suggesting that such non-consensual sharing "is exponentially increasing the number of kids impacted and the scale of the harm to the child depicted."

Roberta Duffield, the director of intelligence for Blackbird.AI, which makes deepfake detection tools, told Ars that currently, "deepfake nudes targeting kids (and adults) represent one of the most insidious and wide-reaching harms of AI." And especially for young people "still forming their identities," the "psychological, reputational, and social consequences can be severe and long-lasting," Duffield said.
She agreed with Stroebel that a multi-pronged approach to the problem is urgently needed, involving not just schools and parents but also lawmakers and tech companies. "Protecting kids from AI-generated sexual material should be at the forefront of content moderation policies of social media platforms and in the legislative offices of policymakers," Duffield said.

"I was horny": How teens justify making fake nudes

Since nudify apps became more mainstream, the majority of victims have been female, but young men have been targeted, too, Thorn found, calling for more research into all young people's experiences. That trend perhaps helped push overarching sentiment, at least among young people, away from defending the sexual abuse as a mere joke. Stroebel was surprised to find that most young survey respondents didn't think the joke was funny anymore.

"Everyone will see it, they will be embarrassed, and it will never go away," a 13-year-old male respondent said. Another male respondent, 18 years old, seemed to understand the complexity even better, explaining that "it dehumanizes the person, as you use them for pleasure without consent."

"It really is both surprising and really hopeful to me that the kids have clarity on this subject," Stroebel told Ars, especially compared to the 2023 survey results finding that many adult men don't think there's anything wrong with fake nudes.

Stroebel told Ars that because young people are digital natives, they are more likely to become early adopters of emerging technologies like nudify apps; unfortunately, they also happen to be at a time in their lives when they're likely to engage in riskier behaviors. Motivations for creating deepfake nudes included "sexual curiosity, pleasure-seeking, revenge," and peer pressure, Thorn found.

One 14-year-old boy claimed he created deepfake nudes to strike back at a bully. A year older, another teen boy said he just wanted to "see what it would look like." "I was dared to," an 18-year-old female respondent admitted, while an 18-year-old man confessed, "I was horny" and "WASN'T thinking straight."

Only 9 percent of respondents disputed harms, rationalizing decisions to create, view, or share AI nudes by claiming that nobody is harmed. Most of those denying harms insisted that the images can't cause real harm because they're fake, while others suggested that victims aren't physically hurt or downplayed the gravity of the emotional pain inflicted.

"As soon as everyone knows it's a deepfake, all feelings of panic and fear are gone," a 16-year-old female respondent opined. "It's not actually you, so there's no pressure. It's a little stressful, but it's not actually their body."

A 13-year-old male respondent agreed it was unethical but ultimately not harmful, saying, "you control what offends you. Of course, it's wrong to make deepfake nudes, but ultimately it's fake."

Thorn suspects that young people's experiences with deepfake nudes are underreported, partly because victims may be too ashamed or traumatized to report them and partly because they may have no idea the fake nudes exist. About 27 percent of respondents told Thorn that they created fake nudes for personal consumption and never shared them.

However, most young people who create fake nudes share them (65 percent), Thorn found. And they're about as likely to share them with other kids at school (30 percent) as with online-only contacts (29 percent).
Duffield told Ars this urge to share is why education campaigns are needed to "help folks question content before sharing."

Fake nudes risk becoming normalized for teens

Thorn's survey shows that teens have more nuanced thoughts on deepfake nudes than some experts might expect. And news reports tracking scandals rocking schools show that society's attitude about fake nudes matters to teens, who are more likely to seek help combating harms from fake nudes (76 percent) than when coping with other forms of online sexual interactions (46 percent), Thorn found.

Most young people surveyed use online tools like blocking or reporting when confronting deepfake nudes. But more than half (57 percent) sought help offline, Thorn said, most often turning to a parent, school authority, friends, or police. To ensure that young people keep reaching out for help, more support resources are needed, Thorn's survey said, including helplines and legal aid.

Additionally, every school in the country urgently needs to draft an explicit policy making it clear to parents and students how the school responds when a deepfake nudes scandal hits, Stroebel told Ars. That should include crisis response plans, Duffield said. "Educators could establish standardized guidelines for handling deepfake incidents, provide resources for educators and parents, and collaborate with law enforcement to address legal challenges," Duffield said.

Thorn also recommends more research into how young people use blocking and reporting to combat threats online. Stroebel told Ars that she wants to better understand why kids choose to block versus report in certain circumstances. She's worried that kids overwhelmingly choose to block users sharing fake nudes rather than reporting the attack and prompting a broader platform response that could help minimize harm to more users. It's also concerning, she said, to think that particularly young users who feel imminently in danger may wrongly expect that reporting a fake nude triggers immediate action. Other changes tech companies could consider include prioritizing stronger deepfake detection, Duffield recommended.

Perhaps most urgently, Stroebel told Ars that institutional support is needed to stop fake nudes from becoming a normal part of the teenage experience. Right now, she said, kids want to know how to protect themselves and their friends, and, in this moment, there just isn't a clear message that making fake nudes carries serious consequences. Even schools where scandals have hit don't seem to publicly share their plans to support kids once the backlash dissipates, potentially leaving open a debate over what's right and wrong that, Stroebel said, leaves young people in harm's way the longer it goes unsettled.

"There is not a uniform institutional response, whether we are talking about at the policy level, at the family unit level, at the school level, and normalization ultimately ends up existing when it is widely accepted and it is anticipated," Stroebel said. "If there isn't an institutional response for too long" explicitly clarifying "that this is harmful behavior, then it becomes [about] individual opinion on if this is harmful or not," Stroebel said.

That risks leaving today's absent standard as the status quo and keeping "the control in hands of the person making a choice of whether or not they're going to create non-consensual abuse imagery," Stroebel said.

Many teens told Thorn that they're already acknowledging the abuse and already asking for help.
As a 15-year-old girl explained to Thorn, "even though it's not real," victims currently "have no way to prove that, and they can't just deny it, because their face is most likely on it."

Ashley Belanger, Senior Policy Reporter

Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.