Good morning! Today, senior correspondent Anna North is here to talk about the rise of deepfake nudes — and what teenagers are doing to fight back.
—Caroline Houck, senior editor of news
How do you stop deepfake nudes?
There’s a lot of debate about the role of technology in kids’ lives, but sometimes we come across something unequivocally bad. That’s the case with AI “nudification” apps, which teenagers are using to generate and share fake naked photos of their classmates.
At Issaquah High School in Washington state, boys used an app to “strip” photos of girls who attended last fall’s homecoming dance, according to the New York Times. At Westfield High School in New Jersey, 10th-grade boys created fabricated explicit images of some of their female classmates and shared them around school. Students from California to Illinois have had deepfake nudes shared without their consent, in what experts call a form of “image-based sexual abuse.”
Now advocates — including some teens — are backing laws that impose penalties for creating and sharing deepfake nudes. Legislation has passed in Washington, South Dakota, and Louisiana, and is in the works in California and elsewhere. Meanwhile, Rep. Joseph Morelle (D-NY) has reintroduced a bill that would make sharing the images a federal crime.
Francesca Mani, a 15-year-old Westfield student whose deepfaked image was shared, started pushing for legislative and policy change after she saw her male classmates making fun of girls over the images. “I got super angry, and, like, enough was enough,” she told Vox in an email sent via her mother. “I stopped crying and decided to stand up for myself.”
Supporters say the laws are necessary to keep students safe. But some experts who study technology and sexual abuse argue that they’re likely to be insufficient, since the criminal justice system has been so ineffective at rooting out other sex crimes. “It just feels like it’s going to be a symbolic gesture,” said Amy Hasinoff, a communications professor at the University of Colorado Denver who has studied image-based sexual abuse.
She and others recommend tighter regulation of the apps themselves so the tools people use to make deepfake nudes are less accessible in the first place. “I am struggling to imagine a reason why these apps should exist” without some form of consent verification, Hasinoff said.
Deepfake nudes are a new kind of sexual abuse
So-called revenge porn — nude photos or videos shared without consent — has been a problem for years. But with deepfake technology, “anybody can just put a face into this app and get an image of somebody — friends, classmates, coworkers, whomever — completely without clothes,” said Britt Paris, an assistant professor of library and information science at Rutgers who has studied deepfakes.
There’s no hard data on how many American high school students have experienced deepfake nude abuse, but one 2021 study conducted in the UK, New Zealand, and Australia found that 14 percent of respondents ages 16 to 64 had been victimized with deepfake imagery. Nude images shared without consent can be traumatic, whether they’re real or not. When she first found out about the deepfakes at her school, “I was in the counselor’s office, emotional and crying,” Mani said. “I couldn’t believe I was one of the victims.”
When sexual images of students are shared around school, they can experience “shaming and blaming and stigmatization,” thanks to stereotypes that denigrate girls and women, especially, for being or appearing to be sexually active, Hasinoff said. That’s the case even if the images are fake because other students may not be able to tell the difference.
Moreover, fake images can follow people throughout their lives, causing real harm. “These images put these young women at risk of being barred from future employment opportunities and also make them vulnerable to physical violence if they are recognized,” Yeshi Milner, founder of the nonprofit Data for Black Lives, told Vox in an email.
Stopping deepfake abuse may require reckoning with AI
To combat the problem, at least nine states have passed or updated laws targeting deepfake nude images in some way, and many others are considering them. In Louisiana, for example, anyone who creates or distributes deepfakes of minors can be sentenced to five or more years in prison. Washington’s new law, which takes effect in June, treats a first offense as a misdemeanor.
The federal bill, first introduced in 2023, would give victims or parents the ability to sue perpetrators for damages, in addition to imposing criminal penalties. It has not yet received a vote in Congress but has attracted bipartisan support.
However, some experts worry that the laws, while potentially helpful as a statement of values, won’t do much to fix the problem. “We don’t have a legal system that can handle sexual abuse,” Hasinoff said, noting that only a small percentage of people who commit sexual violence are ever charged. “There’s no reason to think that this image-based abuse stuff is any different.”
Some states have tried to address the problem by updating their existing laws on child sexual abuse images and videos to include deepfakes. While this might not eliminate the images, it would close some loopholes. (In one recent New Jersey lawsuit, lawyers for a male high school student argued he should not be barred from sharing deepfaked photos of a classmate because federal laws were not designed to apply “to computer-generated synthetic images.”)
Meanwhile, some lawyers and legal scholars say that the way to really stop deepfake abuse is to target the apps that make it possible. Lawmakers could regulate app stores to bar them from carrying nudification apps without clear consent provisions, Hasinoff said. Apple and Google have already removed several apps that offered deepfake nudes from the App Store and Google Play.
However, users don’t need a specific app to make nonconsensual nude images; many AI image generators could potentially be used in this way. Legislators could require developers to put guardrails in place to make it harder for users to generate nonconsensual nude images, Paris said. But that would require challenging the “unchecked ethos” of AI today, in which developers are allowed to release products to the public first and figure out the consequences later, she said.
“Until companies can be held accountable for the types of harms they produce,” Paris said, “I don’t see a whole lot changing.”
—Anna North, senior correspondent
One Flu Over the Cowcow’s Nest
Avian flu, which recently leaped from chickens to cows, has now been detected in milk. How worried should humans be about the outbreak? |
- What values do you want your child to have?: The answer is probably different depending on your party. [NPR]
- In case you somehow missed it: Kristi Noem seemingly thought that writing about killing a puppy would endear her to voters? [Vox]
- Arizona repeals its Civil War-era abortion ban: Why do these laws even exist in the first place? We've got the explainer for you. [Vox]
- Falling for AI: Things can apparently get “very steamy” when you hijack ChatGPT Plus and get it to respond like your boyfriend. [WSJ]
- On bodies and fitness: “These lifestyle gyms aren't competing with Ozempic — they're embracing it.” [Quartz]
- Dating apps suck: Will one made by academics instead of corporations be better? [Guardian]
Are baby bonds a good investment? |
A unique policy program that could help close the racial wealth gap. |
Are you enjoying the Today, Explained newsletter? Forward it to a friend; they can sign up for it right here.
And as always, we want to know what you think. We recently changed the format of this newsletter. Any questions, comments, or ideas? We're all ears. Specifically: If there is a topic you want us to explain or a story you’re curious to learn more about, let us know by filling out this form or just replying to this email.
Today's edition was edited and produced by Caroline Houck. We'll see you tomorrow!