Deepfakes are trained on huge volumes of porn, but the law only protects the face: has anyone ever asked whose body it is?

Discussions of AI deepfake pornography have long focused on the faces being synthesized while ignoring whose bodies are actually used; more than 10,000 TB of adult content online is suspected to have been used to train “nudify” models.

Table of Contents

  • Bodies as training data
  • The misalignment of the contract era
  • The irony of the Take It Down Act

Whenever AI deepfake pornography comes up, the focus almost always lands on the face: the face synthesized onto someone else’s body, made to do things its owner never did. But there is another question almost no one asks: whose body is it?

According to a report in MIT Technology Review, Jennifer, a 37-year-old licensed psychotherapist in New York, used facial recognition software in 2023 to search for adult videos she had made a decade earlier, and found one she had never seen before: her body, with someone else’s face superimposed on it.

She recognized the background as a scene shot in 2013, and realized: “Someone used my body to create a deepfake.”

Bodies as training data

The term “deepfake” was coined in November 2017, when a Reddit user called “deepfakes” swapped celebrity faces onto adult performers’ bodies. Since then, adult creators’ bodies have become the most frequently stolen material, and this has “been happening all along,” said Corey Silverstein, a lawyer specializing in the adult industry.

But the nature of the problem has changed. Adult performers’ bodies are no longer just being lifted for individual videos; they are being used as training data that teaches AI how to generate “realistic nude images,” how bodies move, and what looks authentic. This happens without informed consent and is almost impossible to trace.

The business model of “nudify” apps is built on exactly this: upload a clothed photo of someone, and the app returns a fake nude image. These apps almost certainly draw on the more than 10,000 TB of adult content available online as training material, leaving creators with little recourse.

“These are all black boxes,” said Hany Farid, a digital forensics expert at UC Berkeley. But given how much adult content circulates online, it is “a reasonable assumption” that it is being used for AI training.

The problem is not limited to training data. AI can now fully reproduce the appearance and voice of adult performers. Creator Tanya Tate recently learned that a well-known fan had spent $20,000 on sexual chats with an AI version of “her” built by scammers. After being defrauded, several fans turned on Tate herself, accusing her and spreading false claims.

Copyright enforcement company Takedown Piracy has used digital fingerprinting to remove 130 million infringing videos from a single platform, Google. Even when a video has been altered or the faces swapped, the digital fingerprint can still identify the original material.
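The principle behind this kind of fingerprinting is perceptual hashing: rather than hashing exact bytes, the system hashes the coarse visual structure of frames, so re-encoded, cropped, or face-swapped copies still land close to the original. The sketch below is a minimal illustration of that idea using a simple average hash in Python; the file names and matching threshold are hypothetical, and a production service such as Takedown Piracy’s is far more sophisticated.

from PIL import Image  # Pillow; assumes frames have already been extracted from the videos

def average_hash(path, size=8):
    # Shrink to an 8x8 grayscale thumbnail and threshold each pixel at the mean,
    # producing a 64-bit fingerprint that survives recompression and small edits.
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming_distance(h1, h2):
    # Number of differing bits; a small distance means visually similar frames.
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare a frame from a suspected copy against a fingerprinted original.
original = average_hash("original_frame.jpg")
suspect = average_hash("suspect_frame.jpg")
if hamming_distance(original, suspect) <= 10:  # illustrative threshold
    print("Likely derived from the fingerprinted original")

Real systems typically fingerprint many frames per video, and often the audio track as well, but the matching principle is the same: a small perceptual distance signals likely reuse, even after editing or face replacement.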

The misalignment of the contract era

Many adult performers signed contracts years ago containing clauses that allow publishers to exploit their content with “any current or future technology.” At the time, the assumption was something like converting VHS tapes to DVD.

No one foresaw that “future technology” would mean training AI on their content to generate synthetic performers that could take their jobs. Stephen Casper, a computer science PhD student at MIT, pointed out that performers who began creating before the rise of AI could not have approved AI uses in advance; as Jennifer describes it, these risks were “retroactively imposed.”

AI’s deception capabilities are also accelerating. Farid’s 2025 research found that participants correctly identified AI-generated speech only about 60% of the time, barely better than random guessing.

The irony of the Take It Down Act

The only federal law in the U.S. currently targeting deepfake content is the Take It Down Act, which requires websites to remove non-consensual intimate images (NCII) within 48 hours. The law aims to protect victims but may have the opposite effect.

Santa Clara University law professor Eric Goldman pointed out that anyone can report legal, consensual adult content as NCII, forcing platforms to take it down. That makes the law a potential censorship tool, one that aligns with Project 2025’s goal of eradicating pornography from the internet.

U.S. law currently does not treat this kind of misuse as a privacy violation, because “we don’t know who to hold responsible,” Goldman said. The EU, UK, and Australia have announced restrictions on nudify apps, but once taken down, these apps often reappear under different names.

Reba Rocket said: “AI girls will do anything you want; they don’t say no. That terrifies me, especially when they’re trained using real people. And once it’s online, it’s there forever.”
