Recently, I noticed a significant legal development: the Minnesota State Legislature in the United States has passed a bill banning AI-generated fake nude images, and Governor Tim Walz is expected to sign it into law. The bill reflects an increasingly serious problem.



In simple terms, the bill bars any website or application from offering users AI tools that can generate realistic nude images, including so-called "disrobing" tools. Not only is operating such a service prohibited; advertising and promoting these platforms is banned as well. The enforcement scope is quite broad.

More importantly, the penalty provisions have real teeth. Victims can sue the companies operating these platforms directly, recover compensation for actual losses, and seek up to treble damages plus punitive damages. The state attorney general can also bring enforcement actions, with fines of up to $500,000 per violation. Those fines fund services for victims of sexual assault, domestic violence, and child abuse. I find this design notable: it ties the penalties directly to victim protection.

If the bill is signed into law, it will take effect on August 1 and will apply only to conduct occurring after that date; it is not retroactive. This gives platforms a window to adjust.

From my perspective, this reflects the growing attention U.S. states are paying to AI-generated fake nude images. After all, the issue touches personal privacy and reputation, and can escalate into harassment and sexual violence. Minnesota has moved early, and other states may follow with similar legislation.