Recently, I revisited a noteworthy case concerning Telegram's handling of pornographic videos and sex crime content in South Korea. The incident reflects a fundamental dilemma faced by encrypted communication platforms.
The story begins with ongoing pressure from South Korean authorities on Telegram. The Korea Communications Standards Commission (KCSC) ultimately forced Telegram to delete 25 videos containing sexual exploitation content, mainly created with deepfake technology and targeting minors and women. Twenty-five videos may sound trivial, but the underlying problem is far more serious: one Telegram group dedicated to sharing such content reportedly has 220,000 members, most of them minors.

Data from the South Korean police is even more shocking. Over the past three years, 60% of the deepfake-related criminal cases they investigated involved minors. From 2021 to mid-2023, the number of deepfake cases nearly doubled, surging from 156 to 297. Students from at least 500 schools have become victims. This is not just a problem of illicit videos in South Korea; it is a crisis facing society as a whole.

Even more dramatic, Telegram CEO Pavel Durov was arrested by French authorities in August 2024. He was charged with multiple crimes, including complicity in the distribution of child sexual abuse material and refusal to cooperate with police investigations. Although he was released after posting 5 million euros in bail, the incident thrust Telegram into the spotlight of public opinion. Telegram tried to defend itself, arguing that a platform and its owner should not be held responsible for user abuse, but the argument convinced few, least of all in South Korea.

Interestingly, this isn't Telegram's first scandal of this kind. In 2020, the "Nth Room" case came to light: a young man named Cho Ju-bin operated sex-slavery chatrooms via Telegram, extorting at least 103 women, 26 of whom were minors. Cho was eventually sentenced to 40 years in prison, but the scandal had already tarnished Telegram's reputation.

The South Korean government's stance is very firm. President Yoon Suk-yeol has repeatedly emphasized a zero-tolerance policy toward digital sex crimes, especially those involving minors. Under the Sexual Violence Prevention and Victim Protection Act, creating and distributing explicit deepfake videos is punishable by up to five years in prison or a fine of 50 million won (about US$37,500).

Now, Telegram seems to have realized the seriousness of the problem. In late August 2024, it issued a rare apology, acknowledging communication issues with the KCSC and expressing hope to build trust with the South Korean government. To show good faith, Telegram provided a dedicated hotline email address for reporting illegal content. KCSC officials said they plan to use this channel to strengthen cooperation with Telegram and address the circulation of deepfake sexual exploitation material.

Honestly, it's hard to say how effective any of this will be in practice. Telegram's history of non-cooperation is well known, and Durov still faces legal proceedings. Deleting 25 videos is a drop in the ocean compared to the vast amount of illegal content out there. Solving the problem will require not only platform commitments but also real investment in technology and sustained law enforcement effort. The battle has only just begun, and there is a long road ahead.