Been following this Telegram situation in South Korea and it's honestly pretty alarming. The platform's been facing massive pressure from authorities over deepfake sexual exploitation materials circulating on its channels, and things just escalated significantly.
So here's what went down: The Korea Communications Standards Commission finally got Telegram to remove 25 videos of sexual abuse, most of them created using deepfake technology targeting minors and women. This came after the company had largely ignored government requests for years. According to reports, one Telegram group alone had around 220,000 members actively sharing these materials, with a significant focus on minors. We're talking about at least 500 schools across Korea being affected by these crimes.
The scale is honestly disturbing. Police data shows deepfake cases nearly doubled from 156 in 2021 to 297 by mid-2023. And get this: six out of ten victims investigated by police over the past three years were children. The Korea National Police Agency launched a preliminary investigation after discovering criminals were using Telegram to create and distribute these videos. As of late August, they'd received 88 reports and identified 24 suspects.
What made things worse for Telegram was the arrest of its CEO Pavel Durov in France on August 24. French authorities charged him with complicity in distributing child sexual abuse material and failing to cooperate with police. Despite posting €5 million bail, the arrest threw Telegram into the center of a global content moderation debate. The company tried defending itself, saying "it's absurd to claim a platform or its owner are responsible for abuse of that platform," but that didn't really land with critics, especially in Korea.
This isn't even Telegram's first rodeo with these issues. Back in 2020, a 20-year-old named Cho Ju-bin was at the center of the "Nth Room" case on Telegram, running chatrooms where he blackmailed over 100 women, including minors, into producing violent sexual imagery. He got 40 years in prison, but the scandal exposed just how vulnerable Telegram's infrastructure is when it comes to preventing the distribution of illegal content.
President Yoon Suk Yeol has been pretty vocal about this, calling for a zero-tolerance approach to digital sex crimes. He said it plainly: "It's an exploitation of technology while relying on the protection of anonymity. It's a clear criminal act." Under South Korea's Sexual Violence Prevention Act, creating and distributing sexually explicit deepfakes can get you up to five years in prison or around $37,500 in fines.
After all this pressure, Telegram finally issued an apology on August 29, acknowledging miscommunication with Korean authorities and committing to better cooperation. It has agreed to set up a dedicated hotline and email channel for reporting illegal content. Whether this actually leads to meaningful change, though? That's the real question. Given Telegram's track record of non-compliance and the ongoing legal challenges facing Durov, a lot of people remain skeptical about whether the platform can be trusted to crack down on these crimes. The removal of 25 videos is a start, but it's really just scratching the surface of what's become a national crisis.