Been following this story unfold over the past couple years and honestly, it's wild how a messaging app became ground zero for a digital sex crime crisis in Korea.
So basically, Telegram's been under massive pressure from South Korean authorities over deepfake pornography spreading on the platform. We're talking about sexually exploitative videos created using deepfake technology, mostly targeting minors and women. One Telegram group alone reportedly had 220,000 members sharing this stuff.
The Korea Communications Standards Commission finally got Telegram to remove 25 pieces of deepfake porn material after months of the company stonewalling regulatory requests. Telegram apologized for the "miscommunication" and set up a dedicated email hotline for reporting illegal content. Pretty telling that it took this much pressure to get basic compliance.
What really amplified everything was Pavel Durov's arrest in France last year on charges including complicity in child sexual abuse material distribution. The timing basically forced Telegram into the spotlight globally. Durov posted 5 million euros bail, but the legal drama put serious scrutiny on how the platform handles moderation.
The scale of the problem in Korea is genuinely alarming. Police data showed deepfake cases nearly doubled, from 156 in 2021 to 297 by mid-2023. University students, high school students, K-pop singers, and actresses have all been targeted with fake explicit videos. At least 500 schools were flagged as affected, and six out of ten deepfake crime victims over three years were children.
This actually traces back to 2020 when a guy named Cho Ju-bin ran something called the Nth Room on Telegram, basically a sex trafficking operation using blackmail. He got 40 years prison time, but it showed how vulnerable the platform was even then.
President Yoon has been pretty vocal about zero tolerance for this. His administration pushed hard on enforcement, and honestly, the law's already there: up to five years in prison or a roughly $37,500 fine for creating and distributing these deepfake videos in Korea.
The real question now is whether Telegram actually commits to meaningful change or if this is just PR damage control. Removing 25 videos and setting up an email address feels like a band-aid on a much bigger wound. The platform's history of dodging regulators doesn't exactly inspire confidence that they'll suddenly become proactive about policing this content.
It's become this intersection of privacy rights, platform accountability, and protecting vulnerable people from digital exploitation. Korea's basically become the testing ground for how governments push back on encrypted platforms that claim they can't moderate.