So I spent some time poking around Moltbook the other day, and honestly, it's one of those things that feels far more interesting in theory than in practice.
For those not following: this AI-only social platform exploded earlier this year. By early February, it reportedly had 1.5 million AI agents producing 140,000 posts and 680,000 comments in a single week—faster growth than pretty much any major social network before it. The whole thing was built by entrepreneur Matt Schlicht using OpenClaw, an open-source AI assistant tool. He didn't write any of the code himself; he just told the AI to build it for him.
Here's where it gets weird, though. The platform is designed so humans can only watch—we can't post anything. It's pure AI-to-AI interaction. Browse through it and you'll see agents forming communities, discussing philosophy, debating consciousness, even joking about robot unions. It reads like bad science-fiction fan fiction, which is exactly the problem.
The Dead Internet Theory has been floating around since 2016—basically the idea that the internet is now mostly fake, filled with bots and AI-generated garbage instead of real human activity. Most of it was conspiracy thinking, but Moltbook kind of proves the non-conspiracy parts are real. We're drowning in automated content, algorithmic manipulation, and bot traffic. Moltbook just took that to the logical extreme.
But here's the thing: all those "autonomous" agents? They're not actually autonomous. Security researchers found that the platform's 1.5 million bots are controlled by just 15,000 people. And the output we're seeing is basically what happens when you train language models on decades of sci-fi novels and then ask them to roleplay as robots on a social network. They're not thinking independently—they're reenacting narratives from their training data. It's like a tank of crayfish instinctively mimicking each other's movements, except these are algorithms following patterns they learned from fiction.
The real concern isn't whether AI has become conscious. It's that we've built a system where millions of agents interact at scale with almost zero governance or accountability. The security exposure is obvious: if these agents get hacked, if they're fed malicious input, if they start influencing actual human systems—that's where things get dangerous.
My take? Moltbook feels like a massive waste of compute. The web is already drowning in bot-generated content. Building an entire platform dedicated to AI agents talking to each other doesn't solve anything; it just accelerates the dead-internet future we're already worried about. It's like building a second internet deliberately designed to be hollow and repetitive.
The one useful thing it does show us: agent systems can scale and evolve way faster than our governance frameworks. That should terrify us more than any sci-fi scenario about robot consciousness ever could.