Vercel has just launched something that could significantly change the way we build backends. Workflows officially arrived to solve a problem developers hit head-on: when moving from prototype to production, we end up spending entire weeks setting up orchestration infrastructure instead of actually improving the product.
The cool thing is that now you just add a "use workflow" directive at the top of a TypeScript function and "use step" in the child functions. Done. The framework takes care of the rest (queueing, retries, state persistence, observability), all integrated into the application code. No separate orchestration services, message queues, or state databases. Vercel truly simplified it.
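To make the directive model concrete, here is a minimal sketch. Only the "use workflow" and "use step" directives come from the article; the function names and the order-processing scenario are illustrative, not taken from Vercel's docs.

```typescript
// A durable workflow: the directive marks the whole function for the
// framework, which then handles queueing, retries, and persistence.
export async function processOrder(orderId: string) {
  "use workflow";
  const order = await fetchOrder(orderId);     // each step's result is checkpointed,
  const receipt = await chargeCustomer(order); // so a crash resumes here, not from zero
  return receipt;
}

// Child functions marked "use step" become the retryable units of work.
async function fetchOrder(orderId: string) {
  "use step";
  return { id: orderId, amount: 42 }; // placeholder for a real lookup
}

async function chargeCustomer(order: { id: string; amount: number }) {
  "use step";
  return `charged ${order.amount} for order ${order.id}`; // placeholder for a real charge
}
```

Note that the directives are plain string-literal statements (like "use strict"), so the code stays ordinary TypeScript; outside the framework the functions simply run as written.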
Since entering public testing in October 2025, Workflows has already processed over 100 million executions and 500 million steps, serving more than 1,500 clients with 200,000 downloads weekly on npm. The numbers speak for themselves.
For those working with AI agents, Vercel brought some very interesting capabilities. Durable flows persist the agent's output, so a run keeps going even after you close the browser and resumes from where it left off. Everything leaving the environment is automatically encrypted. And the best part: executions can be suspended and resumed, so you can wait for external events or sleep for days or months without incurring any compute charges in the meantime.
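A sketch of what suspension looks like in practice. The `sleep` helper below is a local stand-in defined with `setTimeout` so the example runs on its own; the framework's actual suspension primitive (and its name) is an assumption here, and in a real workflow the wait would consume no compute.

```typescript
// Local stand-in for the framework's suspension primitive (assumed API).
const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

export async function reviewFlow() {
  "use workflow";
  await requestApproval(); // step runs once; its result is persisted
  await sleep(10);         // the workflow suspends here (days or months in production)
  return await finalize(); // resumes from the checkpoint, not from scratch
}

// Illustrative step functions, not from Vercel's docs.
async function requestApproval() {
  "use step";
  return "requested";
}

async function finalize() {
  "use step";
  return "approved";
}
```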
The limits are generous too: up to 50 MB per step and 2 GB per execution, leaving room to pass images and video through multimodal agents. The AI SDK v7 was released alongside, integrating all of this with tool calls and state management, and a public-beta Python SDK extends the model to the Python ecosystem.
The most interesting part is that the Workflow SDK is open source and supports self-hosting through the "Worlds" adapter system. The community is already building adapters for MongoDB, Redis, Cloudflare, and more. Vercel is keeping the door open.
The next version promises native concurrency control, global infrastructure, and snapshot-based runtime to reduce reprocessing costs. If you’re building something with heavy backend or AI agents, it’s worth keeping an eye on what Vercel is doing here.