Vercel has just launched something that could significantly change how we build backends. Workflows officially arrived to solve a problem developers know well: when moving from prototype to production, we end up spending entire weeks building orchestration infrastructure instead of actually improving the product.



The cool thing is that now you just need to add "use workflow" at the top of a TypeScript function and "use step" in its child functions. Done. The framework handles the rest: queuing, retries, state persistence, observability, all integrated into your application code. No separate orchestration service, message queue, or state database. Vercel truly simplified it.
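A minimal sketch of what that looks like. The function names and logic below are illustrative; only the "use workflow" and "use step" directives come from the announcement:

```typescript
// Hypothetical signup flow: the directives mark durability boundaries.
export async function processSignup(email: string) {
  "use workflow"; // marks the whole function as a durable workflow

  const userId = await createUser(email);
  const sent = await sendWelcomeEmail(userId);
  return { userId, sent };
}

async function createUser(email: string) {
  "use step"; // each step is retried and checkpointed independently
  return `user_${email.split("@")[0]}`;
}

async function sendWelcomeEmail(userId: string) {
  "use step";
  console.log(`welcome email queued for ${userId}`);
  return true;
}
```

Because the directives are plain string literals, the code stays ordinary TypeScript; the framework's compiler pass is what turns each step into a queued, retryable unit.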

Since entering public beta in October 2025, Workflows has already processed over 100 million executions and 500 million steps, serving more than 1,500 customers, with 200,000 weekly downloads on npm. The numbers speak for themselves.

For those working with AI agents, Vercel brought some very interesting capabilities. Executions are durable: an agent keeps running even after you close the browser and resumes from where it left off. Everything that leaves the execution environment is encrypted automatically. And the best part: executions can suspend and resume, so you can wait for external events or sleep for days or even months without paying for compute in the meantime.
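The suspend-and-resume idea can be sketched like this. Note that `sleep` here is a local stand-in so the snippet runs anywhere; in a real workflow the framework provides a durable timer that suspends the execution at zero compute cost (the exact API is an assumption, not taken from the announcement):

```typescript
// Stand-in for the framework's durable sleep; the real runtime would
// suspend the workflow here instead of holding a process open.
async function sleep(ms: number): Promise<void> {
  "use step";
  return new Promise((resolve) => setTimeout(resolve, ms));
}

export async function trialReminder(userId: string) {
  "use workflow";
  await notify(userId, "trial started");
  await sleep(10); // stand-in delay; a durable workflow could sleep for days
  await notify(userId, "trial ending soon");
  return `reminders sent to ${userId}`;
}

async function notify(userId: string, message: string) {
  "use step"; // each notification is checkpointed, so a resume never re-sends
  console.log(`${userId}: ${message}`);
}
```

The key point is that completed steps are persisted, so when an execution wakes up after a long suspension it replays from its checkpoint rather than re-running earlier steps.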

The limits are generous too: up to 50 MB per step and 2 GB per execution, enough room to move images and video through multimodal agents. The AI SDK v7 was released alongside, integrating all of this with tool calls and state management, and a public beta Python SDK extends the model to the Python ecosystem.

The most interesting part is that the Workflow SDK is open-source and supports self-hosting through the "Worlds" adapter system. The community is already developing adapters for MongoDB, Redis, Cloudflare, and more. Vercel is keeping the door open.

The next version promises native concurrency control, global infrastructure, and snapshot-based runtime to reduce reprocessing costs. If you’re building something with heavy backend or AI agents, it’s worth keeping an eye on what Vercel is doing here.