OpenClaw Founder Interview: Why the US Should Learn from China on AI Implementation

Peter Steinberger, founder of OpenClaw and now an employee of OpenAI, recently sat down for an interview with Bloomberg to discuss the divergent paths of AI adoption between the US and China, the future of AI Agents, and OpenClaw’s development. As a key figure in next-generation AI agent technology, Steinberger emphasized that AI’s value lies in practical application, and the US can learn from China’s proactive attitude toward AI implementation. OpenClaw, originally designed to automate daily tasks like flight check-in and schedule management, has become a window into the future of AI Agents—tools that can call upon other systems, collaborate, and act continuously.

The interview features Peter Steinberger, an Austrian software engineer and founder of OpenClaw, an open-source AI Agent tool once hailed by NVIDIA CEO Jensen Huang as "perhaps the most significant software release ever." Steinberger recently joined OpenAI to work on Codex, a programming-oriented AI tool with over 2 million weekly users, and the conversation focused on OpenClaw's impact, the gap in AI adoption between the US and China, and the future of AI Agents. The interview also touched on three industry updates: OpenAI has ceased support for Sora and ended its Disney partnership; Apple will overhaul Siri in iOS 27 with a new interface and "Ask Siri" button; and Amazon has acquired Fauna Robotics to enter the consumer humanoid robot market.

A stark contrast exists in how the US and China have embraced OpenClaw, reflecting different attitudes toward AI adoption. In China, OpenClaw has gained widespread popularity across all groups: students, professionals, and the elderly. Many companies even require employees to use the tool, with some tracking "what was automated today" for each staff member to drive efficiency. Despite regulatory restrictions on its use in state-owned enterprises and government agencies, China has become a large-scale experimental ground for AI Agents, allowing the technology to integrate into people's digital lives. In sharp contrast, OpenClaw has attracted attention among US developers and early adopters but has failed to gain mainstream traction. Many US companies restrict employees from using AI Agent tools due to security concerns, leading Steinberger to quip: "In the US, using OpenClaw may get you fired; in China, not using it may get you fired." Steinberger acknowledges that neither path is perfect, but argues that the US can learn from China's faster adoption and willingness to experiment across different risk preferences, which he sees as critical for understanding a technology as new as AI Agents.

While OpenClaw has gained massive popularity, Steinberger admits it carries potential security risks. Notably, he plans to keep OpenClaw open source and hand it over to a soon-to-be-established foundation, aiming to preserve its independence even as he works at OpenAI.

During the interview, Steinberger shared his views on the future of AI Agents, the key challenges facing the industry, and OpenAI's plans for advancing the technology. He envisions a future where everyone has a personal Agent for daily life and a work Agent for professional tasks; the core challenge is enabling seamless, secure communication between the two, ensuring privacy for personal data and security for company-internal information. Steinberger stresses that AI is too new to fully understand without hands-on use, and criticizes the mockery directed at a Meta security researcher who disclosed issues with Agent tools. Such disclosures should be encouraged rather than silenced, he argues, because trial and error is the only way to refine the technology.

When asked about OpenClaw's frenzy in China, he attributes the enthusiasm to the "dopamine feedback" of seeing AI automate tasks, even with an initial success rate of only 30%. For non-technologists, such as small business owners, the realization that AI can manage emails, schedules, and customer service is transformative, mirroring the enlightenment engineers experienced when first experimenting with AI Agents. At OpenAI, Steinberger is focused on integrating Codex, a programming tool, with OpenClaw. He argues that the line between "programming tools" and "non-programming tools" will eventually disappear, as AI Agents will use coding to compensate for their limitations. He also notes two key hurdles to AI Agent adoption: helping users understand that current AI tools (such as ChatGPT's ecosystem) can already connect to Slack, Google Docs, and other platforms; and the fact that hardening tools for real work data, which carries higher security demands, takes far longer than advancing an open-source project.

Steinberger is actively advancing the establishment of the OpenClaw Foundation, with the goal of keeping it independent from OpenAI. The foundation already has partners including NVIDIA, and is in communication with Microsoft, ByteDance, and Tencent. Steinberger hopes to maintain “Swiss-like neutrality” for the foundation, whose core mission is to get more people interested in AI and prepare society for its transformative impact.

Peter Steinberger’s interview highlights the critical role of practical adoption in advancing AI Agents. The divergent paths of the US and China—one cautious about security risks, the other proactive in experimentation—offer valuable lessons for both. Steinberger emphasizes that trial-and-error is the only way to understand and refine AI technology, and the OpenClaw Foundation aims to drive broader AI literacy. As AI Agents become more integrated into daily life and work, the willingness to adapt and experiment will be key to unlocking their full potential—something the US can learn from China’s approach.
