Zhipu Zhang Peng: When the model is strong enough, the API itself is the best business model
How does the API model become the core engine of large-model commercialization?
Daily Economic News reporter: Ke Yang | Editor: Liao Dan
2025 is widely regarded as a key year for large models to move from technological contests to real commercial deployment.
On March 31, Zhipu (HK02513; share price 693.5 HKD; market cap 309.2 billion HKD) released its first set of post-listing results. Total annual revenue was 724 million yuan, up 131.9% year over year, and ARR (annual recurring revenue) on its core MaaS (Model as a Service) platform was about 1.7 billion yuan, a 60-fold increase within 12 months.
That evening, during a media phone call, Zhipu CEO Zhang Peng said that in the era of large models, the essence of commercial value can be summed up as “the intelligence upper bound multiplied by the Token consumption scale,” and the API (application programming interface) model is precisely the optimal path for converting intelligence into a tradable productive input.
APIs Become the Main Business Path
At the earnings release, Zhang Peng emphasized that improving the intelligence upper bound is the only “first principle” in the AGI (artificial general intelligence) era of large models.
Zhang Peng simplified the business logic of the AGI era into a formula: the intelligence upper bound multiplied by the Token consumption scale. The former determines pricing power, while the latter determines the size of value.
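Zhang Peng's formula can be sketched as a toy calculation. The function name, sample prices, and token volumes below are illustrative assumptions for exposition, not Zhipu data or pricing:

```python
# Toy illustration of the stated formula: commercial value scales with the
# intelligence upper bound (expressed here as per-token pricing power)
# multiplied by the Token consumption scale (volume).
# All names and numbers are hypothetical, not Zhipu figures.

def commercial_value(price_per_million_tokens: float, tokens_consumed: float) -> float:
    """Revenue = unit pricing power x consumption scale, in the same currency unit."""
    return price_per_million_tokens * tokens_consumed / 1_000_000

# A stronger model commands a higher per-token price (pricing power),
# while deep enterprise usage drives the consumption scale; improving
# both factors compounds multiplicatively rather than additively.
weaker_model = commercial_value(price_per_million_tokens=0.5, tokens_consumed=2e9)
stronger_model = commercial_value(price_per_million_tokens=5.0, tokens_consumed=2e10)
```

The point of the sketch is the multiplicative structure: a 10x gain on each factor yields a 100x difference in the product, which is why Zhang Peng treats pricing power and consumption scale as two separate levers.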
Behind this judgment is a process in which the business model has gradually become clearer over the past year. Zhang Peng said that APIs—together with the rise of Token economics—fundamentally represent a way to transform AI infrastructure capabilities into resources for economic operation, rather than a one-off windfall.
In Zhang Peng’s interpretation, behind this is a global paradigm shift: companies represented by Anthropic, by delivering the strongest models via APIs to enterprises and developers, enable intelligence to participate in creating economic value.
The same business logic is being realized at Zhipu as well. Zhang Peng said, “When the model is strong enough, APIs themselves are the best business model. The quality of intelligence creates pricing power, and deep usage by enterprises and users creates growth at scale.”
“AI capabilities moving from being usable and playable to real production—solving increasingly complex problems—means that Token consumption and API calls can truly be converted into economic value,” Zhang Peng said. In the long run, he added, the essence of pricing is determined by value; resources that can effectively replace human labor and improve conversion efficiency and intelligence levels are scarce and valuable.
When asked whether growth is sustainable—an issue the market cares about—Zhang Peng gave a clear view: “This is the beginning of a structural long-term trend.” He said that for a long time, the industry has been searching for a very simple, economical, and powerful business model so that this growth trend can quickly accelerate.
He further pointed out that with the emergence of new application forms such as OpenClaw and the development of native intelligence on the device side, it is expected that Token consumption will continue to be amplified, showing exponential growth.
“With the intelligence upper bound as the moat and APIs as the main product form, this is the business path that both Anthropic and Zhipu are monetizing,” Zhang Peng said. He added that Zhipu is now among the domestic vendors with the highest paid Token consumption. More importantly, breakthroughs in the intelligence upper bound are driving exponential growth in Token consumption.
Agents Will Bring a New Paradigm to the Software Industry
Zhang Peng believes that in 2026, the intelligence paradigm will evolve from lightweight vibe coding to industrial-grade agentic engineering, and ultimately into digital engineers capable of autonomous planning, environmental awareness, and self-iteration. This will bring further breakthroughs in the intelligence upper bound and a second round of exponential growth in Token consumption.
During this process, the form of the software industry will also change. When asked whether Agents will replace software companies, Zhang Peng did not give a direct conclusion, but stressed that “a brand-new paradigm will definitely emerge in the software industry.” Some traditional software companies may be displaced by the new paradigm; that is a normal pattern of succession.
By contrast, he is more focused on the role of providing underlying capabilities. Zhang Peng said that as a foundation-model provider, Zhipu’s positioning is to continuously supply intelligent capabilities and lead innovation in new paradigms.
In its financial report, Zhipu introduced two new concepts: TAC (Token Architecture Capability) and LLM-OS (large language model operating system). Zhang Peng emphasized that these two concepts are not merely visions, but a summary and distillation of existing trends.
“In the future, the standard for measuring the value of an individual or organization will no longer be how much information it possesses, but instead, as a Token architect, its ability to build complex Agent systems within a given budget and drive the large model to complete complex tasks through closed-loop execution,” Zhang Peng explained. He said that TAC equals the number of intelligent calls multiplied by the quality of intelligence, multiplied again by economic conversion efficiency. Zhipu’s goal is to become the infrastructure for enhancing society-wide TAC, so that every Token can be converted into deliverable economic incremental value.
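The TAC definition as Zhang Peng states it is a product of three factors. The sketch below encodes that definition directly; the parameter names, scales, and sample values are illustrative assumptions, since the report does not specify units:

```python
# Sketch of the stated definition: TAC = number of intelligent calls
# x quality of intelligence x economic conversion efficiency.
# Units and sample values are hypothetical; the source defines no scales.

def tac(intelligent_calls: int, intelligence_quality: float, conversion_efficiency: float) -> float:
    """Token Architecture Capability as a product of three factors."""
    return intelligent_calls * intelligence_quality * conversion_efficiency

# Under a fixed call budget (the "given budget" in the quote), raising
# quality or conversion efficiency lifts TAC multiplicatively, which is
# why the definition rewards architecture skill, not raw call volume.
baseline = tac(intelligent_calls=1_000_000, intelligence_quality=0.6, conversion_efficiency=0.5)
improved = tac(intelligent_calls=1_000_000, intelligence_quality=0.9, conversion_efficiency=0.8)
```

Read this way, a “Token architect” is someone who holds `intelligent_calls` constant (the budget) and competes on the other two factors.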
As for the concept of LLM-OS, it points to who defines the next-generation computing platform. Zhang Peng believes that a traditional operating system is a scheduler of hardware resources, whereas a large-model operating system is a scheduler of intelligence. “Large models are consuming software. In the future, computing platforms will no longer be stacks of Apps, but the coordinated ecosystem of an API marketplace and an Agent matrix. Whoever’s model enters the system kernel will hold the defining power over next-generation computing.”
In an interview, Zhang Peng also addressed the balance between open source and commercialization. When asked, “Will open source affect API commercialization capabilities?” he said that the essence of open source is to drive technological innovation, give back to the community, and build brands and standards, with the goal of attracting more developers to join the GLM ecosystem. From a commercialization perspective, as models become more complex, the cost-effectiveness of building and deploying models in-house is declining for enterprises: “because model iteration is too fast; sometimes in-house builds can’t keep up with our 3-to-4-month iteration cycle. Many customers who originally tried private deployment are gradually switching, or partially switching, to using cloud-based large model APIs.”
Regarding market concerns that “large model companies will be replaced by giants,” Zhang Peng believes independent large model vendors have a natural advantage during cycles of rapid technological iteration. “Big companies may not be able to achieve sufficient competitiveness in all scenarios, which is determined by resource constraints. They themselves are also complex ecosystems; they won’t completely rely on themselves in all scenarios, and they also need to connect with excellent suppliers to ensure they don’t lose the lead at any point in time.”