Companies Step Up Investment Across the "Token Economy" Industrial Chain


Data show that in March this year, China's daily average number of Token calls exceeded 1.4 trillion, more than 1,000 times the level of two years earlier. The surge in Token calls not only confirms that AI application scenarios are continually deepening, but also creates new opportunities: companies are accelerating their deployment along the "Token economy" industrial chain, focusing on high-performance compute supply, Token operation services, high-quality datasets, and more.

So-called Tokens are the smallest units of information processed by large models. In everyday work and life, every time you ask an AI a question or use AI to generate content, you are consuming Tokens. In particular, since the start of this year, the explosion of intelligent agents such as "lobsters" has driven a marked increase in Token consumption.
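As a rough illustration of what "the smallest information unit" means, the toy splitter below breaks a request into word-like pieces that a provider would then meter. This is only a sketch: real large models use learned subword schemes such as BPE or SentencePiece, not this naive rule.

```python
# Toy illustration of tokenization. Real models use learned subword
# vocabularies (BPE, SentencePiece); this only conveys the idea that
# one request is billed as a count of small units.
import re

def toy_tokenize(text: str) -> list[str]:
    # Split into word runs, keeping punctuation as separate tokens.
    return re.findall(r"\w+|[^\w\s]", text)

tokens = toy_tokenize("How do Tokens get billed?")
print(tokens)       # ['How', 'do', 'Tokens', 'get', 'billed', '?']
print(len(tokens))  # 6 -> this request would be metered as 6 tokens
```

A real tokenizer would typically split rarer words into several subword pieces, so token counts usually exceed word counts.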

"Since the end of January this year, some model companies have set new performance records: in just 20 days, their revenue surpassed their total for all of 2025. Behind these numbers, a new business logic built on per-Token billing is taking shape at an accelerating pace," Liu Ruilong, director of the National Data Administration, said recently at the 2026 Annual Meeting of the China Development Forum. He noted that around the calling, distribution, and settlement of Tokens, a new value system is rapidly forming and becoming an important path for monetizing the AI industry.
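The per-Token billing logic described above can be sketched in a few lines. The prices below are invented for illustration only; real vendors publish their own rates, usually quoted per million input and output tokens.

```python
# Sketch of per-Token billing. Prices are assumptions for illustration,
# not any vendor's actual rates.
PRICE_PER_MILLION_INPUT = 0.5   # currency units per 1M prompt tokens (assumed)
PRICE_PER_MILLION_OUTPUT = 2.0  # currency units per 1M generated tokens (assumed)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    # Output tokens are typically priced higher, since generation is
    # more expensive to serve than prompt processing.
    return (input_tokens / 1e6) * PRICE_PER_MILLION_INPUT \
         + (output_tokens / 1e6) * PRICE_PER_MILLION_OUTPUT

# A single chat turn: 1,200 prompt tokens in, 800 generated tokens out.
print(round(request_cost(1_200, 800), 6))  # 0.0022
```

At these assumed rates, a single turn costs a fraction of a cent; the revenue figures quoted above come from aggregating trillions of such calls.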

Behind the surge in Token calls lies the consumption of compute capacity, the running of algorithms, and the spending on electricity, all of which place higher demands on compute infrastructure. Recently, many companies have accelerated innovative deployments of high-performance compute; among these, architectural innovations represented by supernodes are an important path to improving compute efficiency.

For example, Inspur released its first cable-free, box-style supernode, the scaleX40; in typical inference scenarios, at a comparable card count, inference throughput improves by more than 4x, significantly raising the Token output per unit of compute. ZTE has also introduced supernode technology: by restructuring the compute interconnect system, it integrates dozens to hundreds of GPUs from multiple vendors into a single logical computing unit, achieving system-level optimization of compute capacity.

In the view of industry insiders, core innovation in the compute industry will center on reducing the effective cost per Token; deep synergy between compute and applications, full-stack software-and-hardware optimization, and ecosystem collaboration across the industrial chain will be the focus. Tu Jiashun, ZTE's chief strategy and ecosystem expert, said that the deployment and adoption of supernode technology will drive intelligent-computing infrastructure toward greater efficiency, greener operation, and more openness.

"The upsurge in Tokens places higher demands on compute density, memory-access bandwidth, and communication efficiency. This is prompting a shift from chasing single-card peak performance toward systematic, coordinated optimization of memory, bandwidth, and interconnect," Lu Feng, dean of the Beijing Frontier Future Technology Industry Development Research Institute, told reporters. "The 'Token economy' will push the underlying compute infrastructure toward higher energy efficiency."
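Why memory bandwidth, rather than raw peak compute, dominates Token output can be seen in a standard back-of-envelope estimate: in autoregressive decoding, each generated token must stream roughly the full model weights from memory once. The figures below are illustrative assumptions, not any specific accelerator or model.

```python
# Back-of-envelope for why decode throughput is memory-bandwidth-bound.
# All numbers here are illustrative assumptions.
def decode_tokens_per_second(hbm_bandwidth_gbs: float,
                             model_params_billions: float,
                             bytes_per_param: float = 2.0) -> float:
    # Each generated token streams (roughly) all weights from memory once,
    # so throughput ~= bandwidth / bytes read per token.
    bytes_per_token = model_params_billions * 1e9 * bytes_per_param
    return hbm_bandwidth_gbs * 1e9 / bytes_per_token

# A hypothetical card with 3 TB/s of HBM bandwidth serving a 70B-parameter
# model in 16-bit precision, for a single request without batching:
print(round(decode_tokens_per_second(3000, 70)))  # 21
```

Batching, quantization, and faster interconnects all raise effective tokens per second, which is exactly the "memory-bandwidth-interconnect" coordination the quote describes.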

Because they are measurable, priceable, and tradable, Tokens also serve as the "settlement unit" connecting technology supply with commercial demand. Recently, all three major telecom operators said they will further explore "Token operation services."

"In the intelligent age, building a new form of intelligent economy requires accelerating the transformation and upgrade from 'traffic operations' to 'Token operations,'" said Liu Guqing, general manager of China Telecom, noting that China Telecom's Token operations already have initial practice: one enterprise's private AI deployment on China Telecom's Xirang platform involved 73 custom-developed intelligent agents, driving annual consumption of 1.2 trillion Tokens.
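The figures quoted above imply a substantial per-agent workload. A quick arithmetic check, taking the reported numbers at face value:

```python
# Quick arithmetic on the quoted figures: 73 custom agents driving
# 1.2 trillion Tokens of annual consumption.
total_tokens_per_year = 1.2e12
agents = 73

per_agent_daily = total_tokens_per_year / agents / 365
print(round(per_agent_daily / 1e6, 1))  # 45.0 -> ~45 million tokens/agent/day
```

Roughly 45 million Tokens per agent per day suggests these agents run continuous, high-volume workloads rather than occasional interactive queries.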

In addition, China Unicom will accelerate building a compute-operations model of "intelligent agents + Tokens + AI cloud." China Mobile said that by integrating high-quality AI models it is building trusted inference services and connecting the full service chain in which "intelligent agents consume Tokens, and Tokens pull compute capacity," as the Token market opens up rapidly.

The large increase in daily Token calls also implies a substantial increase in dataset supply; enabling AI innovation through data elements has entered a stage of virtuous interaction. The reporter learned from the National Data Administration that, as of the end of 2025, more than 100,000 high-quality datasets had been built nationwide, totaling over 890 PB (petabytes, a unit of storage capacity), roughly 310 times the total digital resources of the National Library of China.
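A brief sanity check of these statistics as quoted; the library-size figure below is simply what the stated 310x multiple implies, not an independently sourced number.

```python
# Sanity arithmetic on the dataset figures: 890 PB across 100,000+ datasets,
# stated to be ~310x the National Library of China's digital holdings.
total_pb = 890
datasets = 100_000
library_multiple = 310

avg_dataset_tb = total_pb / datasets * 1000       # 1 PB = 1000 TB
implied_library_pb = total_pb / library_multiple  # what the 310x claim implies

print(round(avg_dataset_tb, 1))      # 8.9  -> average dataset ~8.9 TB
print(round(implied_library_pb, 2))  # 2.87 -> implied library size ~2.87 PB
```

An average dataset size in the single-digit terabytes is consistent with curated, domain-specific training corpora rather than raw web crawls.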

"Decomposing data into Tokens for AI processing and application provides solid support for model iteration and application deployment," Lu Feng said. As Tokens become standardized, priced, and tradable units of AI capability, value along the industrial chain will be redistributed to each optimization node across the full Token lifecycle. In the data-services segment in particular, high-quality Token supply will become a high-value resource and is expected to spawn independent commercial products such as prompt engineering, Token compression, and vertical Token libraries.

It is reported that the National Data Administration will next work with all parties to deeply implement a new round of high-quality dataset construction action plans. Guided by scenario needs, it will build "AI-Ready" high-quality datasets that are technically usable, convenient in practice, and quality-assured, improving both the quantity and the quality of high-quality dataset supply.
