NVIDIA's Market Share Drops Significantly, Where Are the Opportunities in the New Stage of the AI Revolution?


This is the ninth article in the AI Investment Research 100 Series.
In the previous articles, we looked at Intel, AMD, and ARM. All three stocks have posted substantial gains over the past year: AMD doubled, Intel tripled, and ARM hit an all-time high. After such rises, a simple question follows:
Can these stocks that have already run up still be held? And are there still opportunities among those that haven't risen yet?
To answer this question, one cannot avoid a core term: inference. As the stocks of the companies discussed earlier rose, analyses featured this word again and again.
So: how big is the inference track? What stage is it at? Which companies will benefit, and how? Which are already priced in by the market, and which are not?
This article runs about 15,000 words. The content is dense but readable; consider bookmarking it before you read.
1. How Big Is the Track?
Model training is "writing the program"; inference is "running that program every day." After GPT was trained, hundreds of millions of people ask it questions daily, and every interaction consumes inference compute. When Claude Code runs a task, the agent may loop through 100 rounds on its own, and each round is an inference call.
Multiple industry studies and media reports point in the same direction: once models enter production, inference becomes the dominant component of lifecycle compute cost, with estimates ranging from 80% to 90%. In other words, in the AI era, 8 or 9 out of every 10 dollars on the compute bill will be spent on inference.
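The 80-90% split can be made concrete with a toy lifecycle cost model. All numbers below (training cost, daily serving cost, production lifetime) are illustrative assumptions for the sketch, not figures from this article:

```python
# Toy model of lifecycle compute cost: one-time training vs ongoing inference.
# Every figure here is an illustrative assumption, not real data.

TRAINING_COST = 100_000_000      # one-time training run, USD (assumed)
DAILY_INFERENCE_COST = 500_000   # serving cost per day, USD (assumed)
LIFETIME_DAYS = 3 * 365          # model served in production ~3 years (assumed)

inference_total = DAILY_INFERENCE_COST * LIFETIME_DAYS
lifecycle_total = TRAINING_COST + inference_total
inference_share = inference_total / lifecycle_total

print(f"Inference share of lifecycle cost: {inference_share:.0%}")
```

With these assumed inputs, inference lands around 85% of lifecycle cost, squarely in the 80-90% range cited above. The structural point is that any recurring daily cost eventually dwarfs a one-time training cost.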
However, over the past three years, market discussion has focused almost entirely on training, because training is the "sexier" story: who has more H100s, bigger parameter counts, or trains the next-generation model first. Inference has been treated as a side task that comes after training.
This cognitive bias is now being reversed, and that is the fundamental reason the semiconductor companies in this group have been revalued over the past year.
So inference is a big track, but how big exactly? It can be measured from five specific angles.
First, the number of users. ChatGPT has 900 million weekly active users and 50 million paying users. The comparison on the Chinese side is more direct: daily token call volume grew from 100 billion at the beginning of 2024 to 140 trillion in 2026, a 1,400-fold increase. This market is still far from saturation.
Second, usage intensity. OpenAI processed 6 billion tokens per minute in October 2025; by April 2026 the figure had reached 15 billion, a 2.5x increase in half a year. Enterprise revenue accounts for over 40% of the total, and enterprise users' usage intensity is dozens of times that of consumers.
Third, conversation length. Context windows have grown from a few hundred tokens in the early days to the point where DeepSeek's API documentation lists V4 Pro / Flash context lengths of 1 million tokens, with a maximum output of 384k. Longer contexts mean higher memory and compute consumption per inference call.
Fourth, the rising compute cost of the models themselves. Reasoning models such as OpenAI's o1, DeepSeek R1, and Claude with extended thinking internally "think" through thousands or even tens of thousands of tokens before answering. Jensen Huang, citing DeepSeek R1 as an example, has noted that reasoning models can require far higher compute loads, up to hundreds of times more.
In the past, you asked the AI a question and got an answer directly; now, you ask it a hard problem and it thinks for half a minute before answering. That "half-minute of thinking" is the extra compute consumption.
Fifth, agents. A single agent task typically calls the model 10-100 times. OpenAI Codex's weekly active users have already surpassed 4 million (as of April 22, 2026), and that is just one product from one company. One industry insider estimates that the overall compute consumption of AI agents could exceed 10 times that of large language models of similar parameter scale.
Multiplying these five factors together, total inference demand will expand by orders of magnitude within three to five years. This is not an exaggerated narrative but a judgment increasingly aligned with the mainstream view.
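A back-of-envelope multiplication shows how the five drivers compound. The per-factor growth multiples below are illustrative assumptions chosen for the sketch, not forecasts from this article:

```python
# Compounding the five inference-demand drivers over a 3-5 year horizon.
# Every multiple here is an illustrative assumption, not a forecast.
import math

growth_factors = {
    "user_count": 3.0,        # more people using AI (assumed)
    "usage_intensity": 2.5,   # more queries per user (assumed)
    "context_length": 4.0,    # longer conversations and documents (assumed)
    "reasoning_tokens": 10.0, # "thinking" tokens per answer (assumed)
    "agent_calls": 10.0,      # model calls per agent task (assumed)
}

combined = math.prod(growth_factors.values())
print(f"Combined demand multiple: {combined:.0f}x "
      f"(~{math.log10(combined):.1f} orders of magnitude)")
```

Even with these fairly modest per-factor multiples, the product lands in the thousands, which is why "orders of magnitude" rather than "percent" is the natural unit for the expansion.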