TradingBase.AI Column | Elon Musk starts selling "computing power," the real changes in the AI quantitative industry are just beginning


Recently, the most noteworthy development in the AI industry hasn’t been the release of any particular model, nor another company’s update to its Agent capabilities. Instead, it’s this: SpaceXAI announced that it will open up all of Colossus 1’s computing power to Anthropic. More than 220,000 NVIDIA GPUs and 300 MW of capacity will be used directly to expand Claude-series services.
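As a rough sanity check on those figures (a back-of-envelope sketch, not official data), dividing the quoted facility power by the GPU count gives the implied power budget per accelerator:

```python
# Back-of-envelope check of the figures quoted above (assumed, not official).
total_power_w = 300e6       # 300 MW of facility power
gpu_count = 220_000         # NVIDIA GPUs in Colossus 1

watts_per_gpu = total_power_w / gpu_count
print(f"{watts_per_gpu:.0f} W per GPU")  # ~1364 W

# An H100-class accelerator draws roughly 700 W at peak, so the remaining
# ~650 W per GPU would cover CPUs, networking, storage, and cooling,
# which is a plausible overall facility overhead.
```

The point of the arithmetic is simply that the two numbers are mutually consistent: the 300 MW figure is facility-scale power, not GPU silicon alone.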

Many people read this as an ordinary partnership. But look deeper and you’ll find that the signals this event sends matter far more than “Claude getting stronger.” It means competition in the AI industry is shifting from competition over models to competition over computing power and infrastructure. And that shift may affect the AI quant industry more directly than most people imagine.

The real scarcity in the AI industry is no longer models

In the past few years, the whole industry has talked about model capabilities: who is smarter, who reasons better, who has the longer context window. These questions shaped the valuation logic of virtually every AI company. But now a more practical problem has begun to overtake the model itself: there isn’t enough compute. Why did Anthropic need to lock up the entire Colossus 1 in one go? The reason is simple: Claude’s growth has started to outpace the capacity of existing infrastructure. Anthropic has recently been buying computing power aggressively around the world, from AWS and Google to Microsoft, and now SpaceX.

This means that what is truly scarce in the AI industry has shifted from “models” to:

GPUs

Electricity

Data centers

Cooling systems

Large-scale scheduling capability

What Musk truly wants isn’t just Grok anymore

Many people still frame xAI as “Grok vs ChatGPT.” But judging from this collaboration, Musk has clearly moved into another stage, because what SpaceXAI now has is not just models.

It also has:

Rockets

The Starlink network

GPU clusters

Ultra-large data centers

An energy system

AI models

This means that SpaceXAI is transitioning from a “model company” into an “AI infrastructure platform.” Note the most easily overlooked line in this collaboration: both sides are exploring “Orbital AI Compute.” Many treat it as a sci-fi concept, but the underlying logic is quite grounded, because the biggest bottleneck in today’s AI industry is becoming energy and land on the ground. Large AI data centers need continuous power and cooling, while the scale of model training keeps growing exponentially. What makes SpaceX truly special isn’t that it has AI models; it’s that it has the ability to send computing power into space.

The AI industry is entering a “resource war”

If you look at all the recent developments across the AI industry together, a trend becomes increasingly obvious: the AI industry is entering a “resource war.” OpenAI is expanding data centers. Anthropic is stockpiling computing power. Google and Microsoft are betting heavily on AI infrastructure. Meanwhile, SpaceXAI is starting to integrate energy, satellites, data centers, and AI into a single system.

This means that, in the future, what truly determines the landscape of the AI industry may no longer be model capability, but who controls the resources needed to run AI. Gaps in model performance can be closed and capabilities can be copied, but once infrastructure reaches scale, it becomes a long-term moat. This mirrors how the internet industry evolved: first everyone competed over products, then over ecosystems, and finally over infrastructure. The AI industry is now moving into that third phase.

Why this matters more for TradingBase

Many people think this is just something happening in the large-model industry, but its impact on the AI quant industry may be even more direct, because AI quant is, at its core, a “computing power industry.” Real-time market analysis, multi-market data processing, agent-collaboration execution, strategy reasoning, risk control: behind all of these capabilities is a dependence on continuous computing resources. In the past, quant competed on strategies; in the future, AI quant may start competing over who has the most stable, most continuous, and lowest-cost AI infrastructure. This is already becoming clear. Why are more and more AI quant systems converging? Why are the differences among models shrinking? Because the models themselves are being rapidly leveled out.

And in the future, what will truly determine the gap may become:

Data capability

Compute scheduling capability

System stability

Infrastructure efficiency

Agent collaboration capability

These are things users can’t usually see, but they will directly determine long-term competitiveness. For TradingBase, this is also the truly important direction for the next stage of AI quant. Because the core of future AI trading competition may no longer be simply “whose strategy is stronger,” but “who can run the entire system stably over the long term.”
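To make “compute scheduling capability” concrete, here is a minimal, hypothetical sketch of a priority-based GPU-hour allocator across trading agents. All names (`AgentTask`, `schedule`, the agent names) are illustrative assumptions, not TradingBase’s actual architecture:

```python
# Hypothetical sketch: allocate a fixed GPU-hour budget across trading
# agents by priority. Illustrative only, not a real production scheduler.
from dataclasses import dataclass

@dataclass
class AgentTask:
    name: str
    priority: int      # higher runs first
    gpu_hours: float   # requested compute

def schedule(tasks, budget_gpu_hours):
    """Greedy allocation: serve higher-priority tasks first until the budget runs out."""
    allocation = {}
    remaining = budget_gpu_hours
    for task in sorted(tasks, key=lambda t: -t.priority):
        granted = min(task.gpu_hours, remaining)
        allocation[task.name] = granted
        remaining -= granted
    return allocation

tasks = [
    AgentTask("risk-control", priority=3, gpu_hours=4),
    AgentTask("market-analysis", priority=2, gpu_hours=6),
    AgentTask("strategy-backtest", priority=1, gpu_hours=8),
]
print(schedule(tasks, budget_gpu_hours=10))
# {'risk-control': 4, 'market-analysis': 6, 'strategy-backtest': 0}
```

Under a tight budget, low-priority work (here, backtesting) is the first to be starved, which is exactly why stable and cheap compute, rather than strategy cleverness alone, becomes a competitive edge.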

The AI industry’s next stage may not be “super models”

In the past, everyone kept discussing who would build the next GPT-6, Claude Next, or a super Agent. Now a deeper question is emerging: if all models become strong enough, is the model still what truly matters? Judging by how the industry is changing, the answer may already be shifting. What matters in the future AI industry may not be the model itself, but who has everything needed to run those models: energy, compute, data centers, networks, and system-scheduling capability. That is what Musk’s move is really trying to convey.

Conclusion

Many people are still debating “who will win the AI model war.” But the real logic of competition may already be changing. Because as model capabilities become increasingly similar, what truly matters will change as well. In the past, the AI industry competed on algorithms; now it’s starting to compete on energy, compute, and infrastructure; and in the future, the decisive factor in determining the industry’s overall landscape may be: who can control the underlying resources required to run the entire AI world.
