Been following Nvidia's moves pretty closely, and there's something important shifting in how the whole artificial intelligence chip market is evolving right now.
For years, Nvidia crushed it with GPU chips designed for raw training power—think of it like a drag racing engine. But here's what's changing: the industry is pivoting hard toward inference, and that's a completely different game. Inference needs efficiency and sustained reasoning, not just maximum horsepower. It's more like navigating mountain roads than going straight.
Rubin is Nvidia's answer to this shift, and it's not just another chip. It's actually a full platform: six components working together as an AI supercomputer, combining CPUs, GPUs, and networking gear. The Vera Rubin NVL72 server can run inference at just 10% of the cost per token compared to their current Blackwell flagship. That's the kind of efficiency that gets enterprise buyers excited.
What's wild is that hyperscalers are already lining up. Nvidia mentioned back in November they had $500 billion in orders through 2026, and that number's probably climbing. Every major AI company is planning to spend even more this year, which suggests the artificial intelligence chip demand isn't slowing down—it's just shifting shape.
Wall Street's pretty bullish on this. Analysts are projecting revenue jumping from $187 billion to around $327 billion this fiscal year, then up to $419 billion next year. Nvidia's also got a track record of beating estimates consistently, so actual numbers could run even higher.
Now, the valuation question. Stock's trading at about 25 times sales right now, which looks expensive until you zoom out. At next year's projected revenue, it drops to 11 times sales. That's a meaningful difference if execution stays on track.
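The multiple compression above is just arithmetic, and it's worth seeing laid out. A minimal sketch, assuming a market cap implied by the 25x multiple on trailing revenue (the revenue figures are the analyst projections quoted above, not my own):

```python
# Illustrative price-to-sales math, assuming market cap is implied
# by the 25x multiple on trailing revenue cited in the text.
trailing_revenue = 187   # $B, last fiscal year
this_year_rev    = 327   # $B, analyst projection
next_year_rev    = 419   # $B, analyst projection

market_cap = 25 * trailing_revenue  # ~$4,675B implied by 25x sales

ps_trailing  = market_cap / trailing_revenue  # 25.0x
ps_this_year = market_cap / this_year_rev     # ~14.3x
ps_next_year = market_cap / next_year_rev     # ~11.2x

print(f"P/S on trailing revenue:  {ps_trailing:.1f}x")
print(f"P/S on this year's rev:   {ps_this_year:.1f}x")
print(f"P/S on next year's rev:   {ps_next_year:.1f}x")
```

The point of the sketch: the "11 times sales" figure only materializes if revenue actually lands near $419 billion, which is why the execution risks in the next paragraph matter so much.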
But there are real risks. Nvidia depends heavily on a concentrated customer base—if one or two hyperscalers pull back on spending, those estimates flip fast. Plus, competition's heating up. Broadcom's already had success with custom artificial intelligence chip solutions for certain players, and others are building alternatives too.
The consensus seems to be that Nvidia stays dominant long-term because of its installed base and the overall growth in AI. But plenty of smart investors are waiting for the earnings call to see what management says about the outlook before making moves. Makes sense to me—there's no rush when you've got this much momentum already priced in.