Been following Nvidia's latest earnings call, and Jensen Huang just dropped something pretty significant that most people might be sleeping on. He basically told investors that the AI infrastructure spending opportunity is way bigger than most of us think.
Here's what caught my attention: Jensen Huang pointed out that historically, the world spent around $400 billion annually on classical computing infrastructure. But then he made this wild claim - the computing capacity needed for AI workloads is a thousand times higher. Let that sink in. That's not just incremental growth - that's a completely different scale of demand.
Last year he floated the idea that AI data center infrastructure spending could hit $4 trillion annually by 2030. At the time it sounded ambitious, but if he's right about the sheer magnitude of compute required, it's starting to look more realistic. Especially now that they're actually bringing down inference costs.
Which brings me to the Vera Rubin platform. This isn't just another GPU refresh. According to Nvidia, models can be trained using 75% fewer GPUs compared to Blackwell, and inference token costs drop by 90%. That's the kind of efficiency that changes the economics for every AI company out there. When you can cut costs that dramatically, usage explodes. More usage means more revenue for the companies building these models, which means they'll keep buying more chips. It's a pretty clean feedback loop.
They're shipping samples now and ramping to mass production in the second half of this year. The CFO literally said they expect every major cloud builder to deploy Vera Rubin. That's not a guess - that's confidence.
Looking at the numbers, Nvidia pulled in $215.9 billion in revenue for fiscal 2026, up 65% year-over-year. Data center accounted for $193.7 billion of that, up 68%. And management is guiding for $78 billion in Q1 fiscal 2027, which would be a 77% jump. Those growth rates aren't slowing down - they're accelerating.
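Quick sanity check on those figures - a few lines of Python that back out what the guidance implies. The inputs are the numbers quoted above; the implied Q1 FY26 revenue and the annualized run rate are not from Nvidia's release, they just fall out of the arithmetic, so treat them as rough math rather than reported data:

```python
# Back-of-envelope check on the growth figures quoted above.
# Inputs are the article's numbers; the "implied" values are derived
# here and are NOT reported figures from Nvidia.

fy26_revenue = 215.9       # total revenue, $B, fiscal 2026
fy26_data_center = 193.7   # data center revenue, $B, fiscal 2026
q1_fy27_guide = 78.0       # guided Q1 FY27 revenue, $B
q1_growth = 0.77           # guided year-over-year growth for Q1

dc_share = fy26_data_center / fy26_revenue
implied_q1_fy26 = q1_fy27_guide / (1 + q1_growth)
naive_run_rate = q1_fy27_guide * 4  # crude annualization of the Q1 guide

print(f"Data center share of revenue: {dc_share:.0%}")          # ~90%
print(f"Implied Q1 FY26 revenue:      ${implied_q1_fy26:.1f}B")  # ~$44B
print(f"Naive annualized run rate:    ${naive_run_rate:.0f}B")   # ~$312B
```

Multiplying the Q1 guide by four is crude, but even that naive run rate (~$312 billion) lands well above fiscal 2026's $215.9 billion, which is what "accelerating" means here.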
Here's what's wild though - the stock actually looks cheap on a valuation basis right now. It's trading at a P/E of 36.1, a 41% discount to its 10-year average of 61.6. Wall Street's consensus for fiscal 2027 earnings is $8.23 per share, which puts the forward P/E at just 21.5. The S&P 500 trades at 24.7 today, so Nvidia could literally end up cheaper than the broad market index if the stock doesn't move much over the next year.
I'm not trying to make some crazy prediction, but if the fiscal 2027 earnings estimates hit, the math says the stock would need to jump roughly 186% just to get back to its historical average multiple (61.6 divided by the 21.5 forward P/E is about 2.87x). And that's assuming we're not underestimating the opportunity.
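To make that valuation math easy to check, here's a tiny Python snippet that reruns it. Every input is a figure already quoted in this post - trailing P/E, the 10-year average, the consensus FY2027 EPS, and the S&P 500 multiple - so nothing here is new data, just the arithmetic made explicit:

```python
# Rerunning the valuation arithmetic from the two paragraphs above.
# All inputs are figures quoted in this post; no new data.

trailing_pe = 36.1        # current trailing P/E
ten_year_avg_pe = 61.6    # 10-year average P/E
fy27_eps = 8.23           # Wall Street consensus EPS, fiscal 2027
forward_pe = 21.5         # forward P/E implied by that consensus
sp500_pe = 24.7           # S&P 500 multiple today

discount_to_average = 1 - trailing_pe / ten_year_avg_pe
implied_price = forward_pe * fy27_eps                 # price consistent with both figures
upside_to_average = ten_year_avg_pe / forward_pe - 1  # move needed to hit the 10-yr avg multiple

print(f"Discount to 10-year average P/E: {discount_to_average:.0%}")   # ~41%
print(f"Implied share price:             ${implied_price:.0f}")        # ~$177
print(f"Upside to average multiple:      {upside_to_average:.1%}")     # ~186%
print(f"Cheaper than S&P 500 forward?    {forward_pe < sp500_pe}")     # True
```

The point of spelling it out: the 186% figure isn't a price target, it's just what falls out if the consensus EPS lands and the market pays its own 10-year average multiple for the stock.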
The core insight from Jensen Huang's comments is basically this: we're still in the early innings of AI infrastructure spending. The demand they're seeing now isn't the peak - it's the beginning. They're effectively competing with their own supply because they can't make chips fast enough to meet demand. Competitors are trying to catch up, but Nvidia's roadmap is already locked in through 2026 and beyond.
When a CEO says the market is a thousand times bigger than what we've historically spent on computing infrastructure, and the company is actually executing on that vision with products that cut costs by 90%, that's worth paying attention to. Whether you're looking to build a position or just watching the space, this setup is pretty compelling.