I noticed something interesting in the AI cloud infrastructure sector: IREN is making an aggressive move on processing capacity. The company has just agreed to purchase over 50,000 Nvidia B300 GPUs, bringing its total fleet to about 150,000 units, essentially a 50% expansion of its computing capacity.
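As a quick sanity check on the figures above, the 50% expansion follows from the reported numbers: 50,000 new GPUs on top of an implied existing fleet of about 100,000. A minimal back-of-envelope sketch (the existing-fleet size is inferred, not stated directly in the article):

```python
# Back-of-envelope check of IREN's reported expansion figures.
# Assumed from the article: 50,000 new B300 GPUs, ~150,000 total after the purchase.
new_gpus = 50_000
total_fleet = 150_000

existing_fleet = total_fleet - new_gpus  # implied fleet before the purchase: ~100,000
expansion_pct = new_gpus / existing_fleet * 100

print(f"Implied existing fleet: {existing_fleet:,} GPUs")
print(f"Capacity expansion: {expansion_pct:.0f}%")  # → 50%
```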
The B300 is a specialized graphics processing unit well suited to the massive parallel calculations needed to train and run AI models. IREN, based in Sydney, is clearly positioning itself as one of the leading global providers of AI cloud infrastructure.
The deployment is expected to occur in phases during the second half of 2026 at the company's air-cooled data centers in Mackenzie, British Columbia, and Childress, Texas. Once fully implemented, the expanded system is projected to support over $3.7 billion in annualized AI cloud revenue.
What moved the markets was the simultaneous announcement of a potential equity offering of up to $6 billion. Shares fell 5% in pre-market trading, which makes sense given the potential dilution, but it also shows how the company intends to finance this massive expansion.
In total, IREN has raised about $9.3 billion over the past eight months through customer prepayments, convertible notes, GPU leasing, and financing agreements. It now plans approximately $3.5 billion in additional spending to deploy the 50,000 new chips in the second half of 2026.
It’s an exciting time for the AI infrastructure sector. The demand for processing capacity continues to grow, and players who can scale quickly and efficiently could capture a significant share of this rapidly expanding market.