I just saw that SpaceXAI's Colossus 2 is driving a major push in AI development. They are training seven different models simultaneously, which is quite ambitious when you think about it.
The interesting part is the range of scales they are exploring: two versions with 1 trillion parameters, two more with 1.5 trillion, a 6 trillion model, and a 10 trillion model. It's as if they are testing different sizes to find the best trade-off between capacity and efficiency.
Among these models is Imagine V2, which appears to be a key project for them. What stands out is that Colossus 2 serves as the core infrastructure for all of this parallel training.
Honestly, the AI landscape is becoming increasingly competitive. SpaceXAI recognizes that there is still a long way to go, but these moves suggest they are serious about building world-class AI capabilities. It's the kind of infrastructure investment only some players can afford.