I just noticed something interesting in Intel's stock price movement today. The stock closed up 4.70% on $9.38B of trading volume, and that's no coincidence. Behind it sits a significant announcement that many might be underestimating.
Google has just confirmed that it will deploy Intel's Xeon 6 processors across multiple generations of its data centers to train AI models and run inference. At first glance, it seems like just another technical detail, but in reality it reflects an important shift in how the industry is thinking about AI infrastructure. For years, the market was obsessed with graphics accelerators, as if they were the only piece that mattered. Now it's becoming clear that this is not the case.
What I find key here is that Google’s AI Infrastructure CTO explicitly expressed confidence in Intel’s product roadmap. And Lip-Bu Tan, Intel’s CEO, put it well: effective AI expansion requires more than just accelerators. You need balanced computing systems, where the CPU plays a fundamental role in coordinating massive workloads, optimizing energy efficiency, and reducing costs.
What’s fascinating is that Intel has been supplying server processors to Google for nearly thirty years. This new collaboration deepens that relationship. Both companies will expand joint development of customized infrastructure processing units. The Xeon 6 processors are already in Google Cloud’s C4 and N4 instances, supporting everything from massive training to latency-sensitive inference tasks.
From a market perspective, this is notable. In a landscape dominated by specialized GPUs, demand for powerful CPUs is resurging. Google's commitment to multiple generations of Intel processors not only ensures a steady flow of orders but also underscores how much hyperscalers value reliability and ecosystem maturity when building complex heterogeneous systems.
For Intel’s stock price, this is clearly positive in the short term. But the most interesting part is the long-term potential. The trend in AI infrastructure is evolving from “accelerator-centric” to “system-level optimization.” Intel is well-positioned to capture that opportunity, especially if it maintains momentum in its product roadmap and improves energy efficiency.
Of course, we must be realistic. The actual impact will depend on concrete orders and financial results, and Intel still faces competition from AMD and other manufacturers. But for now, the market is recognizing that CPUs have an irreplaceable role in the AI era. That's what explains the movement we saw today.