ARM-based CPUs are expected to take over the host CPU role in custom AI ASIC servers, rising from about a 25% share in 2025 to at least 90% by 2029, according to Counterpoint Research.
This shift is driven by hyperscalers such as Google, AWS, Microsoft, and Meta, which are increasingly designing their own chips. The main drivers are better power efficiency, lower cost per token, tighter supply chain control, and deeper vertical integration. At the company level, Google is expanding Axion alongside its TPU systems, AWS is scaling Graviton with Trainium, Microsoft is pairing Cobalt with Maia, and Meta is adopting Arm for MTIA. As custom AI infrastructure scales, x86 loses host CPU share, while Arm and the TSMC ecosystem benefit.