Just caught something interesting in Amazon's latest earnings call that most people are probably sleeping on. The company just announced they're dropping $200 billion into capex for 2026 - that's a massive jump from last year. But here's what caught my attention: a huge chunk of that money isn't going toward Nvidia GPUs like everyone assumes.

Amazon's custom chip business inside AWS has passed a $10 billion annualized revenue run rate, and get this - it's still growing at triple-digit rates year over year. We're talking about their Graviton CPUs and AI accelerator chips like Trainium and Inferentia. Demand for Trainium chips specifically has been wild. CEO Andy Jassy said Trainium2 had the fastest ramp-up of any product in AWS history, and they're already seeing strong interest in Trainium3 and Trainium4.

What's really happening here is a broader shift in the AI chip market. It's not just Amazon either. Google's pushing their Tensor Processing Units, Microsoft's rolling out Maia chips for Copilot, Meta's building their own silicon for inference and training. Even Anthropic ordered over $20 billion worth of chips for their own data centers. The demand for custom silicon across hyperscale platforms is accelerating hard.

Now, the interesting play here is Marvell Technology. They partnered with Amazon to design the Trainium chips and locked in a five-year deal in late 2024 to supply chips across AWS data centers. Yeah, there's been some noise about Marvell losing ground on newer Trainium designs - Amazon's apparently using other partners for Trainium3 and Trainium4. But that's only part of the story.

Marvell's real money comes from networking chips - the interconnect, switching, and storage silicon used across data center infrastructure. That revenue is far less tied to any single accelerator generation, so losing a Trainium design win stings less than it looks. Plus their recent Celestial AI acquisition positions them well for AI-focused optical interconnect. The five-year agreement with Amazon is still very much in play.

There was also worry that Marvell would lose Microsoft business, but management addressed that in December - nothing has changed in their outlook. In fact, they're forecasting a significant step-up in custom AI accelerator revenue by fiscal 2028 when Microsoft's Maia 300 ramps up.

What I'm seeing is that while Nvidia remains important to all these companies, the real growth story in the AI chip market is shifting toward custom silicon. Nvidia's fastest growth days might actually be behind it. Meanwhile, companies like Marvell that are embedded in these partnerships could see substantial growth ahead. The stock has also been trading at reasonable multiples, which makes it worth watching if you're looking for plays in this space.

The broader takeaway: the AI chip market is fragmenting away from pure GPU plays. The hyperscalers are building their own solutions, and the chipmakers supporting that transition could be where the real returns are.