Been seeing a lot of chatter about Michael Burry's latest AI doom prediction, and honestly, I think people are overweighting his contrarian credibility without looking at the actual numbers.

Don't get me wrong: Burry's legend is real. He made roughly $100 million for himself and $700 million for his investors by nailing the 2008 subprime crisis, and yes, Christian Bale played him in The Big Short. But here's the thing nobody talks about: that was nearly two decades ago.

Since then? His track record has been rough. He's been consistently early, and wrong, on bearish calls while markets kept climbing. He recently shut down his hedge fund because his views no longer aligned with where markets were heading. So when evaluating Michael Burry's net worth and current credibility, you have to factor in that inconsistency.

Now he's saying the AI boom is dot-com 1999 all over again. Let me break down why that thesis doesn't hold up.

First, he's claiming big tech is cooking the books with depreciation schedules: Meta, Microsoft, and Alphabet are supposedly inflating earnings by stretching out GPU depreciation. The argument is that GPUs wear out fast, so they should be depreciated faster. But that's not how it works in practice. Data-center shells, power, and cooling are depreciated over 15-20 years, and hyperscalers typically put servers on five-to-six-year schedules. And here's the kicker: old GPUs don't just become worthless. They remain valuable for inference, serving models to end users, so they carry real residual value.
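To see why the assumed useful life matters so much to reported earnings, here's a back-of-the-envelope straight-line depreciation sketch. All the numbers are hypothetical (a made-up $10B GPU fleet and $1B residual value); the point is just that stretching the schedule from three years to six halves the annual expense hitting the income statement.

```python
def annual_depreciation(cost: float, salvage: float, useful_life_years: int) -> float:
    """Straight-line depreciation: (cost - salvage) / useful life."""
    return (cost - salvage) / useful_life_years

fleet_cost = 10_000_000_000  # hypothetical $10B GPU fleet
salvage = 1_000_000_000      # assumed $1B residual value (inference reuse)

# Aggressive (Burry-style) assumption: 3-year life
print(annual_depreciation(fleet_cost, salvage, 3))  # 3000000000.0 per year
# Hyperscaler-style assumption: 6-year life
print(annual_depreciation(fleet_cost, salvage, 6))  # 1500000000.0 per year
```

Same hardware, same cash out the door; the only thing that changed is the accounting assumption, which is why the residual-value question (do old GPUs still earn inference revenue?) decides which schedule is honest.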

Second, he warns that massive capex spending will destroy cash flow. Except it isn't happening. Hyperscalers are reporting rising operating cash flow on the back of AI revenue: Alphabet went from under $100 billion in annual operating cash flow to $164 billion in 2026. That's not a company drowning in capex; that's a company printing money.

Plus, margins are expanding. We're seeing reports of $3+ returns for every $1 invested in AI infrastructure. The new wave of agentic AI is supposedly cutting costs by 25% or more. That's efficiency, not a bubble.

Third, he's comparing NVIDIA to Cisco in 2000, calling both overvalued. When Cisco peaked in March 2000, its P/E was over 200. NVIDIA's current P/E? Around 47. That's not even close to bubble territory by comparison. Burry's net worth came from betting against obvious excess; this isn't obvious excess.

The market is giving us real signals too. H100 GPU rental prices jumped about 17% since mid-December. That's scarcity and demand, not a speculative frenzy. Agentic AI adoption is driving this, and it's bullish for the whole infrastructure stack—companies like Nebius, CoreWeave, and IREN are benefiting. Even power plays like Bloom Energy are seeing major demand because energy is the real constraint for hyperscalers.

Options activity is interesting here too. Big money is making serious bets on NVIDIA and Bloom ahead of earnings. We're talking million-dollar positions and whale-sized $9 million bets on March calls. That's not retail FOMO; that's sophisticated capital positioning.

Look, I respect what Michael Burry did in 2008. His net worth and reputation were earned through that call. But being right once doesn't mean you're always right, especially when your predictions contradict actual cash flow data, margin expansion, and real-world GPU scarcity. The AI boom has real economics behind it, not just hype. That's the difference between now and 1999.