Cerebras IPO: at a $48.8 billion valuation, is the "NVIDIA challenger" a bubble or the new king?
By Xiao Hei, Deep Tide TechFlow
Priced on May 13; trading opens on May 14 under the NASDAQ ticker CBRS.
This is the largest IPO globally so far in 2026. The underwriters are Morgan Stanley, Citigroup, Barclays, and UBS. The offering was 20 times oversubscribed during the roadshow, pushing the price range from $115-125 all the way up to $150-160, for an expected raise of $4.8 billion at a valuation of $48.8 billion.
Just three months ago, Cerebras' secondary-market valuation was still $23 billion. In other words, in the final stretch before the IPO, the company's paper valuation more than doubled.
The story's "selling points" have been repeated a thousand times: Nvidia challenger, wafer-scale chips, inference 21 times faster than the B200, a $1 billion starting contract with OpenAI, and compute contracts worth up to $20 billion. It's a perfect "AI challenger" script: technological narrative, geopolitical story, star clients, huge orders, each piece precisely aligned with the 2026 AI infrastructure theme.
But reading the S-1 document page by page reveals something strange: all public reports tell the same story, but the prospectus tells a different one.
Triple Paradox
Breaking down the prospectus item by item, Cerebras presents a target composed of “three paradoxes.”
First: Technologically a true Alpha, financially an accounting magic trick.
The prospectus discloses 2025 revenue of $510 million, up 76% year-over-year, and GAAP net profit of $237.8 million. That sounds very impressive: an AI hardware company that is both rapidly growing and profitable, almost a "mythical" target in today's valuation environment. CoreWeave was still loss-making when it IPO'd in March this year; Cerebras reports a 47% net margin outright.
But that $237.8 million "net profit" includes $363.3 million of one-time, non-cash paper gains from an accounting adjustment tied to the extinguishment of G42's forward-contract liability. Strip that out and add back $49.8 million in stock-based compensation, and the underlying non-GAAP result for 2025 is a $75.7 million net loss, 247% wider than the $21.8 million loss in 2024.
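The reconciliation is worth making explicit. A minimal arithmetic sketch, using only the figures cited above (all in millions of USD; the variable names are mine, not the prospectus's):

```python
# Reconciling the headline GAAP profit to the underlying non-GAAP loss,
# using only the 2025 figures cited in this article (millions of USD).
gaap_net_income = 237.8        # reported GAAP net profit
one_time_paper_gain = 363.3    # non-cash gain from the G42 forward-contract extinguishment
stock_based_comp = 49.8        # non-cash stock-based compensation, added back

non_gaap_result = gaap_net_income - one_time_paper_gain + stock_based_comp
print(round(non_gaap_result, 1))   # -75.7  (a $75.7M net loss)

# Year-over-year widening versus the $21.8M loss in 2024
loss_2024 = 21.8
widening_pct = (abs(non_gaap_result) - loss_2024) / loss_2024 * 100
print(round(widening_pct))         # 247
```

Both outputs match the article's figures: the "47% net margin" company is, on a cash-operating basis, losing money faster than it was a year ago.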
In other words, the market sees a "profitable, 76%-growth" IPO star, while the prospectus reveals a rapidly growing company with widening losses. Neither version is wrong; the difference lies in which one the market chooses to believe.
Second: Superficially decoupled from G42, but actually nested inside OpenAI's loop.
Cerebras' first IPO attempt, in 2024, failed for a simple reason: G42, a UAE-based client, contributed 85% of revenue in the first half of that year. CFIUS launched an investigation, and the company was forced to withdraw its application.
A year and a half later, the client list appears more diversified, adding heavyweight clients like OpenAI and AWS. But looking at the May 2026 S-1, the 2025 client structure was:
MBZUAI (Mohamed bin Zayed University of Artificial Intelligence): 62%
G42: 24%
Total: 86%
G42 simply shifted its "weight" to MBZUAI, which is also based in the UAE and affiliated with G42. MBZUAI alone accounts for 77.9% of receivables.
And the so-called OpenAI "lifeline" is itself a nested structure. The contract is worth over $20 billion, with OpenAI committing to purchase 750 MW of compute. But the same document discloses several other facts: OpenAI extended Cerebras a $1 billion loan; OpenAI received warrants for 33 million shares at near-zero cost; and the Master Relationship Agreement includes exclusivity clauses restricting Cerebras from selling to certain "named competitors."
In other words, OpenAI is simultaneously Cerebras' customer, lender, soon-to-be shareholder, and, to some extent, strategic controller. As an anonymous analyst put it on Medium: when revenue is circular, valuation is circular, and the IPO is just the way for those who supplied the revenue to cash out, this is not a market but financial engineering.
The words are sharp, but factually they are hard to refute.
Third: On the surface, Nvidia's "challenger"; in essence, a "narrow-band complement" to Nvidia.
This is the easiest point for the market to overlook.
Cerebras' technology is indeed solid. The WSE-3 packs 4 trillion transistors, 900,000 AI cores, and 44GB of on-chip SRAM into a single wafer-sized chip, bypassing the chip-to-chip communication bottlenecks that GPU clusters face. Independent benchmarks show that running Llama 4 Maverick (400 billion parameters), the CS-3 outputs over 2,500 tokens per second per user, versus about 1,000 for Nvidia's flagship DGX B200, and 549 and 794 for Groq and SambaNova respectively.
The numbers don't lie: Cerebras holds a generational advantage over GPUs in inference for this specific scenario.
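That advantage can be expressed as simple throughput ratios from the benchmark numbers above (tokens per second per user on Llama 4 Maverick, as cited; the ratios are my arithmetic, not an independent benchmark):

```python
# CS-3's inference throughput advantage over each rival, per the cited benchmarks
# (tokens per second per user, Llama 4 Maverick).
rivals = {
    "Nvidia DGX B200": 1000,
    "Groq": 549,
    "SambaNova": 794,
}
cs3 = 2500
for system, tps in rivals.items():
    print(f"CS-3 vs {system}: {cs3 / tps:.1f}x")   # 2.5x, 4.6x, 3.1x
```

Note the measured edge over the B200 is about 2.5x, well short of the "21 times faster" figure in the marketing narrative; the benchmarked gap is real but much narrower.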
The key word is "inference." Cerebras' own IPO documents state plainly that its strength lies in latency-sensitive inference workloads; it is not contesting Nvidia in large-model training or general-purpose compute. And the CUDA ecosystem, built up over nearly 20 years since 2007 with tooling, a developer community, and third-party libraries, remains Nvidia's moat.
More critically, the market is not standing still. Nvidia announced the Vera Rubin architecture at GTC 2026, with 336 billion transistors and claimed performance five times Blackwell's; AMD's MI400 has already reached 320 billion transistors; Google's TPU v6, Amazon's Trainium 3, and Microsoft's Maia 2 mean every major player is building its own silicon. Nvidia's R&D spending exceeded $18 billion in fiscal 2025; last December it spent $20 billion acquiring the assets of AI inference startup Groq; in March it invested another $4 billion in photonics companies.
So a more accurate framing is: Cerebras isn't trying to replace Nvidia; it's fighting for a differentiated niche within the narrow band of inference. That is a real business, but a $48.8 billion valuation against $510 million in revenue implies a price-to-sales ratio of about 95.
Andrew Feldman’s Third “Product Pitch”
Beyond the numbers, it’s important to understand the company’s soul.
Andrew Feldman is a somewhat underrated "serial entrepreneur" in Silicon Valley. He is not a tech-genius founder, nor an ivory-tower academic. He graduated from Stanford Business School, served as VP of Marketing at Riverstone Networks (IPO in 2001), and as VP of Products at Force10 Networks (sold to Dell for $800 million in 2011).
In 2007, he co-founded SeaMicro with Gary Lauterbach, stacking low-power cores into clusters of "energy-efficient servers" to challenge the mainstream of high-power, big-core machines. The idea was ahead of its time; the market wasn't ready. AMD bought SeaMicro for $334 million in 2012; Feldman stayed on as a VP for two years before leaving.
Then he founded Cerebras.
Looking at Feldman's path, an interesting pattern emerges: he is not a "chip designer" so much as a serial maker of contrarian bets on compute infrastructure. SeaMicro bet that small cores would beat big cores, and the bet was wrong: AMD bought it for its Freedom Fabric interconnect, that route didn't pan out, and the SeaMicro brand quietly disappeared. Cerebras bets that big chips will beat small chips, the exact opposite premise.
In a sense, Feldman keeps doing the same thing: finding overlooked, seemingly "impossible" paths in computing architecture, betting heavily on them, then leveraging strong sales to push them into the market. At SeaMicro he could draw on the sales relationships he built at Force10; AMD valued that network. This time, the most critical move was landing G42, which let a hardware company earning 80% of its 2024 revenue from a single Middle Eastern client go on to sign a $20 billion contract with OpenAI.
The takeaway: Feldman is a product-and-sales CEO, not a visionary technologist. He excels at selling "seemingly crazy" products to customers willing to pay a premium for differentiation; that is his alpha.
Understanding this is crucial because it directly influences the investment judgment on Cerebras.
So, is CBRS worth investing in?
Looking at the three paradoxes together, the answer is more complex than simply “buy” or “not buy.”
If the goal is to capture a first-day pop: with 20x oversubscription, in the hottest AI hardware sector, and no other pure-play Nvidia alternative on the market, CBRS will likely surge on day one. That is event-driven short-term trading and requires no deep judgment.
But for long-term holding, three questions must be considered:
First: Is Cerebras worth a 95x price-to-sales ratio?
CoreWeave IPO’d in March at about 15x; Nvidia’s current P/S is around 25x. A company with 2025 revenue of $510 million, 86% customer concentration, and still operating at a loss, priced at 95x sales, implies the market expects it to reach $3–4 billion in revenue in three to four years and achieve sustained profitability.
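The multiple comparison can be sanity-checked with the figures given in this section; a rough back-of-envelope, not a valuation model (peer multiples as cited above):

```python
# Price-to-sales multiples implied by the figures cited in this section.
cbrs_valuation = 48.8e9        # IPO valuation, USD
cbrs_revenue_2025 = 510e6      # 2025 revenue, USD

ps_cbrs = cbrs_valuation / cbrs_revenue_2025
print(round(ps_cbrs, 1))       # 95.7, which the article rounds to ~95x

# Revenue Cerebras would need for its multiple to compress to peer levels
for name, peer_ps in [("CoreWeave", 15), ("Nvidia", 25)]:
    required = cbrs_valuation / peer_ps
    print(f"{name} multiple ({peer_ps}x) implies ${required / 1e9:.2f}B revenue")
```

Compressing to CoreWeave's ~15x at the current valuation implies roughly $3.25 billion in revenue, which is exactly the $3-4 billion expectation the market is pricing in.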
Can this happen? It depends on whether OpenAI’s $20 billion contract can be realized as planned. According to the prospectus, in 2026 and 2027, about 15% of the remaining performance obligations (~$3.5 billion) will be recognized. If this pace continues, Cerebras’ revenue could reach over $2 billion by 2027, and the valuation could become more reasonable. But any delay, strategic shift by OpenAI, or loss of a key customer could instantly shatter this valuation.
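The pacing claim can be checked the same way; a rough sketch under the assumption (mine, not the prospectus's) that the existing ~$510 million revenue base holds roughly flat:

```python
# Rough revenue pacing implied by the disclosures cited above (billions of USD).
recognized_2026_2027 = 3.5     # ~15% of remaining performance obligations over two years
avg_contract_rev_per_year = recognized_2026_2027 / 2   # roughly 1.75 per year
existing_base = 0.51           # 2025 revenue; assumed flat (an assumption, not a disclosure)

implied_2027_revenue = avg_contract_rev_per_year + existing_base
print(round(implied_2027_revenue, 2))   # 2.26, consistent with "over $2 billion by 2027"
```

The arithmetic only works if recognition proceeds on schedule; any slippage in the OpenAI ramp pushes the figure back under $2 billion.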
Second: How wide is Cerebras’ moat?
The architectural advantage of WSE-3 is real, but how long will it last? Nvidia's Vera Rubin, AMD's MI400, and Google's TPU v6 are all advancing, and chip-industry product cycles run 18-24 months. If Cerebras falls a generation behind, its technical edge gets closed. Its R&D spending already consumes a significant share of revenue, yet in absolute terms it remains far behind the giants.
The deeper question: is wafer-scale chip design a mainstream path or a niche "special forces" approach? There is no definitive answer. Optimistically, as inference grows from 30% to over 70% of AI compute workloads, Cerebras' niche could become the main battlefield. Pessimistically, Nvidia could simply improve Rubin's inference performance, and the niche stays niche forever.
Third: Governance structure and geopolitical risks
The prospectus discloses two often-overlooked but critical issues:
First, Cerebras uses a Class A/Class B dual-class share structure, with insiders holding 99.2% of voting rights post-IPO. Even if the founders later hold only 5% of the public float, they still control the company; minority shareholders have little say in governance.
Second, the company reports two "material weaknesses" in internal controls over financial reporting. As an emerging growth company, it is exempt from SOX 404(b) audits for up to five years post-IPO. This is a red flag; not a major one, but worth noting.
Geopolitically, CFIUS has cleared the G42 voting-rights issue this time, but export controls remain a long-term variable: shipping CS-2, CS-3, and CS-4 systems to the UAE requires licenses. The Trump administration's policy on Middle East AI chip exports is still unsettled, and any swing could reignite tail risk for CBRS.
Conclusion
This IPO, as an event, is the most noteworthy AI hardware capital event of 2026. It sets the valuation anchor for AI infrastructure in the secondary market, influencing the pricing of all related targets.
As a long-term holding, it’s a classic “high risk, high uncertainty” bet—on the macro narrative of “inference as king,” on Cerebras’ ability to leverage OpenAI to carve out a narrow-band monopoly, and on the market’s willingness to continue paying a 95x P/S premium for AI hardware. All three conditions must align for huge returns; any failure could lead to a sharp decline.
For institutional investors, the typical playbook is to wait for third-quarter results, key-customer progress, and valuation digestion before building a position. For individual investors, treating CBRS as a small tail position in AI hardware is fine; before treating it as an all-in conviction bet, reread the three paradoxes above.
More meaningful than tomorrow's opening surge is this: when a company drawing 86% of its revenue from two related UAE entities, and still operating at a loss, can be valued at $48.8 billion, it shows how far the capital frenzy in AI infrastructure has already gone.