Silicon Valley's Most Expensive "Other Chip" Has Finally Had Its Moment
In June 1944, the Allied forces launched Operation Overlord in Normandy. The success of this landing relied not only on a strong frontal assault but also on a logistics supply line stretching thousands of kilometers—fuel, ammunition, food—each indispensable.
If today’s AI arms race is compared to a great war, NVIDIA is the logistics line that almost monopolizes all ammunition supplies. Everyone depends on it, and everyone is well aware of how dangerous this dependence is.
Therefore, a campaign to find a “second supply line” has been quietly underway.
On May 14, 2026, local time, Cerebras Systems rang the Nasdaq opening bell. The opening price was $350, reaching a high of $385 during the day, a 108% increase over the IPO price of $185. The stock closed at $311.07, a first-day gain of 68%.
Cerebras' debut was also the largest U.S. tech IPO since Uber went public in 2019. Why is Silicon Valley so optimistic about this chip company? Can it really challenge NVIDIA's dominance?
A “Chip” That Doesn’t Look Like a Chip
To understand why Cerebras is causing such a stir, you first need to know what it actually makes.
NVIDIA’s GPUs, no matter how powerful, are fundamentally “small” chips—multiple chips interconnected at high speed to form clusters that collaboratively handle large model training and inference tasks. This architecture has dominated the industry for the past decade, but it has an inherent shortcoming: data communication latency between chips, which becomes a bottleneck when dealing with ultra-large models.
Comparison of Cerebras chip and NVIDIA B200 chip Image source: Cerebras
Cerebras’ founder Andrew Feldman is no stranger to “architectural-level dissent.” In the early 2010s, at SeaMicro, he argued that the then-popular server architecture was “geometrically wrong” for internet workloads—and he was right. AMD eventually acquired the company for over $334 million.
This time, he applied the same logic to AI chips.
Cerebras' core product is the WSE (Wafer Scale Engine), a chip the size of an entire silicon wafer. No interconnecting of multiple chips, no communication delays: all neural network computations are completed on a single piece of silicon. Cerebras claims its inference speed is 15 times that of "leading GPU-based solutions."
Deming Chen, a professor at the University of Illinois at Urbana-Champaign, offered a calm perspective: “Smaller chips are still more practical for most use cases—cheaper, more flexible, easier to scale. Cerebras performs well on certain workloads, but it won’t replace everything.”
This isn't a "better chip" but a "different chip," designed for a specific scenario, and that scenario happens to be inference acceleration, the thing today's AI wave needs most.
Behind the 20-Fold Oversubscription
The day before the IPO, Cerebras’ pricing announcement was already eye-catching.
The expected price range was originally $150 to $160 per share, but the offering was ultimately priced at $185, well above the top of the range. Investor demand exceeded available shares by more than 20 times. The company sold 30 million shares, raising approximately $5.55 billion.
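The pricing and first-day figures quoted above are internally consistent; a quick back-of-the-envelope check (all numbers taken from the article, used here purely for illustration):

```python
# Sanity check of the Cerebras IPO figures quoted in the article.
ipo_price = 185.0        # final offer price per share
day_high = 385.0         # intraday high on debut
close = 311.07           # first-day closing price
shares_sold = 30_000_000 # shares sold in the offering

high_gain = (day_high / ipo_price - 1) * 100
close_gain = (close / ipo_price - 1) * 100
gross_proceeds = shares_sold * ipo_price

print(f"Intraday high vs. IPO price: +{high_gain:.0f}%")    # → +108%
print(f"First-day close vs. IPO price: +{close_gain:.0f}%") # → +68%
print(f"Gross proceeds: ${gross_proceeds / 1e9:.2f}B")      # → $5.55B
```

Each computed value matches the article's reported 108% intraday gain, 68% first-day close, and roughly $5.55 billion raised.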
Behind the numbers is a signal that AI infrastructure investment is entering a new phase.
Over the past two years, the market's bet on AI compute power has been highly concentrated: on NVIDIA, on data centers, on H100 and B200 order queues. But the capital markets are beginning to realize that over-concentration carries risk and commands a premium. NVIDIA currently trades at about 25 times sales, while Cerebras' price-to-sales (P/S) ratio on IPO day soared to 187.
Is it a bubble? Maybe. But the pricing itself signals one thing: investors are willing to pay a steep option premium for the story of "an AI chip that isn't NVIDIA."
Since the beginning of this year, AMD has risen over 94%, Intel’s share price has surged by 218%, and the Philadelphia Semiconductor Index has increased by 66%. Capital is shifting from a single NVIDIA bet to every node in the entire AI chip supply chain.
Cerebras’ IPO is the most dramatic act in this reallocation of funds.
Beyond the Moat
Behind the excitement, Cerebras’ story isn’t without cracks.
Let’s start with the most obvious: customer concentration risk.
According to the S-1 prospectus, the Mohamed bin Zayed University of Artificial Intelligence (MBZUAI) in the UAE contributed 62% of Cerebras' revenue and 77.9% of its accounts receivable. Although heavyweight clients like OpenAI and AWS appear on the customer list, the structural issue of one big customer supporting most of the business remains unchanged.
Then there's valuation. A P/S ratio of 187, against NVIDIA's 25, isn't just a bit expensive: it's roughly 7.5 times higher. A Motley Fool analyst bluntly said: "I would advise caution to investors jumping in immediately."
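The valuation gap above is simple arithmetic on the two multiples the article cites (illustrative only, not financial data):

```python
# Relative valuation implied by the price-to-sales multiples in the article.
cerebras_ps = 187.0  # Cerebras P/S on IPO day
nvidia_ps = 25.0     # NVIDIA's current P/S

premium = cerebras_ps / nvidia_ps
print(f"Cerebras trades at {premium:.1f}x NVIDIA's P/S multiple")  # → 7.5x
```

In other words, on a sales-multiple basis the market is paying about seven and a half times more per dollar of revenue for Cerebras than for NVIDIA.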
But another perspective is worth noting. Some analysts believe Cerebras already has a large enough order backlog that, if executed well, revenue could grow tenfold over the next few years. A small company whose backlog is large relative to current revenue can, despite the risks, warrant a premium valuation; the key is execution.
The real question for Cerebras isn't today's stock price, but whether it can get OpenAI, AWS, and the next wave of large-model companies to deploy WSE at production scale, not merely as a "test backup."
NVIDIA spent 15 years building its software ecosystem, the CUDA moat, and a faster chip alone cannot bridge it.
The "Bellwether" Effect Takes Hold
Matt Kennedy, head of IPO research at Renaissance Capital, said Cerebras’ first-day performance “will reinforce the idea that there is strong demand for high-potential AI companies.”
This implies more than just Cerebras’ IPO.
In the second half of 2026, a series of even more heavyweight AI companies are expected to go public, among them SpaceX, OpenAI, and Anthropic, each with a story larger in scale than Cerebras'. To some extent, Cerebras' IPO is paving the way for them.
The 68% first-day gain gives a clear answer:
The market’s appetite has not yet been satisfied.
And for NVIDIA, Cerebras is not yet a real threat. On May 20, NVIDIA will report Q1 FY2027 earnings, with the market expecting about $78 billion in revenue, up roughly 75% year over year; it remains the industry's undisputed leader. But it can likely already sense the growing mood of "everyone is looking for a second card."
Historically, no technology platform can maintain a monopoly forever. NVIDIA is not the first, and it won’t be the last, to face challenges.
Cerebras’ story today may not necessarily be NVIDIA’s nightmare, but it marks the true beginning of diversification in the entire AI chip ecosystem.
Source: GeekPark