AI Earnings Report Showdown Night: $650 Billion Invested in AGI
On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their first-quarter earnings reports on the same day. The combined capital expenditure guidance from these four companies alone approaches $650 billion, a scale already comparable to the entire annual GDP of Sweden.
In other words, the four wealthiest tech giants in the world are preparing to spend the equivalent of a mid-sized developed country’s annual economy to buy a ticket into the AGI era.
Everyone’s eyes are now fixed on that ticket. On this night, often called the “decisive night” for global AI assets, if we shift our focus away from the grand narratives and look into the overlooked corners, we find an underground battle over physical constraints, capital anxiety, and industry restructuring that has already passed the point of no return.
How can a company that publishes no earnings reports crash the US stock market?
The ones truly able to move market sentiment are not necessarily those with the most profitable books, but the companies everyone regards as symbols of faith.
April 29 was originally the most important day of the US earnings season. But before the listed companies had even finished reporting, the market suffered an unanticipated stampede. According to Goldman Sachs data, it was the second-worst trading day for AI assets this year.
The trigger was not a major earnings miss from any listed company, but a report in The Wall Street Journal the day before stating that OpenAI had failed to meet its 2025 revenue targets and that its goal of 1 billion weekly active users remained far off. What further rattled the market was the mention that OpenAI CFO Sarah Friar had warned internally that if revenue growth continued to underperform, the company might struggle to support its massive $600 billion compute procurement commitments.
On the strength of a single report, a company that is not publicly listed and does not need to release financial reports sent Oracle’s stock down 4%, CoreWeave down 5.8%, and even SoftBank, across the Pacific, down 12% in over-the-counter trading.
When the commitment to $600 billion in computing power collided with unfulfilled revenue growth, the market suddenly realized that the most dangerous aspect of the AI narrative is not that no one believes in the future, but that the future is just too expensive.
Over the past two years, OpenAI has been like a religion in Silicon Valley.
Graphics card procurement, data center construction, cloud provider expansion, startup valuations—many seemingly scattered decisions are all betting on the same underlying judgment: model capabilities will continue to leap, user scale will keep expanding, and AGI will eventually turn all today’s costly investments into future tickets.
The strongest part of this logic is its self-reinforcing nature. The more people believe, the higher the valuation; the higher the valuation, the more people dare to believe.
But around April 29, for the first time, the market seriously questioned the cash flow of this belief system. Even OpenAI had to face issues like customer acquisition costs, user retention, revenue growth, and compute bills.
Printing presses and cooling water
The most fascinating aspect of the internet era is that growth appears almost limitless.
A piece of code, once written, can be copied to ten million users at almost no marginal cost. For the past twenty years, Silicon Valley has dared to overturn traditional industries by burning money for growth, relying on one belief: as long as network effects are strong enough, scale will swallow costs.
But in the AI era, the digital printing press is being tightly choked by the cooling water pipes of the physical world.
On the April 29 earnings call, despite cloud revenue growing at an astonishing 63% (surpassing $20 billion in a quarter for the first time), Google CEO Sundar Pichai sounded almost helpless: “If we could meet demand, cloud revenue could be even higher.”
Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is ruthlessly constrained by the physical world.
Google holds a backlog of cloud orders worth up to $462 billion, nearly doubling quarter-over-quarter. AI solution products grew nearly 800% year-over-year, Gemini Enterprise paid users increased by 40% quarter-over-quarter, and API token usage soared from 10 billion per minute to 16 billion.
These numbers would be celebrated growth for any internet company. But in Pichai’s words, we hear a new kind of dilemma emerging in the AI age: customers are already lining up, money is on the way, but servers are not yet built, power is not yet connected, and advanced chips are not yet produced in wafer factories.
It’s not that there is no demand; it’s that demand is so overwhelming that it pulls growth back into the physical realm.
Microsoft faces the same dilemma. Azure’s growth hit 40%, and AI annualized revenue surpassed $37 billion—this figure was only $13 billion in January 2025, nearly tripling in 15 months.
However, Microsoft’s capital expenditure dropped quarter-over-quarter to $31.9 billion, down nearly $6 billion from the previous quarter’s $37.5 billion. The company explained this as “infrastructure build-out timing.” The implication is that money can be approved today, but data centers won’t be built overnight; GPUs can be ordered, but power, land, cooling systems, and construction cycles cannot be hastened by capital markets.
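The “nearly tripling in 15 months” claim above can be sanity-checked with a back-of-the-envelope calculation, an illustrative sketch using only the figures quoted in the article ($13 billion in January 2025, $37 billion roughly 15 months later):

```python
# AI annualized revenue: ~$13B (Jan 2025) -> ~$37B about 15 months later
start, end, months = 13.0, 37.0, 15

# Growth multiple over the whole period
multiple = end / start

# Implied compound annual growth rate, annualizing the 15-month window
cagr = multiple ** (12 / months) - 1

print(f"{multiple:.2f}x")   # prints 2.85x, i.e. "nearly tripling"
print(f"{cagr:.0%}")        # implied annualized growth, roughly 131%
```

The multiple comes out just under 3x, consistent with the article’s characterization, and the annualized rate shows how unusually steep this growth curve is for a business of that size.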
While everyone thought we were rushing toward a virtual world, the ultimate determinants of victory or defeat remain the oldest assets—heavy physical assets and physical laws.
Compute power is becoming a new form of “land resource”: limited in the short term, slow to build, location matters, and early movers lock in supply. In this land grab, the reason the four giants are willing to push capital expenditure to the level of $650 billion is not because they have all calculated the returns precisely, but because they fear that if they don’t hoard this “land,” they might not even get a seat at the table tomorrow.
The art of burning money
After market hours on April 29, both companies beat expectations and raised capital expenditure guidance, yet Google’s stock rose 7% while Meta’s plummeted 7%.
To be fair, Meta delivered a genuinely impressive report: revenue of $56.31 billion, up 33% year-over-year, its fastest growth since 2021; EPS reached $10.44, far exceeding Wall Street expectations.
But Zuckerberg broke a market taboo: Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the performance, the more nervous the market became, because investors’ real concern is not whether Meta is profitable now, but whether it will use the cash generated by its current advertising business to fund a high-stakes AI gamble with no clear path to recoupment.
The market’s punishment was ruthless. And the difference between Meta and its peers lies in the granularity of business monetization.
Google, Amazon, and Microsoft’s AI spending can at least be incorporated into relatively clear accounting books.
Google has a backlog of $462 billion in cloud orders, Amazon has AI annualized revenue from AWS, and Microsoft has Copilot paid users and high RPO. Every dollar spent may not pay off immediately, but Wall Street at least knows roughly where the money will come back from: enterprise clients, cloud contracts, software subscriptions, compute leasing.
This is why capital markets are willing to keep listening to their stories. The story can go far, but the cash flow path cannot be entirely invisible.
Meta’s problem is that it does not have a cloud business to sell externally.
The hundreds of billions it invests will ultimately be realized through a different, more convoluted path: Meta AI assistants to increase user stickiness, improved recommendation algorithms to boost ad conversions, AI-generated content to extend user engagement, and future hardware like smart glasses to become new entry points.
This logic is not invalid; it’s just that the chain is too long. Cloud providers burn money by placing GPUs into already signed orders; Meta burns money by placing GPUs into an unproven advertising efficiency model. The former can be discounted, the latter must be believed first. Although theoretically sound, the monetization chain is too long, and Wall Street lacks the patience.
And in capital markets, patience is a luxury. When capital expenditure reaches the hundreds of billions of dollars, investors are willing to pay for the future, but not indefinitely for ambiguity.
Even more worrying is the time lag.
Amazon CEO Andy Jassy admitted in the earnings call that most of the funds invested in 2026 will only generate returns in 2027 or even 2028.
This means giants are pushing today’s cash flow into capacity realization two years down the line. With gaps in data center construction, chip supply, power access, customer demand, and model iteration, any deviation in any link can lead to revaluation by the capital market.
The most dangerous aspect of the AI arms race is here: money is spent today, stories are told today, but the answers only come two years later.
Blurring industry boundaries
AI has not, as many expected two years ago, rapidly pushed search off the table.
When ChatGPT first appeared, the market believed that search ads would be directly swallowed by answers, and companies like Perplexity were thus highly anticipated. But in Google’s April 29 earnings report, search query volume hit a record high, with ad revenue reaching $77.25 billion, up 15% year-over-year.
This is more like the “Jevons Paradox” of the AI era. In 1865, British economist William Stanley Jevons discovered that improvements in steam engine efficiency did not reduce coal consumption—instead, coal consumption increased significantly, because efficiency made steam engines affordable to more people, fueling overall demand. Similarly, AI does not shrink search; it lowers the cost of asking, so users ask more questions, and more complex ones.
This is also why Google finds it easier than Meta to convince the market. It has both the cash flow from its old entry point and a new ledger in its cloud business; it can profit from advertising and from enterprise compute demand. AI has not dismantled its city walls; so far, it has added a new layer to them.
Similar boundary reconfigurations are happening in the chip industry. On the same day, Qualcomm, the king of mobile chips, reported revenue of $10.6 billion. During the earnings call, CEO Cristiano Amon announced a major decision: Qualcomm is officially entering the data center market, collaborating with a top large-scale cloud provider on custom chips expected to start shipping later this year.
Qualcomm’s main battlefield has always been mobile devices. But as AI’s computational load begins redistributing between cloud and edge, it must redefine its position.
If future AI is fully dominated by cloud large models, the value of mobile chips will be compressed; if edge AI becomes standard, Qualcomm must prove it can also enter inference, terminals, and low-power data centers.
Qualcomm’s move into data centers is more defensive than offensive.
As AI shifts from a “luxury in the cloud” to a “standard on the edge,” industry boundaries everywhere start to blur. Mobile chip companies push into data centers; cloud providers develop their own chips; chip companies experiment with models of their own. Qualcomm’s defection is just the tip of this great restructuring.
The same gold rush, two valuation languages
In the same AI gold rush, the US stock market has entered a strict phase of stress-testing value realization. Even a leading maker of semiconductor process control and inspection equipment, once exposed to geopolitical and tariff risks, gets revalued. After hours on April 29, KLA Corporation reported revenue of $3.42 billion, exceeding expectations, with non-GAAP EPS of $9.40, above the expected $9.16.
However, its stock price fell sharply by 8% afterward.
The reason was not poor performance but market concerns over tariffs and China exposure.
KLA’s customer list includes many Chinese wafer fabs. Against the backdrop of US-China tech decoupling, this “China exposure” is like the sword of Damocles hanging overhead. No matter how stellar the results, it cannot offset the market’s instinctive fear of geopolitical risks.
In A-shares, a different language is used.
Here, performance still matters, but often, performance is just fuel; the real ignition is the narrative—whether you hold the ticket called “domestic substitution.”
On the evening of April 29, Cambrian released a remarkable quarterly report: revenue of 2.89 billion yuan, up 159.56% year-over-year, breaking the 2-billion-yuan mark in a single quarter for the first time; net profit of 1.01 billion yuan, up 185.04%. The next day, Cambrian’s stock surged, its market cap surpassing 670 billion yuan to hit a record high, up more than 62% since the start of the year.
On the same day, Muxi reported revenue of 562 million yuan, up 75%, with losses narrowing significantly from 233 million yuan a year earlier to 98.84 million yuan. The GPU company, listed only in December 2025, had delivered its first quarterly report as a public company.
Both are in the AI infrastructure chain, but US and Chinese markets give completely different valuation responses.
KLA faces a complex global supply chain ledger—performance, orders, tariffs, China exposure, export controls—each potentially impacting valuation models.
Cambrian and Muxi face a different narrative environment: external restrictions strengthen the strategic value of domestic computing power. US markets discount risk, while A-shares assign a scarcity premium.
Smart money exits
But just as the market cheers for Cambrian, one painful detail stands out.
At the end of 2025, well-known individual investor Zhang Jianping still held 6.8149 million shares of Cambrian, worth about 9.2 billion yuan, making him the second-largest individual shareholder. By the time of this quarterly report, he had quietly dropped out of the top ten shareholders.
Judging roughly from the quarter’s stock price range, the divestment ran to at least tens of millions of yuan. The exact prices are unknown, but one thing is clear: by the time the explosive results pushed the stock to new highs, the earliest backer of this narrative had already taken his profits.
There are always two types of people in the market: those who buy into the narrative, and those who price it.
Zhang Jianping clearly belongs to the latter. He entered Cambrian before it became a nationwide consensus, then turned and exited after it was written into the grand story of “domestic AI chip leader.”
On this earnings night of $650 billion, Silicon Valley giants are anxious over compute shortages, Wall Street analysts are agonizing over the timing of realization, and A-shares are busy re-pricing domestic compute power.
In this same AI gold rush, each market speaks its own language. US stocks focus on return cycles, A-shares on domestic substitution; cloud providers on order backlog, Meta on ad efficiency; OpenAI, despite not releasing earnings, still influences the entire compute chain.
Everyone is convinced they hold the ticket to the AGI era. But no one knows when this show will end, or where the exit is. The ticket to the AI era is expensive, but more costly than the ticket itself is knowing when to leave.