AI Earnings Night Showdown: $650 Billion Invested in AGI

Article | Sleepy.md

On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their Q1 results on the same day. Add up the capital expenditure guidance these four companies provided, and the total comes to nearly $650 billion, a figure comparable to an entire year of Sweden's GDP.

In other words, the four wealthiest tech companies in the world are preparing to buy their way into the AGI era with a year’s worth of economic output from a medium-developed country.

Now everyone's eyes are fixed on that ticket to AGI. On this night, dubbed the "global AI asset showdown," if we shift our gaze away from the sweeping narratives and look into the inconspicuous corners, we find a covert battle over physical constraints, capital anxiety, and industrial reshuffling, one that has already entered its decisive stage.

How can a company that hasn’t turned a profit crash the US stock market?

What truly controls market sentiment is not necessarily the companies with the most profitable books, but the enterprise that everyone treats as a “faith totem.”

April 29 was originally the heaviest day of the US earnings season. But before listed companies even turned in their reports, the market went through an abrupt stampede. According to Goldman Sachs data, it was the second-worst trading day for AI assets since the start of the year.

The trigger was not a blow-up in any listed company’s performance. Instead, it was a report from the day before by The Wall Street Journal. The report said that OpenAI failed to meet its 2025 revenue target, and the goal of reaching 1 billion weekly active users still looks out of reach. What further pricked the market was that the report mentioned OpenAI CFO Sarah Friar had warned internally that if revenue growth continued to fall short of expectations, the company might struggle to support its compute power procurement commitment worth as much as $600 billion.

A company that is not publicly listed and is under no obligation to publish earnings reports, on the strength of a single press report, sent Oracle's stock down 4%, CoreWeave down 5.8%, and even SoftBank, across the Pacific, down 12% in over-the-counter trading.

When a $600 billion compute commitment collided with revenue growth that failed to keep pace, the market suddenly realized that the most dangerous part of the AI narrative is not that people don't believe in the future, but that the future is simply too expensive.

Over the past two years, OpenAI has been Silicon Valley’s religion.

GPU procurement, data center construction, cloud provider expansion, startup valuations—many seemingly scattered decisions at the surface level all rest on the same underlying judgment: model capabilities will continue to leap forward, the user base will continue to expand, and AGI will eventually turn all of today’s expensive investments into tickets for the future.

The strongest part of this logic is that it reinforces itself: the more people believe, the higher the valuation; the higher the valuation, the fewer people dare to doubt.

But around April 29, the market for the first time began seriously questioning the cash-flow side of this belief, forcing even OpenAI to face customer acquisition costs, user retention, the pace of revenue growth, and compute bills.

Printing Money and Cooling Water

The most fascinating thing about the internet era is that growth looks almost infinite.

Write a piece of code, copy it to 10 million users, and the marginal cost is spread nearly to zero. Over the past 20 years, Silicon Valley has dared to "burn money for growth" to upend traditional industries because of this belief: as long as network effects are strong enough, scale will swallow costs.

But in the AI era, the digital-world money-printing machine is being tightly choked by the cooling-water pipes of the physical world.

At the April 29 earnings call, facing astonishing cloud growth of 63% (with quarterly cloud revenue exceeding $20 billion for the first time), Google CEO Sundar Pichai's tone still carried a note of helplessness: "If we could meet demand, cloud revenue could have been even higher."

Hidden behind this sentence is the most peculiar business predicament of the AI era: demand far exceeds supply, but growth is ruthlessly constrained by the physical world.

Google's cloud backlog has reached $462 billion, nearly double the prior quarter. AI solution products grew nearly 800% year over year. Paid users of Gemini Enterprise increased 40% quarter over quarter. API token usage surged from 10 billion to 16 billion per minute.

Put these figures into any internet company and they would be growth worth celebrating. But in Pichai's words we can hear a new kind of dilemma emerging in the AI era: customers are already lining up, the money is already on the way, but the servers haven't been built, the power hasn't been connected, and the advanced chips haven't come out of the wafer fab.

It’s not that there is no demand. It’s that there is too much demand—enough to pull growth back into the physical world.

Microsoft faces the same predicament. Azure growth hit 40%, and AI annualized revenue surpassed $37 billion. In January 2025 that figure was only $13 billion; within 15 months it has nearly tripled.

However, Microsoft's capital expenditure fell quarter over quarter to $31.9 billion, down from $37.5 billion in the previous quarter, a decrease of almost $6 billion. In its earnings report, Microsoft explained this as "the timing of infrastructure build-out." The underlying meaning is that money can be approved and sent out today, but data centers won't grow overnight; GPUs can be placed on order, but electricity, land, cooling systems, and construction timelines cannot be accelerated by capital markets.

When everyone believes we are sprinting toward the virtual world, what finally determines victory and defeat is still the oldest of things: heavy assets and physical laws.

Compute power is turning into a new kind of "land resource": limited in the short term, slow to build, location-dependent, and first-come, first-served. In this land grab, the reason the four giants dare to push capital expenditures to the $650 billion level is not that they have precisely calculated the returns, but that they fear that without hoarding this "land," they might not even have a seat at the table tomorrow.

Burning Money Strategies

After the market closed on April 29, both companies beat expectations and raised their capital expenditure guidance, yet Google's stock rose 7% while Meta's plunged 7%.

To be fair, Meta turned in a fairly impressive performance. Revenue was $56.31 billion, up 33% year over year, the fastest growth rate since 2021. EPS reached $10.44, far above Wall Street expectations.

But Zuckerberg broke a taboo: Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the results, the more nervous the market became, because investors' real worry is not whether Meta is making money now; it is whether Meta plans to use the cash from today's advertising business to fund an AI bet with no clear path to recovery.

The market's punishment was ruthless. The difference behind it lies in the granularity of monetization.

AI spending by Google, Amazon, and Microsoft can at least be placed into a relatively clear set of books.

Google has a $462 billion backlog of cloud orders; Amazon has AWS’s AI annualized revenue; Microsoft has paid Copilot users and a high RPO. For every dollar they burn, even if it may not turn into returns immediately, Wall Street at least knows roughly where the money will come back from: enterprise customers, cloud contracts, software subscriptions, and compute leasing.

That is why the capital markets are willing to keep listening to them tell stories. The story can be told far into the future, but the cash-collection path cannot be completely invisible.

Meta’s trouble is that it has no cloud business it can sell externally.

The hundreds of billions it poured in ultimately have to be cashed out through another, more circuitous route: Meta AI assistants need to increase user stickiness, recommendation algorithms need to improve ad conversion, AI-generated content needs to extend user time on platform, and smart glasses and future hardware need to become new entry points.

This logic is not unworkable; the chain is just too long. Cloud providers burn money by putting GPUs behind contracts that are already signed; Meta burns money by putting GPUs behind an advertising-efficiency model that has not yet been fully proven. The former can be discounted; the latter can only be believed first. Even if the logic is sound, the monetization chain is too long, and Wall Street does not have that much patience.

In capital markets, patience is a luxury. Especially when capital expenditure is pushed to the scale of hundreds of billions, investors are willing to pay for the future—but not indefinitely for something that is still vague.

What is even more worrying is the time lag.

On the call, Amazon CEO Andy Jassy acknowledged that the vast majority of funds invested in 2026 will not produce returns until 2027 or even 2028.

This means the giants are pushing today's cash flow into production capacity that materializes two years later. In between lie the uncertainties of data center construction, chip supply, power connection, customer demand, and model iteration. A deviation in any link will be re-priced by the capital markets.

The most dangerous part of the AI arms race is right here: money is spent today, stories are told today, but the answers are not revealed until two years later.

A shifting, blurred boundary of industries

AI has not, as many people expected two years ago, quickly pushed search off the table.

When ChatGPT first appeared, the market believed search ads would be directly swallowed by direct answers, and companies like Perplexity carried heavy expectations as a result. But Google's April 29 earnings report showed search query volume at an all-time high and ad revenue of $77.25 billion, up 15% year over year.

This looks more like the Jevons Paradox of the AI era. In 1865, the British economist William Stanley Jevons observed that improvements in steam engine efficiency did not reduce coal consumption; coal consumption instead rose significantly, because greater efficiency made steam engines affordable to more people and drove overall demand up. Similarly, AI lowers the cost of getting answers, and users simply ask more questions.
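A toy way to see this mechanism (my own simplification, not a model from the article): under a constant-elasticity demand curve, an efficiency gain raises total resource consumption exactly when demand is price-elastic (elasticity greater than 1), which is the Jevons condition.

```python
def resource_use(efficiency: float, elasticity: float, base_cost: float = 1.0) -> float:
    """Total resource consumption under constant-elasticity demand.

    The effective price of the service falls as efficiency rises
    (price = base_cost / efficiency), demand responds with the given
    price elasticity, and each unit of service consumes
    1/efficiency units of the underlying resource.
    """
    price = base_cost / efficiency
    demand = price ** (-elasticity)   # constant-elasticity demand curve
    return demand / efficiency        # demand * resource per unit of service

# Elastic demand (elasticity > 1): doubling efficiency INCREASES total use.
print(resource_use(2.0, 1.5) > resource_use(1.0, 1.5))   # True (Jevons)

# Inelastic demand (elasticity < 1): doubling efficiency reduces total use.
print(resource_use(2.0, 0.5) < resource_use(1.0, 0.5))   # True
```

Algebraically, total use works out to efficiency raised to (elasticity − 1), so consumption grows with efficiency precisely when elasticity exceeds 1. The hypothesis behind the earnings numbers is that demand for answers is highly elastic.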

This is also why Google finds it easier to persuade the market than Meta does. It has cash flow from its old entry points and a new ledger in its cloud business. It can make money from ads and from enterprise compute demand alike. AI has not dismantled its walls; so far, at least, it has helped thicken them.

A similar boundary reshaping is happening in the chip industry. On the same day, Qualcomm, the king of mobile chips, reported revenue of $10.6 billion. During the call, CEO Cristiano Amon announced a major decision: Qualcomm will officially enter the data center market, collaborating with a leading hyperscale cloud provider on custom chips expected to start shipping later this year.

Qualcomm’s main battlefield has always been mobile devices. But when AI computational workloads begin to be redistributed between the cloud and the edge, it must redefine its position as well.

If the future of AI is dominated by cloud-based large models, the value of mobile chips will be compressed; if edge AI becomes standard, Qualcomm has to prove it belongs not only in phones but also in inference, terminals, and low-power data centers.

Its move into data centers is more like defense than attack.

When AI shifts from a "luxury in the cloud" to a "standard feature on the edge," industry boundaries begin to blur. Mobile chip companies try to enter data centers, cloud providers develop their own chips, and chip companies explore their own models. Qualcomm's "defection" is only the tip of the iceberg of this massive restructuring.

The same gold rush, two valuation languages

In the same AI gold rush, the US stock market has already entered a stringent "monetization verification period." Even a leading maker of semiconductor process control and inspection equipment will have its valuation re-priced at the slightest hint of geopolitical or tariff risk. After the close on April 29, KLA Corporation (KLA) delivered revenue of $3.415 billion, exceeding expectations, and non-GAAP EPS of $9.40, above the expected $9.16.

However, the stock price briefly fell as much as 8% after the close.

The cause was not poor performance but worries about tariffs and China exposure. KLA's customer list includes a large number of Chinese wafer fabs, and against the backdrop of US-China technology decoupling, this "China exposure" hangs overhead like a sword of Damocles. Even outstanding performance cannot offset the market's instinctive fear of geopolitical risk.

Meanwhile, the A-share market is speaking another language.

Of course, performance matters here too, but many times performance is just fuel; what truly ignites is the narrative—whether you hold the ticket called “domestic substitution.”

That evening of April 29, Cambricon released a remarkable quarterly report: revenue of 2.885 billion yuan, up 159.56% year over year, surpassing the 2 billion yuan mark in a single quarter for the first time; net profit of 1.013 billion yuan, up 185.04%. The next day, Cambricon's stock surged, its market cap exceeding 670 billion yuan, a record high. Since the start of the year, its gains have already surpassed 62%.

On the same day, Muxi also released results: revenue of 562 million yuan, up 75% year over year, with losses narrowing sharply from 233 million yuan a year earlier to 98.84 million yuan. This was the company's first quarterly report; the GPU maker was listed only in December 2025.

Though all sit in the AI infrastructure chain, the US and A-share markets gave them completely different pricing reactions.

KLA faces a complex global supply chain ledger. Performance, orders, tariffs, China exposure, and export controls—each item could enter the valuation model.

Cambricon and Muxi face a different narrative environment: the stronger the external constraints, the more the strategic value of domestic compute power is amplified. The US is discounting risk; A-shares are pricing in a scarcity premium.

Smart Money Exits

But just as the market was celebrating Cambricon, one detail stood out as glaring.

At the end of 2025, the prominent retail investor Zhang Jianping still held 6.8149 million shares of Cambricon, worth about 9.2 billion yuan, making him the company's second-largest natural-person shareholder. By the time of this quarterly report, he had quietly dropped off the list of top ten shareholders.

A rough estimate from the quarter's share price range puts this reduction at a scale of several billion yuan at least. The exact price is unknown to outsiders, but what can be confirmed is that before the performance blowout and the new share-price highs, the earliest beneficiary of this narrative dividend chose to cash out.

There are always two kinds of people in the market: one type buys into the narrative, and another type prices the narrative.

Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a consensus trade, then left after it was written into the grand story of "the leader in domestic compute power."

On this $650 billion earnings night, Silicon Valley’s giants are anxious about compute shortages, Wall Street analysts are suffering from the time lag of monetization, and A-shares are busy re-pricing domestic compute power.

In the same AI gold rush, each market uses its own language. The US talks about return cycles; A-shares talk about domestic substitution. Cloud providers talk about order backlogs; Meta talks about ad efficiency. OpenAI hasn’t published earnings reports, yet it still tugs at the nerve of the entire compute chain.

Everyone is convinced they have bought a ticket into the AGI era. But no one knows when this show will end, or where the exit will be. The ticket into the AI era is expensive—but even more expensive than the ticket itself is knowing when you should leave.
