AI Earnings Report Showdown Night: $650 Billion Invested in AGI
Author: Sleepy.md
On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their Q1 earnings reports on the same day. Their combined capital expenditure guidance approaches $650 billion, a scale comparable to the entire annual GDP of Sweden.
In other words, the four wealthiest tech giants in the world are preparing to spend the equivalent of a mid-sized developed economy's annual output to buy a ticket to the AGI era.
Everyone's eyes are now fixed on that ticket to AGI. On this night, often called the "Night of the Global AI Asset Showdown," if we shift our focus away from the grand narratives and into the unnoticed corners, we will find an underground battle over physical shackles, capital anxiety, and industry restructuring that has already reached a point of no return.
How did a company that doesn't report earnings crash the US stock market?
The true controllers of market sentiment are not necessarily the companies with the most profitable books, but the ones everyone treats as totems of faith.
April 29 was meant to be the most important day of the US earnings season. But before the listed companies had even finished reporting, the market suffered an unanticipated stampede. According to Goldman Sachs, it was the second-worst trading day for AI assets this year.
The trigger was not any company's earnings miss, but a report in The Wall Street Journal the day before stating that OpenAI had failed to meet its 2025 revenue targets and that its goal of 1 billion weekly active users remained far off. Even more nerve-wracking for the market, the report mentioned that OpenAI CFO Sarah Friar had warned internally that if revenue growth continued to underperform, the company might struggle to support its future $600 billion compute procurement commitments.
A company that is not publicly listed and releases no financial reports, on the strength of a single rumor, knocked Oracle down 4%, sent CoreWeave down 5.8%, and even sent SoftBank, across the Pacific, plunging 12% in over-the-counter trading.
When the $600 billion compute power commitment collided with unfulfilled revenue growth, the market suddenly realized that the most dangerous aspect of the AI narrative is not that no one believes in the future, but that the future is just too expensive.
Over the past two years, OpenAI has been regarded as a religion in Silicon Valley.
Graphics card procurement, data center construction, cloud provider expansion, startup valuations—many seemingly scattered decisions are fundamentally betting on the same judgment: that model capabilities will continue to leap, user scale will keep expanding, and AGI will eventually turn all today’s costly investments into future tickets.
What makes this logic most powerful is its self-reinforcing nature. The more people believe, the higher the valuation; the higher the valuation, the more others dare to believe.
But around April 29, for the first time, the market seriously questioned the cash flow of this belief. Even OpenAI had to face issues like customer acquisition costs, user retention, revenue growth, and compute bills.
Printing money and cooling water
The most fascinating aspect of the internet era is that growth seems almost limitless.
Write a piece of code, copy it to ten million users, and the marginal cost is spread extremely thin. Over the past twenty years, Silicon Valley has dared to overturn traditional industries with “burning money for growth” because of this belief: as long as network effects are strong enough, scale will swallow costs.
But in the AI era, the digital printing press is being tightly choked by the cooling water pipes of the physical world.
At the earnings call on April 29, facing an astonishing 63% growth in cloud business (with quarterly revenue surpassing $20 billion for the first time), Google CEO Sundar Pichai expressed helplessness: “If we could meet demand, cloud revenue could be even higher.”
Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is ruthlessly limited by the physical world.
Google holds a backlog of cloud orders worth up to $462 billion, nearly doubling quarter-over-quarter. AI solution products grew nearly 800% year-over-year, Gemini Enterprise paid users increased by 40% quarter-over-quarter, and API token usage soared from 10 billion per minute to 16 billion.
These numbers would be celebrated as growth in any internet company. But in Pichai’s words, we hear a new kind of dilemma emerging in the AI age: customers are already lining up, money is on the way, but servers are not yet built, power is not yet connected, and advanced chips are not yet manufactured in wafer fabs.
It’s not a lack of demand, but demand so overwhelming that it pulls growth back into the physical realm.
Microsoft faces the same dilemma. Azure’s growth hit 40%, and AI annualized revenue surpassed $37 billion—this number was only $13 billion in January 2025, nearly tripling in 15 months.
However, Microsoft’s capital expenditure dropped quarter-over-quarter to $31.9 billion, down nearly $6 billion from the previous quarter’s $37.5 billion. The company explained this as “infrastructure build timing.” The implication is that money can be allocated today, but data centers won’t be built overnight; GPUs can be ordered, but power, land, cooling systems, and construction cycles cannot be hastened by the capital markets.
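The "nearly tripling in 15 months" claim is easy to sanity-check with back-of-the-envelope arithmetic. A quick sketch (the $13B and $37B run rates are from the reports; the compound-growth convention is our own assumption):

```python
# Back-of-the-envelope check on Microsoft's AI run rate growth:
# ~$13B annualized (Jan 2025) -> ~$37B (Apr 2026), i.e. 15 months apart.
start, end, months = 13e9, 37e9, 15

multiple = end / start                         # total growth multiple
monthly = multiple ** (1 / months) - 1         # implied compound monthly rate
annualized = (1 + monthly) ** 12 - 1           # implied compound annual rate

print(f"{multiple:.2f}x over {months} months")          # → 2.85x over 15 months
print(f"~{monthly:.1%}/month, ~{annualized:.0%}/year")  # → ~7.2%/month, ~131%/year
```

A sustained triple-digit annual rate is exactly the kind of trajectory the capex guidance is betting will continue.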
While everyone thought we were rushing toward a virtual world, the ultimate determinants of victory or defeat are still the oldest assets—heavy assets—and physical laws.
Compute power is becoming a new kind of “land resource,” limited in the short term, slow to build, location-dependent, and first-come, first-served. In this land grab, the four giants are willing to push capital expenditure to the $650 billion level not because they have all calculated the returns, but because they fear that if they don’t hoard this “land,” they might not even get a seat at the table tomorrow.
Burning money tactics
After the close on April 29, both Google and Meta had beaten expectations and raised capital expenditure guidance, yet Google's stock rose 7% while Meta's plummeted 7%.
To be fair, Meta delivered a very impressive report: revenue of $56.31 billion, up 33% year-over-year, its fastest growth since 2021; EPS reached $10.44, far exceeding Wall Street expectations.
But Zuckerberg broke a market taboo: Meta raised its 2026 capital expenditure guidance to between $125 billion and $145 billion. The better the performance, the more nervous the market became, because investors are not really worried about whether Meta is profitable now, but about whether it will use the cash from its current advertising business to bankroll a high-stakes AI gamble with no clear return path.
The market's punishment was merciless, and the dividing line was the granularity of monetization.
Google, Amazon, and Microsoft’s AI spending can at least be incorporated into relatively clear accounting books.
Google has a backlog of $462 billion in cloud orders, Amazon has AI annualized revenue from AWS, and Microsoft has Copilot paid users and high RPO. Every dollar they burn may not pay off immediately, but Wall Street at least knows roughly where the money will come back from: enterprise clients, cloud contracts, software subscriptions, compute leasing.
This is why the capital markets are willing to keep listening to their stories. The story can go far, but the cash flow path cannot be entirely invisible.
Meta’s trouble is that it has no external cloud business to sell.
The hundreds of billions it invested will ultimately be realized through a more convoluted path: Meta AI assistants to increase user stickiness, improved recommendation algorithms to boost ad conversions, AI-generated content to extend user engagement, and future hardware like smart glasses to become new entry points.
This logic isn't invalid, but the chain is too long. Cloud providers burn money into GPUs backed by signed orders; Meta burns money into an unproven model of advertising efficiency. The former can be discounted; the latter must first be believed.
And in the capital markets, patience is a luxury. Especially when capital expenditure reaches the hundreds of billions of dollars, investors are willing to pay for the future but won't pay indefinitely for ambiguity.
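The gap between spending that "can be discounted" and spending that "can only be believed" can be made concrete with a toy present-value sketch. Every number below is an illustrative assumption, not a figure from any earnings report:

```python
# Toy present-value comparison: a signed cloud contract vs. a long,
# uncertain monetization chain. All numbers are illustrative assumptions.

def pv(cashflows, rate):
    """Discount a list of (year, amount) cash flows to present value."""
    return sum(amt / (1 + rate) ** yr for yr, amt in cashflows)

# Case 1: $10B of contracted cloud revenue, paid over 3 years,
# discounted at a modest rate because the orders are already signed.
cloud = pv([(1, 3e9), (2, 3.5e9), (3, 3.5e9)], rate=0.08)

# Case 2: the same $10B hoped for from improved ad efficiency, arriving
# two years later and weighted by a 50% chance the chain works out.
ads = 0.5 * pv([(3, 3e9), (4, 3.5e9), (5, 3.5e9)], rate=0.15)

print(f"contracted: ${cloud / 1e9:.1f}B, speculative: ${ads / 1e9:.1f}B")
# → contracted: $8.6B, speculative: $2.9B
```

The headline dollars are identical, but once timing, a higher discount rate for uncertainty, and a probability of realization enter the math, the long-chain version is worth a fraction of the contracted one. That fraction is the patience premium Wall Street is refusing to pay.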
Even more worrying is the time lag.
Amazon CEO Andy Jassy admitted on the call that most of the funds invested in 2026 won’t generate returns until 2027 or even 2028.
This means giants are pushing today’s cash flow into capacity realization two years from now, with gaps in data center construction, chip supply, power connection, customer demand, and model iteration. Any deviation in any link will be re-priced by the capital market.
The most dangerous aspect of the AI arms race is here: money is spent today, stories are told today, but the answers won’t be revealed until two years later.
Blurring industry boundaries
AI has not, as many expected two years ago, rapidly pushed search off the table.
When ChatGPT first appeared, the market believed that search ads would be directly swallowed by answers, and companies like Perplexity were thus highly anticipated. But in Google’s Q1 report, search query volume hit a record high, and ad revenue reached $77.25 billion, up 15% year-over-year.
This resembles a "Jevons Paradox" of the AI era. In 1865, the British economist William Stanley Jevons observed that improvements in steam engine efficiency did not reduce coal consumption; instead, coal use rose significantly, because efficiency made steam engines affordable to more people and ignited overall demand. Similarly, AI has made asking questions easier and answers richer, prompting users to search more, not less.
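The Jevons dynamic can be sketched numerically: if efficiency gains cut the cost per query and demand is sufficiently elastic, total resource use rises rather than falls. The elasticity and baseline figures below are illustrative assumptions:

```python
# Illustrative Jevons-paradox sketch: cheaper queries -> more queries ->
# more total compute consumed. Elasticity and baselines are assumed.

def total_compute(cost_per_query, elasticity, base_cost=1.0, base_queries=100.0):
    """Demand grows as unit cost falls, with constant price elasticity."""
    queries = base_queries * (base_cost / cost_per_query) ** elasticity
    return queries * cost_per_query  # total spend / compute consumed

before = total_compute(cost_per_query=1.0, elasticity=1.5)
after = total_compute(cost_per_query=0.5, elasticity=1.5)  # 2x efficiency

print(f"before: {before:.0f}, after: {after:.0f}")  # → before: 100, after: 141
```

With elasticity above 1, halving the unit cost more than doubles query volume, so total compute consumed rises; with elasticity below 1, it would fall.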
This is also why Google finds it easier than Meta to convince the market. It has both the cash flow from its old entry points and a new ledger from its cloud business; it can profit from advertising and from enterprise compute demand. AI has not dismantled its city walls; so far, it has added a layer of reinforcement.
Similar boundary reconfigurations are happening in the chip industry. On the same day, Qualcomm, the king of mobile chips, reported revenue of $10.6 billion. During the call, CEO Cristiano Amon announced a major decision: Qualcomm is officially entering the data center market, collaborating with a top large-scale cloud provider on custom chips, expected to start shipping later this year.
Qualcomm’s traditional battlefield is mobile devices. But as AI’s computational load begins redistributing between the cloud and the edge, it must redefine its position.
If future AI is entirely dominated by cloud large models, the value of mobile chips will be compressed; if edge AI becomes standard, Qualcomm must prove it’s not just for phones but also capable of inference, terminals, and low-power data centers.
Its move into data centers is more defensive than offensive.
As AI shifts from a “luxury in the cloud” to a “standard on the edge,” all industry boundaries start to blur. Mobile chip companies try to enter data centers, cloud providers begin developing their own chips, and chip companies explore models. Qualcomm’s “defection” is just the tip of this major restructuring.
Same gold rush, two valuation languages
The same AI gold rush has entered a harsh prove-it-or-be-repriced phase in the US stock market: even leading semiconductor process-control and inspection equipment makers are revalued the moment they reveal any geopolitical or tariff risk. After hours on April 29, KLA Corporation reported revenue of $8B, exceeding expectations, with non-GAAP EPS of $9.40 against an expected $9.16.
However, its stock price fell sharply by 8% after hours.
The reason was not poor performance but concerns over tariffs and China exposure. KLA’s customer list includes many Chinese wafer fabs. In the context of US-China tech decoupling, this “China exposure” is like the sword of Damocles hanging overhead. No matter how bright the earnings, it cannot offset the market’s instinctive fear of geopolitical risks.
In the A-share market, a different language is used.
Here, performance is also observed, but often, performance is just fuel; the real ignition is the narrative—whether you hold the ticket called “domestic substitution.”
On the evening of April 29, Cambrian Technology released a remarkable quarterly report: revenue up 159.56% year-over-year, breaking the 2 billion yuan mark in a single quarter for the first time, with net profit up 185.04%. The next day Cambrian's stock surged, its market cap surpassing 670 billion yuan to hit a record high, up more than 62% since the start of the year.
On the same day, Mu Xi Co., Ltd. reported revenue of 562 million yuan, up 75% year-over-year, with losses narrowing sharply from 233 million yuan a year earlier to 98.84 million yuan. It was the first quarterly report from the GPU maker, which went public in December 2025.
Both companies are part of the AI infrastructure chain, but the US and Chinese markets give completely different valuation responses.
KLA faces a complex global supply chain ledger—performance, orders, tariffs, China exposure, export controls—each potentially impacting valuation models.
Cambrian and Mu Xi face a different narrative environment: as external restrictions tighten, the strategic value of domestic computing power is all the more easily amplified. In the US, risk is discounted; in China, scarcity is priced.
Smart money’s exit
But just as the market was cheering for Cambrian, one glaring detail surfaced.
At the end of 2025, super-investor Zhang Jianping still held 6.8149 million shares of Cambrian, worth about 9.2 billion yuan, making him the second-largest individual shareholder. By the time of this quarterly report, he had quietly dropped out of the top ten shareholders.
Estimating roughly from the quarter's stock price range, the divestment involved at least tens of millions of yuan. The exact selling price is unknown, but one thing is clear: before the earnings exploded and the stock hit new highs, one of the earliest beneficiaries of this narrative chose to cash out.
There are always two types of people in the market: those who buy into the narrative, and those who price it.
Zhang Jianping clearly belongs to the latter. He entered Cambrian before it became a household consensus, and after it was written into the grand story of “domestic computing power leader,” he turned and exited.
On this $650 billion earnings night, Silicon Valley giants are anxious about compute shortages, Wall Street analysts are agonizing over the timing of cash realization, and the A-share market is busy re-pricing domestic computing power.
In this same AI gold rush, each market speaks its own language. US stocks talk about return cycles, A-shares about domestic substitution; cloud providers discuss order backlog, Meta talks about ad efficiency; OpenAI doesn’t release earnings but still influences the entire compute chain.
Everyone is convinced they hold a ticket to the AGI era. But no one knows when this show will end, or where the exit is. The ticket to the AI era is expensive; knowing when to leave costs even more.