When Google "invests" 40 billion dollars in Anthropic, where does the money actually go?

Monday, May 11 · Bloomberg · ~10 min read


On April 24, 2026, Bloomberg broke a story that should have caused more confusion than it did. Alphabet, the parent company of Google, agreed to invest up to 40 billion dollars in Anthropic, the AI lab behind the Claude chatbot. The structure was straightforward enough at the top: 10 billion in cash upfront at a 350 billion dollar valuation, with another 30 billion to follow as Anthropic hits performance milestones. Alongside it, Google Cloud committed to giving Anthropic 5 gigawatts of computing capacity over five years. A gigawatt, for the uninitiated, is a measure of electrical power, and 5 gigawatts is roughly the consumption of New York City on a hot afternoon. Big number. Big deal.

Eleven days later, on May 5, The Information reported the other side of that compute commitment. Anthropic had agreed to spend roughly 200 billion dollars at Google Cloud over the same five years.

Read those two sentences again.

Google is putting in up to 40 billion. Anthropic is committing to spend roughly 200 billion at Google. So Google's "investment" comes back to Google as cloud revenue, five times over. And that's before we get to the third thing that happened, which is the part nobody quite explained in plain English.

For the uninitiated, here's the cast. Anthropic was founded in 2021 by ex-OpenAI staff led by Dario Amodei. Its product Claude competes with ChatGPT and is, by most accounts, the model enterprise software developers prefer. The company hit a 30 billion dollar revenue run rate by April 2026, up from 9 billion at the end of 2025. ("Run rate" is the current monthly revenue annualised. It's a forward-looking flex, not a backward-looking statement of fact.) Amodei told a conference on May 6, "We tried to plan very well for a world of 10x growth per year. And yet we saw 80x." Anthropic burns cash hard. Internal projections leaked to The Information put 2026 losses around 11 billion, and 2026 compute spend around 20 billion.
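The run-rate arithmetic in that parenthetical is simple enough to show directly. A rough sketch, using only the figures reported above (the monthly number is implied, not disclosed):

```python
# Run rate: annualise the most recent month's revenue.
def run_rate(monthly_revenue: float) -> float:
    return monthly_revenue * 12

# A $30B run rate implies roughly $2.5B in revenue in the latest month.
implied_monthly = 30e9 / 12
assert run_rate(implied_monthly) == 30e9

# The jump from a $9B to a $30B run rate in roughly four months:
print(round(30 / 9, 1))  # 3.3x, annualised forward, not audited backward
```

The point of the flex is the multiplier, not the audited total: a run rate assumes the latest month repeats twelve times.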

Google sells cloud computing services, which essentially means renting out massive data centres full of chips. Its custom chip is called the TPU, or Tensor Processing Unit, designed in partnership with Broadcom. Cloud is the part of Alphabet's business that's growing fastest. In the first quarter of 2026, Google Cloud booked 20 billion in revenue, up 63 percent year on year. The cloud backlog (contracts signed but not yet delivered) more than doubled in a single quarter to over 460 billion dollars. Per The Information, Anthropic's 200 billion commitment alone represents more than 40 percent of that backlog.

Now we can get to the third thing.

When Alphabet reported Q1 earnings in late April, the company posted record net income of 62.6 billion dollars. But buried in the filing was a line called "net gain on equity securities" worth 36.9 billion. Fortune's read, published April 30, was blunt: nearly half of Alphabet's record profit didn't come from selling search ads or cloud services or anything Alphabet does for a living. It came from marking up the value of private companies it owns stakes in. Chief among them: Anthropic. (Amazon, which has its own multi-billion Anthropic stake, reported 16.8 billion in similar gains the same quarter, with disclosures explicitly naming Anthropic as the driver.)

This is the part you need to hold in your head, because it's where the carousel image actually comes from. When Google wires 10 billion dollars to Anthropic, three things happen on Google's parent company's books. First, the cash leaves and an equity stake comes in. Second, over the years that follow, Anthropic spends a multiple of that 10 billion back at Google Cloud, which Google books as revenue as it gets consumed. Third, Anthropic's secondary market valuation keeps climbing, because investors see the demand and the partnerships and bid up the price they'd pay for shares. Bloomberg reported on April 14 that unsolicited offers were coming in at around 800 billion. By April 29, the rumoured number was over 900 billion. As Anthropic gets marked up, Alphabet's stake gets marked up, and that mark-up flows through Alphabet's profit line as a non-cash gain.

The same outbound dollar shows up as a future revenue commitment AND a present-day profit gain.

Think of it like this. Suppose you give your cousin 10,000 dollars to open a coffee shop. Your cousin signs a contract to buy 50,000 dollars worth of beans from your bean wholesaler over the next five years. You happen to own most of the bean wholesaler. And because a venture investor down the road heard about all this and offered to buy your cousin's coffee shop for ten times what you paid in, you tell your accountant to mark up the value of your stake by 90,000 dollars on your books. Then you go to a dinner party and announce you had a record profit year.
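The dinner-party ledger in that analogy can be written out in a few lines. All numbers here are the hypothetical ones from the coffee-shop story, not Alphabet's actual accounting:

```python
# Hypothetical coffee-shop ledger from the analogy above.
cash_invested = 10_000    # what you put into your cousin's shop
bean_contract = 50_000    # cousin's five-year commitment to your wholesaler
offer_multiple = 10       # the venture investor's offer vs. what you paid

stake_fair_value = cash_invested * offer_multiple  # marked to the offer price
paper_gain = stake_fair_value - cash_invested      # the mark-up on your books

print(paper_gain)                      # 90000: "record profit", no coffee sold yet
print(bean_contract / cash_invested)   # 5.0: future revenue per dollar invested
```

Note what the ledger never checks: whether the coffee shop ever sells a cup of coffee. The gain comes entirely from the offer price, and the revenue commitment comes entirely from money you put in.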

Now, none of this is illegal. The cash is real. The chips are real. The data centres are real. The accounting rules (GAAP, the Generally Accepted Accounting Principles that govern American corporate disclosures) explicitly allow companies to mark private equity holdings to fair value when there's a recent transaction price to anchor on. The 800 billion preemptive offer for Anthropic is a real transaction marker. Alphabet's auditors signed off.

But here's the catch. This pattern isn't limited to Google and Anthropic. The exact same shape repeats across a small cluster of companies that, between them, account for most of the trillion-dollar AI capex number you keep reading about.

Nvidia, the chipmaker whose GPUs (Graphics Processing Units, the silicon that trains and runs AI models) power roughly every frontier AI system, announced in September 2025 that it would invest up to 100 billion dollars in OpenAI as OpenAI deployed 10 gigawatts of Nvidia systems. By February 2026 the deal had stalled, and Jensen Huang, Nvidia's CEO, told reporters in Taipei, "it was never a commitment." The final shape, signed in late February, was about 30 billion in Nvidia equity going into OpenAI as part of a 122 billion dollar funding round at an 852 billion dollar valuation. OpenAI's then-CFO Sarah Friar told CNBC the previous September, "most of the money will go back to Nvidia," meaning OpenAI would spend the equity it received buying Nvidia chips.

Nvidia also owns roughly 11.5 percent of CoreWeave, a cloud company whose primary business is buying Nvidia GPUs and renting them out, mostly to Microsoft and OpenAI. In September 2025, Nvidia signed a contract obligating itself to buy any unsold CoreWeave capacity through April 2032, worth up to 6.3 billion dollars. Asked by Bloomberg on January 26 whether that arrangement was circular, Huang said, "the idea that it is circular is ... it's ridiculous." Reasonable people can disagree. CNBC reported on May 9 that Nvidia has committed over 40 billion dollars in AI equity bets in just the first 18 weeks of 2026.

Then there's Oracle, which signed a 300 billion dollar five-year cloud contract with OpenAI in 2025. Oracle's order backlog jumped to 523 billion dollars by the end of its fiscal Q2. Larry Ellison, Oracle's founder, saw his net worth jump by an estimated 101 billion dollars in a single day on the earnings announcement. Ed Zitron, the technology critic, pointed out that Ellison has pledged 346 million Oracle shares as collateral on roughly 21 billion in personal loans. The carousel turns, and individual fortunes ride on whether OpenAI keeps consuming what Oracle has promised to deliver.

The bull case for all of this is straightforward and worth taking seriously. The gigawatts are real. The IEA, the International Energy Agency, reported in 2026 that global data centre electricity demand grew 17 percent in 2025, and AI-focused data centre demand grew about 50 percent. AWS's "Project Rainier" campus in New Carlisle, Indiana, built to train Anthropic's Claude, has 525 megawatts already operational. Meta is putting roughly 6,000 construction workers on its Hyperion site in Louisiana. The Big Four hyperscalers (Microsoft, Google, Meta, and Amazon) are guiding to roughly 700 billion dollars in combined capital spending in 2026, about three times their 2024 total. That's larger, in a single year, than the entire 1996 to 2001 US telecom build-out. It's roughly 73 percent of the US defence budget. It's bigger than the GDP of Saudi Arabia.

This is real money buying real things. Andy Jassy, Amazon's CEO, told investors in April that AWS has "high confidence" the spend will be monetised, "as we already have customer commitments for a substantial portion of it." Sundar Pichai, Google's CEO, said on Alphabet's Q1 call that the company was "compute constrained in the near term." Bernstein analyst Stacy Rasgon, raising his Nvidia price target to 300 dollars, said: "I've never seen it so cheap, especially relative to the broader semiconductor industry." At 25 times forward earnings, Nvidia is in the 11th percentile of its own 10-year valuation history. Cheap by its own standards.

And yet.

Anthropic and OpenAI, the two companies whose hunger justifies most of this build-out, have a combined revenue run rate of roughly 70 billion dollars. Single-year hyperscaler capex is ten times that. Bain & Company published a report in September 2025 estimating the AI ecosystem needs to generate 2 trillion dollars in new annual revenue by 2030 to justify the planned 500 billion per year of compute spend, and pointed to an 800 billion dollar revenue shortfall even after counting expected AI productivity gains. An MIT study in 2025 found that 95 percent of corporate AI pilots fail to achieve meaningful revenue growth.
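The mismatch in that paragraph is a one-line calculation. A sketch using only the figures cited above (Bain's numbers are its projections, not observed revenue):

```python
lab_run_rate = 70e9        # Anthropic + OpenAI combined revenue run rate
hyperscaler_capex = 700e9  # single-year combined Big Four capital spend

print(hyperscaler_capex / lab_run_rate)  # 10.0: capex per dollar of lab run rate

needed_by_2030 = 2e12  # Bain: new annual AI revenue required by 2030
shortfall = 800e9      # Bain: projected gap, even counting productivity gains
print((needed_by_2030 - shortfall) / 1e12)  # 1.2: trillions Bain sees covered
```

In other words, even the bullish accounting leaves a gap nearly the size of today's entire hyperscaler capex budget.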

Lisa Shalett, Morgan Stanley's wealth management chief investment officer, put it on the record in October 2025: "the deals and the cross-dealing have gotten more and more and more complicated, where some of this starts to feel and look and smell like circular dealing, like vendor financing." Vendor financing is the technical term for what Lucent and Nortel did in the late 1990s, lending money to phone companies so they could buy Lucent and Nortel gear. Lucent's vendor financing commitments peaked at 8.1 billion dollars, about 24 percent of its annual revenue. Lucent's stock went from 65 dollars to 76 cents. Roughly 85 to 95 percent of the fibre laid in that period sat unused for years afterwards, which is why "dark fibre" entered the vocabulary.

Tomasz Tunguz, the venture investor, ran the comparison in November 2025. Nvidia's direct equity investments and commitments add up to roughly 67 percent of its annual revenue, against Lucent's 24 percent. Nvidia's exposure, by this measure, is 2.8 times Lucent's at its peak. Nvidia's Q3 FY26 disclosure also showed that four customers accounted for 61 percent of its revenue. Lucent's peak top-two concentration was 23 percent.
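Tunguz's comparison reduces to a single ratio, reproducible from the figures in this paragraph:

```python
# Vendor-financing exposure as a share of annual revenue,
# per the comparison cited above.
nvidia_exposure_pct = 67  # equity investments + commitments / annual revenue
lucent_peak_pct = 24      # Lucent's vendor-financing peak, late 1990s

ratio = nvidia_exposure_pct / lucent_peak_pct
print(round(ratio, 1))  # 2.8: Nvidia's relative exposure vs. Lucent's at its peak
```

The caveat, which the comparison itself acknowledges, is that equity stakes and loan commitments are not identical instruments; the ratio measures scale of exposure, not identical risk.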

This is the case Michael Burry, the investor made famous by The Big Short, has been making. In November 2025 he filed shorts worth roughly 912 million in notional value against Palantir and 187 million against Nvidia. He wrote on Substack: "I am not claiming Nvidia is Enron. It is clearly Cisco."

The Cisco comparison matters because it tells you which bear case is on the table. Cisco didn't go bankrupt. Cisco's products didn't fail. Cisco survived the dot-com bust and is still a major company today. But Cisco shareholders who bought at the top in March 2000 are still underwater, 26 years later. The buildings got built. The fibre got laid. The investors got hosed.

The honest answer to "is the trillion-dollar AI capex number real" is: yes and no. The capital expenditure is real. The data centres exist. The power is being drawn. The chips are being shipped. What's less clear is how much of the demand justifying that spend is independent end-customer demand versus a small set of well-funded labs spending the money their backers gave them at the cloud divisions of those same backers. Howard Marks, the Oaktree co-founder who called the 2000 bubble, wrote in a December 2025 memo: "there is no doubt that investors are applying exuberance with regard to AI. The question is whether it's irrational."

The way to tell, as a reader, is to look at a specific line. On every hyperscaler earnings release, there's a line called "Other income (expense), net" or some variant. When Alphabet, Microsoft, Amazon, or Oracle reports record profits, scroll down and check what portion of that profit came from the operating business and what portion came from marking up the value of stakes in other AI companies. In Alphabet's Q1, the answer was roughly 46 percent. In Amazon's, a meaningful chunk. The growth in those numbers is real. The valuations underpinning those numbers depend on private companies that haven't been tested in a downturn.

The Lucent investors got the build-out right. They got the timing wrong. The chips that powered the 2026 AI boom will still be running models in 2031. The only question is who will own them.

Terms
GPU: Graphics Processing Unit, the type of chip Nvidia makes that's used to train and run AI models; originally designed for video game graphics.
TPU: Tensor Processing Unit, Google's in-house chip designed for AI workloads, manufactured in partnership with Broadcom.
Gigawatt: A unit of electrical power equal to one billion watts; roughly the consumption of a major city, used here as a proxy for AI data centre scale.
Run rate: A company's current monthly revenue multiplied by 12, used to project annualised revenue forward; impressive but not the same as audited annual revenue.
Vendor financing: When a supplier lends or invests money in its own customers so they can afford to buy the supplier's products; the practice was central to the late-1990s telecom bust.