“It’ll happen slowly, and then all at once.”
TL;DR
- Michael Burry predicts the AI bubble will burst, citing stretched depreciation schedules.
- He believes Meta and Oracle's financial reporting hides the reality of AI investments.
- Companies are reportedly extending asset lifespans to inflate current earnings.
- This practice creates an illusion of growth, masking potential overcapacity and low returns.
Jim Morrow, founder and chief investment officer of Callodine Capital, explains the eventual, and unavoidable, conclusion to what he terms “the most crowded trade in history.”
He isn’t just paraphrasing Ernest Hemingway; he’s talking about the AI race and trillion-dollar deals so strained they look more like tangles than transactions. And his concerns are echoed by others.
Michael Burry—the Big Short investor who famously predicted the 2008 housing collapse—broke a two-year silence this week to say nearly the same thing: that Big Tech’s AI-era profits are built on “one of the most common frauds in the modern era”—stretching the depreciation schedule (some, including Burry, would say cheating it).
And it landed with extra weight: earlier this week, Burry quietly deregistered his investing firm, Scion Asset Management, effectively stepping away from managing outside money or filing public disclosures. Some analysts interpreted the move as less of an ominous sign and more, as Bruno Schneller, managing director at Erlen Capital Management, told CNBC, stepping away “from a game he believes is fundamentally rigged.”
“On to much better things,” Burry hinted on X, with a new launch expected on November 25.
Freed from reporting requirements and client obligations, Burry returned to X with a message that cut against the prevailing AI euphoria. In his view, the surge in GPUs, data centers, and massive AI investments doesn’t signal inevitable growth; it points to a financial cycle that looks increasingly distorted, crowded, and precarious.
Burry put numbers on it. In an X post, he projected that Big Tech will understate depreciation by $176 billion from 2026 to 2028, inflating reported earnings by 26.9% at Oracle and 20.8% at Meta, among the companies he singled out.
Meta did not respond to a request for comment. Oracle declined to comment.
“He’s spot on,” Morrow tells Coins2Day. Morrow has been making the argument for months, warning that a “tsunami of depreciation” could quietly flatten Big Tech’s AI profits. Behind the trillion-dollar boom in chips, data centers, and model training, he argues, lies a simple but powerful illusion: companies have quietly changed how long they assume their machines, and the semiconductor chips inside them, take to wear out and depreciate.
“Companies are very aware of it,” Morrow claims. “They’ve gone to great lengths to change their accounting and depreciation schedules to get ahead of it—to effectively avoid all this capex hitting their income statements.”
Burry’s post drew viral attention. Morrow’s been making the case longer, but he thinks the sudden resonance means investors are finally waking up to something fundamental.
“These aren’t small numbers—they’re huge. And the fact that someone like Burry is calling it out tells you people are starting to notice what’s happening between the lines of the balance sheet.”
The great depreciation stretch
First, the depreciation mechanics. Tech giants such as Microsoft, Meta, and Oracle spend billions on GPUs, servers, and cooling infrastructure for their AI data centers. Normally those assets wear out quickly, and that wear shows up as an expense against earnings. But Burry asserts that many firms have recently, and quietly, extended the claimed lifespan of this equipment from roughly three years to as long as six.
That simple change lets them spread out their costs and report fatter earnings now.
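To make the arithmetic concrete, here is a minimal sketch in Python, using hypothetical round numbers rather than any company’s actual figures, of how stretching a straight-line depreciation schedule shrinks the expense booked each year:

```python
# Hypothetical example: straight-line depreciation on an illustrative $30B of AI hardware.
capex = 30_000_000_000  # illustrative figure, not any company's actual spend


def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line depreciation: spread the cost evenly over the assumed useful life."""
    return cost / useful_life_years


three_year = annual_depreciation(capex, 3)  # $10.0B expensed per year
six_year = annual_depreciation(capex, 6)    # $5.0B expensed per year

# Stretching the assumed life from three to six years halves the annual expense,
# lifting near-term reported profit by roughly $5B a year in this example.
# The total cost never changes; it simply lands in later years.
print(f"3-year schedule: ${three_year / 1e9:.1f}B per year")
print(f"6-year schedule: ${six_year / 1e9:.1f}B per year")
```

The total cost of the hardware is the same either way; the longer schedule just pushes more of it into future years, which is exactly the mechanism Burry and Morrow are pointing at.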
“Had they not made those changes,” Morrow says, “their earnings would be dramatically lower.”
Meta’s filings, for one, seem to at least corroborate the direction of Burry’s and Morrow’s claim. Until 2024, servers and network gear were depreciated over four to five years; effective January 2025, Meta said it would “extend the estimated useful lives” of “certain servers and network assets” to 5.5 years.
“Depreciation expense on property and equipment,” Meta wrote in its annual filing for 2022, was “$8.50 billion, $7.56 billion, and $6.39 billion for the years ended December 31, 2022, 2021, and 2020, respectively.”
In other words, depreciation was already a significant expense, and management chose to stretch out the period over which it is recognized. That policy change doesn’t confirm Burry’s totals across all the companies he named, but it pushes reported profits in the direction he describes: less depreciation hits earnings now, and more of it is deferred to later years.
Morrow contends the timing is illogical. With technology advancing more rapidly—Nvidia now introducing new chips every 12 to 18 months rather than every two years—hardware becomes outdated more quickly, not less so.
Think of any maturing technology, laptops for instance: try running the latest version of Adobe Premiere Pro on a 2018 MacBook. It might launch, but it will likely overheat, stutter, or crash, because it wasn’t built for today’s computational demands. Older chips, Morrow argues, behave the same way; they don’t stop working, but they lose economic value fast as newer, faster generations render them effectively obsolete.
Morrow specializes in value investing and dividend-heavy companies rather than high-flying growth tech, and he says he holds no long positions in technology broadly. That means he stands to benefit if Big Tech valuations deflate or if markets begin repricing the costs buried in AI investment. Still, his assessment echoes growing unease among other analysts.
Richard Jarc, an analyst at Uncovered Alpha, has raised similar alarms about the mismatch between AI chip lifecycles and corporate accounting.
He has argued that the newest generation of GPUs degrades much faster than companies’ amortization schedules imply. Some point to the continued use of Nvidia’s H100 chips, which launched three years ago, as proof of longer useful lives; Jarc says that’s misleading.
Jarc contends that demand stays elevated largely because a handful of companies are subsidizing compute costs for end users, propped up by investor capital rather than underlying market fundamentals. Crucially, Nvidia has moved from releasing new chips every 18 to 24 months to releasing them annually. Given that, Jarc argues it is unrealistic to assume GPUs hold their value for five to six years, putting their actual economic lifespan closer to one or two years.
The Economist in September called the stretched-out depreciation “the $4 trillion accounting puzzle at the heart of the AI cloud,” noting that Microsoft, Alphabet, Amazon, Meta, and Oracle have all extended the assumed service lives of their servers even as Nvidia shortens its chip release cycle to twelve months.
By The Economist’s estimates, if those assets were depreciated over three years instead of the longer timelines companies now assume, annual pre-tax profits would fall by $26 billion, roughly an 8% hit. A two-year schedule would double that loss, and if depreciation truly matched Nvidia’s pace, the implied market value hit could reach $4 trillion.
Not everyone buys the doom loop. In a note to clients sent this week, Bank of America’s semiconductor team argued that the market’s sudden skepticism about AI capex is evidence that the trade is far less overcrowded than critics claim.
The recent selloff in megacap AI names, the team led by Vivek Arya wrote, was driven by “correctable macro factors”—shutdown jitters, weak jobs data, tariff confusion, even misinterpreted OpenAI comments—rather than any real deterioration in AI demand. In fact, the firm pointed to surging ancillary segments like memory and optical (up 14% last week), as well as Nvidia’s disclosure of $500 billion-plus in 2025–26 data-center orders, as signs the underlying spending cycle remains “robust.”
Growth or just capital intensity?
Morrow’s deeper worry is that investors are mistaking sheer spending for growth. The market, he argues, has lost the ability to distinguish capital expenditure from genuine output. The AI surge has pushed valuations from typical software metrics into territory that looks more like industrial-infrastructure math: building a single hyperscale, one-gigawatt data center, enough for a state-of-the-art AI model, can cost roughly $50 billion, most of it going to GPUs, followed by buildings, cooling, and power systems.
Reality check—“none of these companies has ever managed a $50 billion project before,” he says. “Now they’re trying to do fifty of them at once.”
He points out the irony that many of these facilities aren't operational yet. Data centers throughout Santa Clara and Northern Virginia remain unused, awaiting grid connections that may take a considerable amount of time.
“Every month a $35 billion stack of GPUs sits without power, that’s a billion dollars of depreciation just burning a hole in the balance sheet,” he says. “So of course they’re panicking — and ordering their own turbines.”
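A rough back-of-envelope, assuming simple straight-line depreciation rather than Morrow’s own math, shows where the billion-dollar figure comes from: a $35 billion stack of hardware written off over three years works out to roughly $1 billion a month, while the longer five- to six-year lives companies now assume would cut that roughly in half.

```python
# Back-of-envelope check on the "$1 billion a month" figure (illustrative only).
gpu_stack = 35_000_000_000  # a $35B stack of idle GPUs, per the quote

for life_years in (3, 5, 6):
    monthly = gpu_stack / (life_years * 12)
    print(f"{life_years}-year assumed life: ~${monthly / 1e9:.2f}B of depreciation per month")

# Prints roughly $0.97B, $0.58B, and $0.49B per month: the quoted figure lines up
# with a three-year schedule; the longer lives companies now assume halve the monthly hit.
```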
He thinks the result will be a massive power glut by the late 2020s: an overbuilt grid, over-leveraged utilities, and ratepayers stuck with the bill.
“We’ve seen this movie before—in shale, in fiber, in railroads,” he says. “Every capital-spending boom ends the same way: overcapacity, low returns, and a bailout.”
The biggest risk, he says, is that investors have stopped looking at balance sheets altogether. He points to the extreme concentration in today’s market, with nearly half of all 401(k) money now effectively tied to six mega-cap companies.
“This is the most crowded trade in history,” he says. “When it turns, it’s going to turn fast.”
