By Dan Gallagher
Score a big one for the “buy the dip” crowd.
Micron Technology has been in an awfully big dip. The memory chip maker is no stranger to harsh sales cycles, but an inventory glut made the last one particularly bad, after customers spooked by pandemic shortages severely over-ordered.
Micron’s revenue slid 49% in its latest fiscal year — its worst annual decline since at least the late 1980s, which is as far back as data tracked by S&P Global Market Intelligence goes. It also generated a record operating loss of $5.7 billion in fiscal 2023, the first time it has lost money on that basis in more than a decade.
But investors have been betting on a recovery, especially after Micron’s quarterly revenue growth turned positive for the fiscal first quarter that ended in November. The company’s latest results for the fiscal second quarter delivered even more on that front, with revenue jumping 58% from a year earlier to $5.8 billion. Micron is also projecting a 76% surge in revenue for the current quarter. Both were well ahead of Wall Street’s projections.
Profitability also returned — one quarter earlier than expected. Micron’s adjusted operating income came in at $204 million compared with the $263 million loss analysts expected.
Micron’s shares thus jumped about 15% Thursday morning. That builds on an already beefy run that had pushed the stock up 64% over the 12 months before the results landed late Wednesday, after a 31% drop the 12 months before that. That run has put Micron firmly among the chip stocks lifted by hype over generative artificial intelligence. The PHLX Semiconductor Index had logged a 12-month gain of 55% before Micron’s latest report.
Such a run may suggest that Micron’s shares have already topped out. But the demand being sparked by AI is only beginning to hit the company’s books. The popular AI systems sold by Nvidia and other chip makers require a specialized form of memory called HBM, for high-bandwidth memory. South Korea’s SK Hynix was an early mover on the technology and has seen a strong revenue boost from it.
Micron announced last month that it had begun production of its own HBM, which will be used in Nvidia’s latest H200 systems. The Blackwell systems that Nvidia announced earlier this week, and plans to begin shipping late this year, also use 33% more HBM than their predecessors.
So Micron’s strong run still has legs. The company said on its conference call late Wednesday that it has already sold out of HBM production capacity for this year, with the “overwhelming majority” of its HBM supply for next year allocated as well. Micron says revenue for the fiscal year ending in August 2025 will exceed prior records. Wall Street now projects revenue surging 58% this year and 40% next year. “HBM is arguably the strongest secular driver the industry has ever seen,” wrote Srini Pajjuri of Raymond James.
Growing production of HBM also helps to rationalize the supply of conventional DRAM, supporting prices in what is still Micron’s largest business. Brian Chin of Stifel says memory makers need to convert the capacity used for two regular DRAM wafers to produce one HBM wafer, so tighter DRAM supply should boost Micron’s average selling prices, or ASPs.
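To make that arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The capacity figures are purely illustrative assumptions; only the two-for-one wafer trade-off comes from Chin’s note.

# Back-of-the-envelope sketch of the wafer trade-off (all figures hypothetical,
# except the two-for-one conversion cited by Stifel's Brian Chin).
total_dram_capacity = 1_000_000       # hypothetical monthly output, in DRAM-wafer equivalents
hbm_wafers = 100_000                  # hypothetical wafers shifted to HBM production
dram_wafers_per_hbm_wafer = 2         # capacity of two regular DRAM wafers yields one HBM wafer

lost_dram_output = hbm_wafers * dram_wafers_per_hbm_wafer
remaining_dram_output = total_dram_capacity - lost_dram_output

print(f"Conventional DRAM output falls {lost_dram_output / total_dram_capacity:.0%}, "
      f"to {remaining_dram_output:,} wafer equivalents")
# -> Conventional DRAM output falls 20%, to 800,000 wafer equivalents

Under those assumed numbers, diverting a tenth of wafer capacity to HBM removes a fifth of conventional DRAM output, which is the supply-tightening effect Chin expects to support prices.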
“ASP is hard to predict, but tight supply is a good place to start and conditions are ripe for Micron to optimize around higher-value/margin product mix,” he wrote in his report Thursday. Micron may be late to the AI party, but better late than never.
Write to Dan Gallagher at dan.gallagher@wsj.com