Micron Is Suddenly a Hot AI Stock — And That’s Not Hype

By Eric J. Savitz

In yet another feat of digital alchemy, artificial intelligence has transformed the memory chip producer Micron Technology into a rip-roaring growth stock.

Investors with long memories will be skeptical that this new version of the Micron story is sustainable. Historically, the memory chip business has been wildly cyclical. And yet, though I hesitate to say it, things are different this time. Already trading at an all-time high and up 29% year to date, Micron is one of the market’s best and least-appreciated long-term plays on the AI trend.

Even better, the stock is still cheap — just how cheap we’ll get to in a minute. But let’s be clear: Micron has always been a challenging company to value.

The memory-chip maker has cutting-edge semiconductor technology, which it applies to both DRAM and NAND memory chips, used in PCs, servers, smartphones, cars, and every other imaginable electronic device.

However, the memory chip business has insanely complex dynamics. It requires vast and expensive production capacity and extensive R&D, and it goes through periods when capacity sharply outstrips demand, causing wild price fluctuations. All this even though Micron has only a handful of competitors. DRAMs are made primarily by Micron, Samsung Electronics, and SK Hynix. NANDs are produced by those three, plus Western Digital and its joint-venture partner Kioxia.

Just a few quarters ago, Micron was mired in a post-Covid downswing, as demand collapsed for both PCs and smartphones, two of Micron’s largest end markets. High customer inventories aggravated the issue. In the August 2023 fiscal year, revenue was down 49% to $15.5 billion, the lowest level since fiscal 2016. At one point sales fell by double digits for five straight quarters, including four quarters in a row down 40% or more from a year earlier.

A few quarters ago, the excess inventories in the supply chain began to ebb, while demand for both smartphones and PCs appeared to bottom. Demand — and pricing — began to improve. But what’s really changed at Micron is simply this: Running artificial intelligence software requires massive amounts of memory.

This past week, Micron reported stunning financial results for its fiscal second quarter ended Feb. 29. Revenue jumped 58% year over year to $5.8 billion, some $500 million above the company’s own estimates. And while Micron had expected to report a sixth straight quarter of non-GAAP losses, the company instead earned 42 cents a share, 70 cents above estimates. May-quarter guidance is for more of the same, with the company forecasting revenue of $6.6 billion, up 76% from a year earlier, and $600 million above Street consensus estimates.

Several intertwined factors have spurred the huge rebound in Micron’s fortunes, all tied to AI. Data centers are increasingly adopting a standard known as HBM, or high bandwidth memory, and Micron has become an important player in the market. Micron said its HBM supply is sold out for calendar 2024, and for most of 2025. Raymond James analyst Srini Pajjuri says HBM is “arguably the strongest secular driver the industry has ever seen.”

Chip makers create HBM components on the same equipment they use to build old-school DRAM. Sumit Sadana, chief business officer at Micron, told Barron’s that HBM chips require larger pieces of silicon and have lower yields than conventional DRAM. The result is that capacity devoted to making HBM produces about a third as many bits as it would making traditional DRAM.

But HBM is far more lucrative for Micron and its memory-chip rivals, so they are all making as much HBM as they can. The result is that as HBM production increases, DRAM production falls, supply tightens, and prices spike. That’s one reason DRAM prices rose by a high-teens percentage sequentially in the February quarter.
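To see how that trade-off tightens supply, here is a minimal back-of-the-envelope sketch in Python. The only figure carried over from the article is the roughly one-third bit yield of HBM relative to standard DRAM; the wafer count and the 15% capacity shift are hypothetical, chosen purely to illustrate the arithmetic.

```python
# Illustrative numbers only, not Micron figures: if HBM yields roughly a third
# as many bits per wafer as standard DRAM, every wafer shifted to HBM removes
# about two-thirds of its former bit output from the overall supply pool.

def total_bit_output(wafers: float, hbm_share: float, hbm_bit_ratio: float = 1 / 3) -> float:
    """Bit output relative to an all-standard-DRAM baseline of `wafers` wafers.

    hbm_share     -- fraction of wafer starts devoted to HBM (0.0 to 1.0), assumed
    hbm_bit_ratio -- bits per HBM wafer relative to a standard DRAM wafer (~1/3 per the article)
    """
    standard_bits = wafers * (1 - hbm_share)           # wafers still at full bit yield
    hbm_bits = wafers * hbm_share * hbm_bit_ratio      # wafers at reduced HBM bit yield
    return standard_bits + hbm_bits

baseline = total_bit_output(wafers=100, hbm_share=0.0)
shifted = total_bit_output(wafers=100, hbm_share=0.15)  # hypothetical 15% shift to HBM

print(f"Baseline bit output: {baseline:.1f}")
print(f"With 15% of wafers on HBM: {shifted:.1f}")
print(f"Total bit supply falls ~{100 * (1 - shifted / baseline):.0f}%")
```

Under those assumed numbers, moving 15% of wafer starts to HBM trims total bit output by about 10%, which is the mechanism behind the supply tightening and price spikes described above.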

AI data centers also are snapping up flash-based solid-state drives — basically all-chip versions of disk drives. Sadana reports that Micron’s sales of enterprise SSDs were up 100% in the latest quarter from a year earlier. “We’re selling everything we can make,” he says. That kind of growth is one reason NAND pricing jumped more than 30% sequentially in the latest quarter.

There’s more. Coming in the next few quarters will be a flood of AI-focused personal computers and smartphones. Micron says AI PCs, which should begin selling in volume late this year, require 40% to 80% more DRAM than current models. AI smartphones need 50% to 100% more DRAM than current high-end phones.

The trends are so powerful that Micron felt comfortable telling the Street to expect record revenue and much improved profitability in the August 2025 fiscal year, which doesn’t even start for five more months. KC Rajkumar, an analyst with research boutique Lynx Equity Strategies, wrote in a research note that the results were “unimaginably impressive for what is supposed to be a cyclical business.” As he notes, not even Nvidia is providing a two-year outlook.

Rajkumar says he finds Micron more attractive than other AI chip players like Nvidia, Advanced Micro Devices, Broadcom, and Marvell Technology. He thinks Micron will “shed its cyclicality” on the back of huge AI-driven demand, shift into a multiyear growth phase, and become a pure-play bet on AI. And he notes that Micron, even after a 29% rally this year, trades for five times consensus current-year revenue estimates, compared with AMD at 11 times, Broadcom at 13 times, and Nvidia at 23 times. Rajkumar thinks investors are going to use those stocks as a source of funds — to buy more Micron.

Write to Eric J. Savitz at eric.savitz@barrons.com