3 Semiconductor Stocks to Play the AI Memory Supercycle
AI's center of gravity is shifting from compute to memory. As models scale, the bottleneck isn't just GPUs - it's the high-bandwidth memory and flash that feed and store their data.
As one analyst put it, "memory is the next frontier." Expect the winners to be the companies that can ship HBM at scale and keep yields high while pricing stays firm.
1) Micron (MU): HBM torque with room on valuation
Micron has gone from cyclical also-ran to essential in the AI server stack. The driver: high-bandwidth memory (HBM), a stacked DRAM designed for the bandwidth that AI training demands.
The company sees HBM's total addressable market reaching roughly $100 billion by 2028, implying ~40% CAGR. Even after a massive run, shares trade around 9.9x forward earnings - a discount to the S&P 500 at ~22x and well below leading AI names.
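As a sanity check on that implied growth rate, a ~$100 billion TAM by 2028 is consistent with ~40% annual growth only from a particular starting point. The sketch below assumes a ~$35 billion 2025 base purely for illustration; that base figure is not from the article.

```python
# Illustrative check: what CAGR takes an assumed ~$35B 2025 HBM TAM
# to ~$100B by 2028? (The $35B base is an assumption, not a sourced figure.)

def cagr(start_value: float, end_value: float, years: int) -> float:
    """Compound annual growth rate between two values over `years` periods."""
    return (end_value / start_value) ** (1 / years) - 1

implied = cagr(start_value=35.0, end_value=100.0, years=3)  # 2025 -> 2028
print(f"Implied CAGR: {implied:.1%}")  # roughly 40% per year
```

Under that assumed base, the arithmetic lands close to the ~40% figure cited; a lower starting TAM would imply an even steeper ramp.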
HBM is complex to produce, and it soaks up fab capacity that would otherwise go to smartphone memory and standard storage. That supply pinch is giving Micron pricing power and better margins - a setup one analyst likened to finding a rare asset at a garage sale: unusual value in plain sight.
2) SK Hynix (000660.KS): The HBM epicenter - with capacity risk
SK Hynix is the primary HBM supplier to Nvidia and held an estimated ~60% market share in late 2025. The upside case is simple: if demand for next-gen HBM4 stays hot, Hynix benefits first.
The risk is equally clear: capacity constraints. If the company can't ship enough HBM4 in 2026, share can shift to rivals. Even so, recent forecasts have pegged SK Hynix's HBM4 share as high as ~70% in 2026 as it supports Nvidia's next Rubin platform.
Net: operational execution will decide whether Hynix extends its lead or cedes ground.
3) Sandisk (SNDK): The NAND surprise tied to "AI at the edge"
After its spin-off from Western Digital, Sandisk has ripped higher - up over 800% in the past year. Unlike DRAM, Sandisk's strength is NAND flash, the long-term storage that edge devices rely on.
Think robots, factory automation, and autonomous systems storing and processing data locally. As those deployments scale, NAND demand tightens - and Sandisk's leverage grows.
Why memory now
HBM's bandwidth and proximity to compute are essential for training. Meanwhile, data growth and model checkpoints drive demand for both DRAM and NAND across data centers and edge devices.
Companies like Micron, SK Hynix, and the memory arm of Samsung are now central to AI infrastructure. The near-term setup favors suppliers that can add HBM capacity without wrecking yields.
Risks, timing, and what to watch
Memory is still a commodity. Unlike a proprietary software stack, buyers can shift orders across suppliers once supply loosens - putting pricing at risk after the crunch.
- HBM4 ramp: yields, lead times, and confirmed allocations into 2026
- Capacity adds: how quickly Micron, SK Hynix, and Samsung bring new lines online
- Pricing: HBM and mainstream DRAM/NAND ASP trends as bottlenecks ease
- Customer concentration: Nvidia exposure and diversification across other AI buyers
- Capex signals: hyperscaler spend, AI server mix, and edge deployments
Bottom line for finance pros
The trade is memory. Micron offers valuation support plus HBM exposure, SK Hynix sits on the supply choke point, and Sandisk is levered to edge storage demand.
The near-term setup favors suppliers amid scarcity. The medium-term hinges on execution, capacity, and how fast the bottleneck breaks.