Micron beats Q4 expectations, raises outlook on AI-driven DRAM demand
Micron beat fiscal Q4 expectations with $11.3B in revenue and adjusted EPS of $3.03 as AI data center demand lifted results; data center accounted for 40% of sales. Guidance tops views: $12.2B-$12.8B revenue, $3.60-$3.90 adjusted EPS.

Micron (MU) delivered fiscal Q4 results that beat consensus and reinforced the AI infrastructure trade. Data center demand did the heavy lifting, and guidance points to continued strength into the new fiscal year.
Q4 highlights
- Revenue: $11.3B vs. $11.15B expected.
- Adjusted EPS: $3.03 vs. $2.84 expected.
- Data center contribution: 40% of total revenue.
- After-hours move: shares briefly up ~2% to $171.50 before trimming gains.
The quarter landed above Micron's own updated outlook from August. Management credited AI data center demand as the key driver.
Guidance (FQ1 2026)
- Revenue: $12.2B-$12.8B (consensus ~$11.9B).
- Adjusted EPS: $3.60-$3.90 (consensus ~$3.05); see the quick midpoint check below.
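To put the outlook in perspective, here is a quick back-of-the-envelope check of how far the guided midpoints sit above the Street. The consensus inputs are the approximate figures cited above, so treat the output as rough context rather than a precise beat calculation.

```python
# Rough check: FQ1 2026 guidance midpoints vs. the approximate consensus cited above.

def midpoint(low: float, high: float) -> float:
    return (low + high) / 2

rev_mid = midpoint(12.2, 12.8)   # guided revenue midpoint, $B
eps_mid = midpoint(3.60, 3.90)   # guided adjusted EPS midpoint, $

rev_consensus = 11.9             # ~$11.9B consensus revenue (approximate)
eps_consensus = 3.05             # ~$3.05 consensus adjusted EPS (approximate)

print(f"Revenue midpoint ${rev_mid:.1f}B is {100 * (rev_mid / rev_consensus - 1):.0f}% above consensus")
print(f"EPS midpoint ${eps_mid:.2f} is {100 * (eps_mid / eps_consensus - 1):.0f}% above consensus")
# Revenue midpoint $12.5B is 5% above consensus
# EPS midpoint $3.75 is 23% above consensus
```

In other words, the EPS outlook implies a far bigger step up versus expectations than the revenue outlook does, which points to margin leverage as much as demand.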
CEO Sanjay Mehrotra told analysts, "Over the coming years, we expect trillions of dollars to be invested in AI, and a significant portion will be spent on memory... Micron is uniquely positioned to benefit from the AI opportunity ahead."
Where the growth is coming from
- DRAM momentum: Segment revenue rose nearly 70% year over year to $8.98B, above the ~$8.55B expected. High Bandwidth Memory (HBM) paired with Nvidia GPUs is the core catalyst.
- Market backdrop: The DRAM market grew 83% to $95B in 2024 with Micron at ~35% share, according to Deutsche Bank's Melissa Weathers.
- Customer mix: Nvidia accounts for ~16% of Micron's revenue on an annualized basis, per Bloomberg. A slower AI ramp at Samsung has also strengthened Micron's position against its Korea-based rivals, Samsung and SK Hynix.
On the flip side, NAND underperformed expectations. Revenue declined ~5% year over year to $2.25B vs. ~$2.35B expected. Management expects NAND demand to improve in fiscal 2026 as AI storage needs catch up. "AI has been a huge driver of DRAM, but AI also uses NAND in a pretty significant way," said chief business officer Sumit Sadana.
Strategic positioning and US build-out
Micron remains the only US-based memory manufacturer operating at scale and plans to invest $200B in domestic facilities. That US footprint, plus accelerating HBM adoption, underpins the case that memory will capture a meaningful share of AI capital spending over the next several years.
What matters for investors
- HBM execution: Watch output, yields, and backlog as AI servers scale. HBM mix drives both revenue and margins.
- Pricing and supply: DRAM/NAND bit growth and ASP trends will signal whether the upcycle has legs.
- Customer concentration: Nvidia demand remains a swing factor; monitor diversification across hyperscalers and enterprise AI builds.
- NAND recovery: Timing and magnitude of the NAND inflection will influence overall gross margin trajectory.
- Competitive response: Track SK Hynix and Samsung shipments in HBM and DDR5; any catch-up could pressure pricing or share.
- Policy and capex: US fab timelines and incentives affect cost curves and capacity planning.
- Sentiment risk: The AI trade has been strong; any slowdown in AI infrastructure orders could reset expectations.
Key numbers to keep on your dashboard next quarter
- Data center revenue mix (vs. 40% this quarter).
- HBM revenue/mix commentary and capacity adds.
- DRAM and NAND ASPs, bit supply growth, and inventory days.
- Gross margin progression and opex discipline alongside the capex ramp (a minimal tracking sketch follows this list).
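For readers who track these figures mechanically, below is a minimal sketch of how this quarter's baselines could be logged and compared against the next release. The field names and the compare helper are illustrative, not Micron's reporting taxonomy; the baseline values are the figures reported above, and items like ASPs, bit growth, inventory days, and gross margin would need to be added by hand from the filings.

```python
from dataclasses import dataclass

@dataclass
class QuarterSnapshot:
    """Hand-entered figures from a Micron earnings release (illustrative fields)."""
    revenue_b: float        # total revenue, $B
    data_center_mix: float  # data center share of total revenue, 0-1
    dram_revenue_b: float   # DRAM segment revenue, $B
    nand_revenue_b: float   # NAND segment revenue, $B

def compare(prev: QuarterSnapshot, curr: QuarterSnapshot) -> None:
    """Print quarter-over-quarter changes for the checklist metrics above."""
    def pct_change(a: float, b: float) -> str:
        return f"{100 * (b / a - 1):+.1f}%"
    print(f"Revenue:         {pct_change(prev.revenue_b, curr.revenue_b)} QoQ")
    print(f"Data center mix: {prev.data_center_mix:.0%} -> {curr.data_center_mix:.0%}")
    print(f"DRAM revenue:    {pct_change(prev.dram_revenue_b, curr.dram_revenue_b)} QoQ")
    print(f"NAND revenue:    {pct_change(prev.nand_revenue_b, curr.nand_revenue_b)} QoQ")

# Baseline from the quarter just reported (figures cited in this article).
fq4_2025 = QuarterSnapshot(revenue_b=11.3, data_center_mix=0.40,
                           dram_revenue_b=8.98, nand_revenue_b=2.25)
```

Next quarter, populate a second QuarterSnapshot from the new release and call compare(fq4_2025, new_quarter) to see the deltas at a glance.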
For the full breakdown and updates, see Micron's investor relations page.