Micron: The Memory Bet
A Forward Thesis Deep Dive
In an era where data is the new oil, one company sits at the intersection of virtually every technological revolution underway: Micron Technology.
From artificial intelligence to autonomous vehicles, cloud computing to smartphones, Micron's memory and storage products work as the foundation that supports these technologies.
While many investors rush to buy AI chipmakers like Nvidia and AMD, they often overlook the companies providing the critical components that make these chips functional in the first place. Micron represents exactly this kind of overlooked opportunity.
This deep dive explores how Micron has positioned itself at the center of the AI memory revolution, the competitive dynamics of the industry, and what it means for both Micron's future and the broader tech landscape.
Let's dive in.

The Memory Imperative
Memory isn't just a component – it's the bottleneck that increasingly determines what's possible in computing. As Nvidia CEO Jensen Huang memorably put it:
The future of AI isn't limited by compute, but by memory.
While processing capabilities have advanced phenomenally, the ability to move data to and from these processors – what engineers call "memory bandwidth" – has become the critical constraint. Think of it like driving a Ferrari but being stuck on a narrow country road – your actual speed is limited by the infrastructure around you.
AI workloads have exposed this bottleneck with their extreme scaling trajectory. Training large language models requires astonishing amounts of memory bandwidth to feed data to processors efficiently. The most advanced AI accelerators aren't maximizing their potential because they're waiting for data – a concept engineers call the "memory wall."
This creates both a challenge and an opportunity. The challenge: developing memory technologies that can keep pace with exploding processing demands. The opportunity: companies that solve this bottleneck stand to capture outsized value in the AI revolution.

Micron’s New York Semiconductor Factory
Enter Micron Technology, which has been quietly building the foundations for this moment for decades.

From Basement to Powerhouse
Micron's story began in 1978 in an unlikely location – the basement of a dental office in Boise, Idaho. Founded by Ward Parkinson, Joe Parkinson, Dennis Wilson, and Doug Pitman, the company started with a focus on semiconductor design and consulting before pivoting to memory manufacturing.

Micron’s Early Days
In its early years, Micron fought for survival in a market dominated by Japanese competitors, weathering numerous "memory cycles" – the industry's notorious boom-and-bust periods that have claimed many victims over the decades. The company's persistence through these challenging times built the resilience that characterizes Micron today.
Micron's growth accelerated through strategic acquisitions, most notably the 1998 purchase of Texas Instruments' memory business and the 2013 acquisition of Japanese competitor Elpida Memory. These moves transformed Micron from a small player into one of the world's largest semiconductor manufacturers.
Today, Micron stands as one of only three companies globally capable of producing advanced DRAM (Dynamic Random Access Memory) at scale (alongside Samsung and SK Hynix) and one of six major producers of NAND flash memory. This exclusive club is the result of the extraordinary technical complexity and capital intensity of memory production – creating formidable barriers to entry that protect incumbent players.
The Numbers Behind the Memory Giant
Micron's size and market position are impressive. With fiscal 2024 revenue of $25.1 billion (up 62% year-over-year), the company has demonstrated exceptional growth amid increasing demand for memory solutions across virtually all technology sectors.
The breakdown of this revenue shows Micron's business portfolio:
DRAM accounts for 70% of revenue ($17.6 billion)
NAND flash represents 29% ($7.2 billion)
Various other products make up the remaining 1%
By business unit, Micron's fiscal Q4 2024 revenue distribution shows its reach across multiple markets:
Compute and Networking: $3 billion
Mobile: $1.9 billion
Storage: $1.7 billion
Embedded: $1.2 billion
Perhaps most telling is Micron's rapid profitability improvement. After weathering a challenging period in 2022-2023, the company has dramatically expanded its gross margins from negative territory to 36.5% in Q4 2024 – demonstrating the leverage in its business model when market conditions become favorable.
This financial recovery isn't merely cyclical – it reflects fundamental changes in the memory industry and Micron's strategic positioning within it.

Understanding the Technology: DRAM, NAND, and HBM
Before we dig into Micron's strategy, it's worth understanding the core technology terms that form the foundation of the company's business:
DRAM (Dynamic Random Access Memory) is the primary working memory in most computing systems. It provides fast data access but is volatile, meaning it loses data when power is removed. DRAM serves as the short-term memory for computers, smartphones, and servers, temporarily storing data that processors need to access quickly.
NAND Flash is non-volatile memory that retains data even without power. It's the technology behind SSDs (Solid State Drives), USB flash drives, and storage in smartphones. NAND is better suited for long-term storage rather than working memory due to its slower access speeds but much higher density and persistence.
HBM (High-Bandwidth Memory) represents the cutting edge of memory technology, specifically designed to address the bandwidth bottleneck. Traditional memory architectures place DRAM chips beside processors on a circuit board, requiring data to travel relatively long distances. HBM takes a revolutionary approach by stacking multiple DRAM dies vertically and placing them much closer to the processor—often on the same package. This proximity dramatically reduces the distance data must travel, enabling much higher bandwidth and lower latency.
TSVs (Through-Silicon Vias) are a critical enabling technology for HBM. These microscopic vertical connections pass through the silicon dies in a 3D-stacked memory chip, allowing the vertically stacked layers to communicate with each other and with the base die that connects to the processor.
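The stacking arithmetic can be made concrete with a quick sketch. The 12-die, 36GB configuration matches the Micron HBM3E cube described later in this piece; the 1024-bit bus width and 9.2 GT/s pin speed are typical published HBM3E figures, assumed here purely for illustration.

```python
# Back-of-the-envelope HBM stack math. Die count and capacity match the
# 12-high HBM3E configuration discussed in this article; bus width and
# pin speed are typical published HBM3E figures, used as assumptions.

def hbm_stack_capacity_gb(dies: int, gb_per_die: float) -> float:
    """Total capacity of a vertically stacked HBM cube."""
    return dies * gb_per_die

def hbm_bandwidth_gbps(bus_width_bits: int, pin_speed_gtps: float) -> float:
    """Peak bandwidth in GB/s: bus width (bits -> bytes) x transfer rate."""
    return bus_width_bits / 8 * pin_speed_gtps

capacity = hbm_stack_capacity_gb(dies=12, gb_per_die=3)        # 12 x 3GB dies
bandwidth = hbm_bandwidth_gbps(bus_width_bits=1024, pin_speed_gtps=9.2)

print(f"Capacity: {capacity} GB, peak bandwidth: ~{bandwidth:.0f} GB/s")
```

The wide 1024-bit interface is the whole point of the TSV stacking: a conventional DDR5 DIMM exposes only a 64-bit channel, so HBM wins on bandwidth by sheer parallelism rather than raw pin speed.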
The AI-Driven Memory Revolution
The most significant catalyst transforming Micron's business is the artificial intelligence boom, which has fundamentally altered the memory landscape in three critical ways:
1. Exponential Demand Growth
AI training and inference require extraordinary amounts of memory. Consider these figures:
A single Nvidia H100 GPU carries 80GB of HBM
Training GPT-4 reportedly required over 25,000 GPUs
A modern AI data center can consume more memory than entire countries did a decade ago
This translates into staggering market growth projections. The High-Bandwidth Memory (HBM) market alone is expected to grow from approximately $4 billion in 2023 to over $25 billion in 2025, according to Micron's estimates – a more than 6x increase in just two years.
This is largely driven by eye-watering capital expenditure from the biggest AI hyperscalers, which collectively plan to spend more than $300 billion on AI infrastructure, up more than 40% from last year.

AI CapEx Estimates | Source: Yahoo Finance
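The HBM market projection above is worth sanity-checking with simple arithmetic. The $4 billion and $25 billion figures come from the article; the growth-rate calculation below just makes the pace explicit.

```python
# Quantifying the HBM market projection cited above: roughly $4B in 2023
# to $25B in 2025 implies ~150% compound annual growth over two years.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate, expressed as a fraction."""
    return (end / start) ** (1 / years) - 1

multiple = 25 / 4                       # total growth: more than 6x
growth = cagr(start=4e9, end=25e9, years=2)

print(f"Total growth: {multiple:.2f}x, implied CAGR: {growth:.0%}")
```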
2. Higher Value Memory
Not all memory is created equal. AI applications demand specialized, high-performance memory solutions that command premium pricing. High-Bandwidth Memory (HBM), the specialized DRAM format used in AI accelerators, costs substantially more per gigabyte than standard memory. More importantly, it delivers significantly higher margins.
Micron CEO Sanjay Mehrotra confirmed this in the company's recent earnings call:
Our fiscal Q4 HBM gross margins were accretive to both company and DRAM gross margins.
This is extraordinary considering that DRAM already delivers strong margins when the market is healthy.
3. Favorable Industry Structure
The memory industry has long been plagued by boom-and-bust cycles driven by oversupply. The AI revolution is creating a more constructive environment in several ways:
HBM production requires approximately three times as many wafers as traditional DRAM to produce the same number of bits, effectively constraining industry supply and raising the value of every bit produced
The technical complexity of HBM limits competition to just three players globally: Micron, SK Hynix, and Samsung
Memory manufacturers have demonstrated greater supply discipline after painful lessons from previous cycles, and margins are improving as a result
These factors combine to create what industry analysts call a "supercycle" – an extended period of favorable supply-demand dynamics that could deliver sustained profitability improvement.
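The supply effect of the trade ratio mentioned above can be illustrated with a small model. The roughly 3x wafer requirement comes from the article; the wafer counts and the 30% HBM share below are invented for illustration.

```python
# Rough illustration of the HBM trade ratio: if HBM consumes ~3x the
# wafers per bit versus standard DRAM, shifting wafer starts toward HBM
# shrinks total industry bit output. Wafer counts and the HBM share are
# invented for illustration, not real industry data.

TRADE_RATIO = 3  # wafers per HBM bit, relative to one wafer per standard bit

def total_bits(wafers: int, hbm_share: float, bits_per_wafer: float = 1.0) -> float:
    """Total bit output when a fraction of wafer starts goes to HBM."""
    hbm_wafers = wafers * hbm_share
    dram_wafers = wafers - hbm_wafers
    return dram_wafers * bits_per_wafer + hbm_wafers * bits_per_wafer / TRADE_RATIO

baseline = total_bits(wafers=1000, hbm_share=0.0)   # all standard DRAM
shifted = total_bits(wafers=1000, hbm_share=0.3)    # 30% of starts to HBM

print(f"Bit output falls {1 - shifted / baseline:.0%} when 30% of wafers go to HBM")
```

Less bit supply chasing the same demand is exactly the kind of structural tightening that supports the "supercycle" argument.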
Micron's HBM Advantage
At the center of this AI memory revolution is Micron's High-Bandwidth Memory (HBM) business, which represents perhaps the company's most significant growth opportunity.
HBM differs from standard DRAM in critical ways. By stacking multiple memory dies vertically and connecting them with thousands of microscopic through-silicon vias (TSVs), HBM delivers dramatically higher bandwidth while minimizing the physical distance data must travel. This architecture makes it ideal for bandwidth-intensive AI workloads like AI model training.
Micron has made good progress in this space:
In fiscal 2024, Micron delivered "several hundred million dollars" in HBM revenue
For fiscal 2025, the company expects "multiple billions of dollars" from HBM
Micron aims to achieve HBM market share commensurate with its overall DRAM market share sometime in calendar 2025
Most importantly, Micron appears to have leapfrogged competitors with its HBM3E technology.

Micron’s HBM3E Cube
The company's 12-high HBM3E stack, which delivers 36GB capacity (50% more than competing 8-high solutions), also consumes 20% less power – a critical advantage for power-constrained data centers.
This technical leadership has favorable financial implications in a bullish market. As Mehrotra noted:
Our HBM is sold out for calendar 2024 and 2025, with pricing already determined for this time frame.
Beyond HBM: Micron's Broader AI Portfolio
While HBM captures headlines, Micron's AI strategy extends far beyond this single product line. The company is driving innovation across multiple memory technologies to address different segments of the AI market:
High-Capacity Data Center DRAM
Not all AI workloads require HBM. Many applications benefit from standard DDR5 DRAM in extremely high capacities. Micron is capitalizing on this trend with its mono-die-based 128GB DIMM products, which deliver higher performance and reliability than multi-die alternatives.

Micron’s DRAM Chips
Low-Power DRAM for Servers
Micron is also pioneering the use of LPDDR (traditionally used in mobile devices) in data center applications. This solution offers significant power-efficiency advantages, which matter in energy-constrained data centers.
Data Center SSDs
Often overlooked in AI discussions is the critical role of storage. AI models require enormous datasets for training, and inference operations need rapid access to these models. Micron's data center SSD business has exploded, with quarterly revenue exceeding $1 billion in fiscal Q4 2024 – more than tripling year-over-year.
This diversified approach ensures Micron captures value across the entire AI memory stack, not just in the spotlight HBM segment.
The Edge AI Opportunity
While data center AI dominates current discussions, the next frontier is "edge AI" – artificial intelligence running on devices rather than in the cloud. This trend is creating new memory demand vectors that play to Micron's strengths:
AI-Enabled Smartphones
Leading Android manufacturers are now shipping phones with 12-16GB of DRAM, compared to an average of 8GB in flagship phones last year. This content growth directly benefits Micron as a leading supplier of mobile DRAM. The company's low-power DRAM technology, including its LP5X products manufactured on its advanced 1-beta process, is particularly well-suited for power-constrained mobile devices running AI workloads.
AI PCs
The PC market is undergoing a similar transformation. New AI-enabled PCs feature dramatically higher memory content:
Value segment: Minimum 16GB DRAM (vs. 8GB previously)
Mid/premium segments: 32-64GB DRAM
Storage: Larger, faster SSDs to support local AI models
These increases directly translate to higher memory content per device and thus higher revenue opportunity for Micron.
Automotive
Perhaps the most promising long-term opportunity is in automotive, where advanced driver assistance systems (ADAS) and autonomous driving features demand ever-increasing amounts of memory. Micron has achieved record automotive revenue for four consecutive years and continues to invest in automotive-grade memory products that meet the stringent reliability requirements of this sector.

The Memory Triopoly
Micron operates in a highly concentrated industry where just three companies – Samsung, SK Hynix, and Micron – control virtually the entire DRAM market. This structure results from the enormous technical complexity and capital requirements of memory production, which create nearly insurmountable barriers to entry.

Source: Maeli Business
Each member of this "triopoly" has distinct characteristics:
Samsung Electronics
The South Korean giant dominates with approximately 40-44% market share in DRAM. Samsung's advantages include vertical integration (they make everything from phones to TVs that consume their own memory) and financial resources that allow them to outspend competitors on capital expenditures. However, Samsung's diversified business means memory isn't always their top priority, potentially creating opportunities for more focused players like Micron.
SK Hynix
Another South Korean firm, SK Hynix holds approximately 27-30% market share in DRAM. The company has shown particular strength in HBM, where it secured early partnerships with Nvidia. However, capacity constraints have reportedly limited SK Hynix's ability to fully capitalize on HBM demand.
Micron Technology
With approximately 21-23% DRAM market share, Micron is the smallest of the three major players. However, the company has demonstrated leadership in technology transitions and product innovation. Micron's focus as a pure-play memory manufacturer also ensures memory receives its full strategic attention, unlike its more diversified competitors.
In NAND flash, the market is slightly less concentrated with Samsung, Kioxia, Western Digital, SK Hynix, Micron, and Intel as the major players. However, the dynamics remain similar – enormous capital requirements and technical complexity limit new entrants.
Unless current chip giants like Nvidia, Broadcom, or TSMC begin allocating large portions of their budgets toward memory production, the competitive landscape should remain predictable.
The Cyclical Nature of Memory
No analysis of Micron would be complete without addressing the elephant in the room: the notorious cyclicality of the memory industry.
Memory has historically followed boom-and-bust cycles driven by supply-demand imbalances:
Rising prices encourage manufacturers to invest in new capacity
After 2-3 year lead times, this new capacity comes online simultaneously
Supply exceeds demand, causing prices to crash
Manufacturers reduce investments, eventually leading to shortages
The cycle repeats
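The loop above can be sketched as a toy cobweb-style model: investment chases high prices, new capacity arrives only after a lag, and price moves inversely with the supply/demand balance. Every parameter here is illustrative, not calibrated to real industry data.

```python
# A toy model of the memory cycle described above. Capacity decisions
# react to price with a lag, and price responds inversely to the
# supply/demand balance. All parameters are illustrative only.

LAG = 2            # quarters between an investment decision and capacity online
INVEST_RATE = 150  # how aggressively manufacturers chase above-normal prices

def simulate_cycle(quarters: int = 24) -> list[float]:
    prices = [1.0]
    supply, demand = 100.0, 100.0
    pipeline = [0.0] * LAG              # capacity still under construction
    for _ in range(quarters):
        price = prices[-1]
        # Steps 1-2: high prices trigger investment that lands after LAG
        pipeline.append(max(0.0, (price - 1.0) * INVEST_RATE))
        supply += pipeline.pop(0)
        # Step 3: price moves inversely with the supply/demand balance
        demand *= 1.01                  # steady underlying demand growth
        prices.append(demand / supply)
        # Step 4 is implicit: once price dips below normal, investment stops
    return prices

path = simulate_cycle()
print(f"Price swings between {min(path):.2f} and {max(path):.2f} as the cycle repeats")
```

Even this crude model oscillates: the construction lag means supply always arrives after the price signal that triggered it, which is the core mechanism behind the industry's booms and busts.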
These cycles have created extreme volatility in Micron's financial performance. For instance, the company's quarterly revenue fell from a peak of $8.64 billion in fiscal Q3 2022 to $3.69 billion in fiscal Q2 2023 – a 57% decline in under a year.

Semiconductor Revenue vs Index Growth | Source: MacroMicro
Such volatility explains why Micron has historically traded at lower multiples than other semiconductor companies. However, there are reasons to believe the current cycle may be different:
Industry Consolidation: With only three major DRAM producers, coordination on capacity additions has improved.
Capital Intensity: New memory nodes require increasingly expensive equipment, making companies more cautious about expansions. When companies are spending tens or hundreds of billions of dollars on infrastructure, they need to be confident in the long-term viability of those investments.
Diverse End Markets: Memory demand now comes from multiple sectors (AI, mobile, automotive, etc.), reducing the dependence on any single market.
HBM Trade Ratio: As mentioned earlier, HBM production consumes approximately three times more wafer capacity than equivalent bits of standard DRAM, effectively reducing overall bit supply growth and increasing the value of HBM products.
These factors suggest potential for longer periods of profitability and less severe downturns – though cyclicality will never be eliminated entirely.
Challenges and Risks
Despite Micron's strong positioning, the company faces several significant challenges:
Geopolitical Tensions
Memory manufacturing sits at the center of US-China technological competition. Micron has already faced restrictions on selling to certain Chinese customers, and further escalation could impact approximately 25% of the company's revenue. While Micron is diversifying its manufacturing footprint with new facilities in the US (Idaho and New York) and India, these plants will take years to reach full production.
Technical Complexity of HBM
HBM represents both Micron's greatest opportunity and a significant execution risk. The technology is extraordinarily complex, requiring the stacking of up to 12 memory dies with thousands of microscopic connections between them. Yield challenges have plagued all manufacturers, and maintaining technological leadership requires continuous innovation.
Capital Intensity
Memory manufacturing is among the most capital-intensive industries in the world. Micron expects capital expenditures in fiscal 2025 to be "in the mid-30s percentage range of revenue" – potentially exceeding $10 billion. This heavy investment is necessary to maintain technological competitiveness but creates financial risk if market conditions deteriorate unexpectedly.
Chinese Competition
While the memory triopoly seems secure for now, China has made semiconductor self-sufficiency a national priority. Companies like YMTC (in NAND) and CXMT (in DRAM) have made progress despite Western export controls. While these companies remain years behind technologically, their state backing makes them potential long-term threats.

Investment Implications
Micron's position at the nexus of AI, cloud computing, and edge computing creates a compelling investment case, particularly at current valuations.
As of early March 2025, Micron trades at approximately 9-10x forward earnings – significantly below the broader semiconductor sector average of 20-25x. This discount reflects both historical volatility and investor skepticism about the durability of the current upcycle.
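The multiple gap above can be made concrete. The forward P/E ranges come from the article; the earnings-per-share figure below is purely hypothetical, used to show the mechanics of a potential re-rating rather than to predict a price.

```python
# Illustrating the valuation discount described above. The forward P/E
# ranges come from the article; the EPS figure is purely hypothetical.

def implied_price(eps: float, pe: float) -> float:
    """Share price implied by an earnings-per-share figure and a P/E multiple."""
    return eps * pe

micron_pe = 9.5    # midpoint of the 9-10x forward range cited above
sector_pe = 22.5   # midpoint of the 20-25x sector range cited above
discount = 1 - micron_pe / sector_pe

eps = 8.0          # hypothetical forward EPS, for illustration only
print(f"Discount to sector: {discount:.0%}")
print(f"At {micron_pe}x: ${implied_price(eps, micron_pe):.0f} vs. "
      f"re-rated to {sector_pe}x: ${implied_price(eps, sector_pe):.0f}")
```

The point is not the specific numbers but the leverage: even a partial re-rating toward the sector multiple would imply substantial upside if earnings hold.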
However, several factors suggest this discount may be unwarranted.
First, AI is creating sustained demand growth across multiple market segments, and that growth shows no sign of slowing. Second, the triopoly structure has reduced the risk of drastic oversupply, dampening the industry's cyclicality. Third, new US-based fabs reduce geopolitical risk and may attract government subsidies.
Micron's guidance also suggests strong momentum through fiscal 2025, with the company projecting:
Record quarterly revenue in fiscal Q1 2025
Substantial full-year revenue record
Significantly improved profitability
Strengthening free cash flow
For investors, the key question isn't whether Micron will perform well in the next few quarters – that seems highly likely given current industry conditions and the company's strong competitive positioning. Rather, the question is how long the favorable cycle will last and how severe the eventual downturn might be.
The shift toward higher-value memory products, particularly in AI applications, suggests that even when the cycle eventually turns, Micron may experience a softer landing than in previous downturns.
Conclusion
Micron Technology stands at the intersection of multiple technological revolutions, providing the critical memory and storage infrastructure that enables everything from cloud computing to artificial intelligence, smartphones to autonomous vehicles.
The explosion of AI workloads has dramatically increased both the quantity and quality of memory demand, creating perhaps the most favorable environment for memory manufacturers in decades. Micron, with its leading technology and focused strategy, appears exceptionally well-positioned to capture this opportunity.
While cyclicality will never be eliminated from the memory industry, structural changes suggest the current upcycle may be more durable than historical patterns would indicate. For investors willing to look beyond short-term volatility, Micron offers exposure to the AI revolution at a valuation that remains compelling despite the stock's recent performance.
In a world increasingly driven by data, the companies that enable its storage, movement, and processing will be critical infrastructure providers for decades to come.
Micron's journey from a basement in Boise to a global technology leader is remarkable – but its most exciting chapters may still lie ahead.
That's why you read The Forward Thesis.
Until next time.