AI Data Centers Are Eating All the Memory Chips. Your Next PC Will Cost You.
Samsung, SK Hynix, and Micron are shifting production toward AI-critical HBM and server DDR5, starving consumer DRAM and NAND supply, with data centers expected to consume 70% of high-end memory production in 2026. NAND spot prices are up roughly 5x since August 2025, and SK Group's chairman says this shortage will last until 2030.
[Image: Server racks in a modern data center with blue LED lights]
Key Points
•Samsung, SK Hynix, and Micron are shifting production toward AI-critical HBM and server DDR5, starving consumer DRAM and NAND supply — with data centers expected to consume 70% of high-end memory production in 2026 [1]
•NAND spot prices are up roughly 5x since August 2025: a popular consumer SSD that bottomed out at $74 is now approaching $331, while DDR5 kits that sold for $60–$90 in late 2025 have doubled or tripled in price [1][2]
•SK Group's chairman says the memory shortage will last until 2030, making this a structural supply crisis — not a blip — with cascading effects on CPUs, laptops, gaming handhelds, and server procurement worldwide [3]
The Price Tag You Weren't Expecting
If you've tried to buy a new SSD, a RAM upgrade, or a laptop in the past few months, you've probably noticed something unpleasant: the prices have gone through the roof.
A 1TB Crucial P3 Plus — one of the most popular budget SSDs on the market — hit a historical low of about $74 in mid-2025. Today? You're looking at $331 for the same drive. NAND flash spot prices have surged roughly fivefold since August 2025, and the trend line isn't flattening [1].
RAM tells the same story. Those 32GB DDR5 kits that were a routine $60 to $90 purchase in October 2025 now run $150 to $180. Some DDR5 listings on Newegg have hit genuinely absurd prices — $4,000 for kits that would have been a fraction of that a year ago. Even DDR4, the older standard most people assumed was safely cheap, has started climbing [2].
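The scale of these moves is easy to sanity-check with simple arithmetic. A minimal sketch, using the figures quoted above (the DDR5 numbers are midpoints of the quoted ranges, not live prices):

```python
# Rough arithmetic on the price moves cited in this article.
# Figures are snapshots taken from the text, not live market prices.

ssd_low, ssd_now = 74, 331      # 1TB Crucial P3 Plus, USD
ddr5_low, ddr5_now = 75, 165    # midpoints of the $60-90 and $150-180 ranges

ssd_multiple = ssd_now / ssd_low
ddr5_multiple = ddr5_now / ddr5_low

print(f"SSD:  {ssd_multiple:.1f}x (+{(ssd_multiple - 1) * 100:.0f}%)")
print(f"DDR5: {ddr5_multiple:.1f}x (+{(ddr5_multiple - 1) * 100:.0f}%)")
# SSD comes out around 4.5x, DDR5 around 2.2x
```

Even the milder of the two, a roughly 2.2x move on RAM, would be the sharpest consumer memory repricing in years.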
This isn't a supply chain hiccup from a factory fire or a pandemic-era backlog. This is a deliberate reallocation of the world's memory manufacturing capacity toward a single customer: the AI industry.
Follow the Money to the Data Center
Here's what's actually happening inside Samsung, SK Hynix, and Micron — the three companies that produce the vast majority of the world's memory chips.
Memory fabs are prioritizing HBM and server DDR5 over consumer products.
All three have made the same calculation: AI data centers pay more, buy in massive volume, and sign long-term contracts. Training a single large language model can require tens of thousands of GPUs, each paired with HBM (High Bandwidth Memory) — the specialized, vertically stacked DRAM that Nvidia's chips demand. A single Nvidia Blackwell GPU uses more HBM than an entire consumer PC uses regular DRAM [1].
The numbers are staggering. Data centers are expected to consume 70% of high-end memory production in 2026. That's not 70% of all chips — it's 70% of the good stuff, the highest-margin, highest-performance memory that these fabs can produce [1].
When Samsung or SK Hynix has to choose between fulfilling a $500 million HBM order from a hyperscaler and producing consumer DDR5 sticks for Best Buy, the decision is obvious. The fabs aren't getting bigger fast enough to serve both markets. So the consumer market gets what's left over.
SK Group's chairman, Chey Tae-won, said it plainly: the memory chip shortage will last until 2030. That's not a hedge or a pessimistic forecast — it's a structural assessment of how long it will take for manufacturing capacity to catch up with AI demand [3].
The Cascade Effect: It's Not Just Memory
The memory crisis isn't happening in isolation. It's triggering a cascade across the entire PC component supply chain.
CPU prices are rising 10 to 15% as Intel and AMD shift their own manufacturing capacity toward data center processors, which — you guessed it — carry higher margins than consumer chips. Asus has forecast 25 to 30% price increases for PCs in Taiwan over the next quarter, with similar impacts expected globally [3].
The gaming hardware market is feeling it especially hard. Ayaneo, a company that makes gaming handhelds, recently cancelled a $4,000 device outright — not because of weak demand, but because storage and RAM costs made the product financially unviable at any reasonable price point [1].
Gartner's latest analysis warns of broad impacts on PC, laptop, and server procurement through the rest of 2026. Enterprise IT departments are being told to lock in contracts now or face even higher prices in Q3 and Q4. For consumers, the message is simpler: if you were planning to build a PC, upgrade your RAM, or buy a new laptop, doing it sooner rather than later probably saves you money [3].
Who Actually Benefits (and Who's Getting Squeezed)
Let's be clear about the economics here. This shortage is extremely profitable for the companies causing it.
Samsung, SK Hynix, and Micron are all reporting surging revenues. HBM sells at margins that consumer DRAM and NAND can't match. When you shift a wafer from commodity consumer DRAM, which sells for a few dollars per chip, to HBM stacks that sell for many times that, your revenue per wafer skyrockets even if your total unit volume drops.
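The per-wafer tradeoff can be sketched as a toy model. Every number below is a hypothetical round figure for illustration only: real die counts, yields, and contract prices are confidential and vary widely.

```python
# Toy model of the per-wafer revenue tradeoff described above.
# All inputs are hypothetical round numbers -- real fab economics
# (die counts, yields, contract prices) are confidential.

def wafer_revenue(dies_per_wafer: int, yield_rate: float, price_per_die: float) -> float:
    """Revenue from one wafer: number of good dies times selling price."""
    return dies_per_wafer * yield_rate * price_per_die

# Commodity DRAM: many small dies, mature yields, low price per die.
consumer_dram = wafer_revenue(dies_per_wafer=1500, yield_rate=0.90, price_per_die=3.0)

# HBM: fewer, larger stacked dies, harder yields, far higher price.
hbm = wafer_revenue(dies_per_wafer=600, yield_rate=0.60, price_per_die=60.0)

print(f"consumer DRAM wafer: ${consumer_dram:,.0f}")
print(f"HBM wafer:           ${hbm:,.0f}")
# Under these assumptions the HBM wafer earns roughly 5x more,
# despite fewer dies and worse yield.
```

The point of the sketch is the shape of the incentive, not the exact numbers: as long as the price gap per die dwarfs the yield penalty, every wafer moved to HBM is a raise.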
Nvidia benefits too. The scarcity of HBM gives it leverage over customers — if you want the chips, you accept the price and the wait time. The company's $1 trillion in reported AI chip orders through 2027 essentially guarantees that memory producers will continue prioritizing data center products for years [1].
The losers are everyone downstream: PC builders, gamers, small businesses, students, and anyone in developing markets where a 130% price increase on components makes the difference between affording a computer and not. The "democratization of computing" narrative that the tech industry has been telling for decades is running headfirst into a wall of AI infrastructure demand.
What Consumers Should Actually Do
If you're in the market for memory or storage, here's the practical reality:
Don't wait for prices to drop. The supply dynamics driving this shortage are structural, not cyclical. SK Group says 2030. Even optimistic analysts don't see meaningful relief before late 2027 [3].
DDR4 is your friend — for now. If your system supports DDR4, you're in a slightly better position. DDR4 prices have risen, but less dramatically than DDR5. Upgrading to DDR4 now, while it's still being produced, is probably smarter than waiting [2].
Consider used or refurbished components. The secondary market hasn't caught up to new pricing as aggressively. A used 1TB NVMe drive from a system pull can be had for a fraction of new retail — with most of its lifespan still ahead of it.
Watch for retailer gouging. Some of the most extreme pricing — the $4,000 DDR5 kits on Newegg, for instance — isn't a reflection of manufacturer pricing. It's retailers and scalpers capitalizing on scarcity. Shopping around and avoiding panic purchases can save hundreds [2].
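The "buy sooner" advice above is really a compounding argument, and it can be made concrete. A minimal sketch, where the 8%-per-month increase is purely an assumption for illustration, not a forecast from any source:

```python
# Sketch: what waiting costs if prices keep climbing at a steady rate.
# The 8%/month rate is an assumed illustration, NOT a forecast.

def future_price(price_now: float, monthly_increase: float, months: int) -> float:
    """Compound the current price forward at a constant monthly rate."""
    return price_now * (1 + monthly_increase) ** months

kit_now = 165.0  # midpoint of the $150-180 DDR5 range cited above
for months in (3, 6, 12):
    projected = future_price(kit_now, monthly_increase=0.08, months=months)
    print(f"in {months:2d} months: ${projected:.0f}")
```

Under that assumption a $165 kit is over $400 within a year; even at half the rate, waiting two quarters is materially more expensive than buying now.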
The Bigger Picture: AI's Hidden Tax
This story is ultimately about externalities — costs that the AI industry creates but doesn't directly pay.
Every new data center that Meta, Google, Microsoft, or Amazon breaks ground on represents not just electricity and water consumption (both well-documented concerns), but a claim on the global memory supply that directly reduces availability for everyone else. The AI companies aren't doing anything wrong in a market sense — they're buying what they need at the price the market sets. But the market isn't pricing in the societal cost of making basic computing more expensive for billions of people [1].
There's a policy dimension here that hasn't gotten enough attention. Government subsidies for semiconductor manufacturing — the CHIPS Act in the US, similar programs in the EU and Japan — were designed to boost domestic chip production for national security reasons. But if the new fabs being built with those subsidies primarily serve AI data centers, taxpayers are effectively subsidizing the very demand that's making their own computers more expensive.
None of this means AI isn't worth building. But the conversation about AI costs has been almost exclusively about training compute, electricity, and water. Memory — the component you can't run any computer without — hasn't been part of that discussion. It should be.
The 1TB SSD in your laptop isn't just a storage device. It's now competing for the same silicon, the same clean rooms, and the same engineering talent as the AI infrastructure being built to power the next generation of language models. And right now, the AI infrastructure is winning.
Source: https://www.techradar.com/computing/cpu/forget-the-ram-crisis-storage-prices-are-spiralling-and-processors-could-be-next-as-gaming-pc-maker-warns-cpu-shortage-is-getting-more-serious