
The rush to build out AI data centers and advanced AI systems is creating major pressure points in memory infrastructure. Custom high-bandwidth memory (HBM) orders and surging commodity DRAM prices are just two symptoms of the voracious demand reshaping workflows for memory manufacturers. Heading into 2026, procurement teams must understand these changes to prepare for the seismic shifts ahead.
As artificial intelligence continues its transition from proof-of-concept to production at scale, memory components are emerging as a critical chokepoint in the industry’s infrastructure stack. SK Group’s AI Summit 2025, held in Seoul under the banner “AI Now & Next,” put the spotlight on this challenge.
With participation from global tech firms, startups, and research institutions, the summit laid out a strategic blueprint for how next-gen memory architectures must evolve to meet AI’s accelerating compute demands.
Throughout the summit, SK delivered an overarching message: AI is a defining force across industries, economies, and societal functions. Companies around the world now see AI adoption as a necessity for survival, a shift visible in boardrooms, investor meetings, and everyday operational pivots.
The pressure AI's expanding role puts on infrastructure is massive, and in the memory domain the fallout is already here. As AI workloads scale, especially in inference, memory throughput and efficiency become increasingly important. To this point, most firms have prioritized scaling capacity, but that approach is no longer sufficient.
A point reiterated throughout AI Summit 2025 was that efficiency-optimized memory solutions will be key to moving the industry forward. Tomorrow’s sought-after solutions will deliver greater bandwidth, energy savings, and latency improvements at the data center level.
SK Hynix’s ambitious new positioning as a “full-stack AI memory creator” reflects this trend. Rather than simply scaling DRAM or shipping HBM modules, SK Hynix aims to deliver custom memory stacks tailored to specific AI applications. The firm highlighted plans for co-packaged memory designed to minimize data transfer bottlenecks, solutions targeting inference optimization, and next-gen HBM variants that go beyond current bandwidth ceilings.
Ultimately, this strategy is an important break from the status quo of “scale-first” competition. With rising capex for AI infrastructure and concerns over power density and floor space in AI data centers, memory efficiency is quickly becoming as important as raw capacity. It’s likely other organizations will respond to this trend in the near future by following SK’s lead.
Procurement teams and supply chain leaders should note this pivotal moment. As advanced memory becomes central to AI competitiveness, visibility into evolving demand-supply dynamics is essential. Sourceability can help clients anticipate allocation shifts, identify emerging bottlenecks, and source memory components that align with strategic adjustments ahead of the competition.
The global DRAM market is in the midst of a dramatic repricing cycle, one that industry experts fear is just getting started. According to data reported by CTEE, DRAM contract pricing has surged nearly 172% year-over-year as of Q3 2025.
The primary driver? Soaring demand for high-end memory to support AI infrastructure buildouts, demand that is reshaping production priorities across the supply chain. With data centers gobbling up capacity, little remains for consumer-grade memory products.
As AI training and inference workloads grow more sophisticated and memory-intensive, suppliers are reallocating fab and packaging resources toward HBM and DDR5 variants with tighter latency, higher bandwidth, and better thermal profiles. The result in consumer markets is rapidly tightening supply and steep price increases that are cascading downstream.
TechPowerUp reports that Kingston and ADATA are “paying $13 for 16 GB DDR5 chips that cost $7 just six weeks ago, an increase significant enough to erase their entire gross margin.” Meanwhile, key Nvidia supplier SK Hynix reports that it has already completely sold out its chip production for 2026, including DRAM, HBM, and NAND products.
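The arithmetic behind that margin squeeze is straightforward. The sketch below uses the reported $7-to-$13 chip prices; the module selling price, chip count, and other costs are hypothetical figures chosen only to illustrate how a cost jump of that size can turn a module maker's gross margin negative.

```python
# Reported figures (TechPowerUp): 16 GB DDR5 chips jumped
# from $7 to $13 in roughly six weeks.
old_chip_cost = 7.00   # USD per 16 GB DDR5 chip
new_chip_cost = 13.00  # USD per chip six weeks later

increase = (new_chip_cost - old_chip_cost) / old_chip_cost
print(f"Chip cost increase: {increase:.0%}")  # 86%

# Hypothetical 32 GB module built from two 16 GB chips,
# priced when chips still cost $7; all non-chip costs lumped together.
chips_per_module = 2
module_price = 31.00   # hypothetical selling price, USD
other_costs = 12.00    # hypothetical PCB, assembly, overhead

old_margin = module_price - (chips_per_module * old_chip_cost + other_costs)
new_margin = module_price - (chips_per_module * new_chip_cost + other_costs)
print(f"Gross margin before: ${old_margin:.2f}")  # $5.00
print(f"Gross margin after:  ${new_margin:.2f}")  # -$7.00
```

Under these assumed numbers, an 86% rise in chip cost swings the module from a $5 gross profit to a $7 loss, which is the kind of wipeout the Kingston and ADATA report describes.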
These troubles have led ADATA chairman and CEO Chen Libai to declare that Q4 2025 marks the start of a significant upward pricing trend in the memory sector.
Industry analysts expect elevated DRAM pricing to persist well into 2026, especially if hyperscale investment remains strong and SK Hynix, Samsung, and Micron continue to prioritize premium segments. Supply constraints are likely to worsen if additional geopolitical tensions or raw material bottlenecks hit production, particularly in Asia.
OEMs and distributors will need to adapt quickly to these changes. Planned product launches will increasingly be at risk of cascading delays. With allocation heavily skewed toward AI data center demand, small to mid-size companies are left with few options. They can gamble on orders being fulfilled on time, pay exorbitant markups on the spot market, or idle their production lines. None of these choices is ideal, but in the current market, the game becomes one of choosing the lesser evil.
For sourcing leaders, this environment demands more than reactive purchasing. Sourceability can help clients mitigate the impact of memory price shocks by monitoring contract prices in real time and advising on alternative sourcing channels through our network of verified suppliers.