SK hynix expands AI memory portfolio beyond HBM with LPDDR6 breakthrough

By Candice Kim | Posted: March 11, 2026, 14:59 | Updated: March 11, 2026, 14:59
SK hynix’s LPDDR6 mobile DRAM chip / Courtesy of SK hynix

SEOUL, March 11 (AJP) - SK hynix said Tuesday it has developed the world’s first 16-gigabit LPDDR6 DRAM built on its sixth-generation 10-nanometer-class (1c) process, positioning the chipmaker to capture the next wave of artificial-intelligence demand spilling over from data centers to smartphones and other mobile devices.

The new mobile memory, designed for devices running on-device AI, improves data processing speed by 33 percent and boosts power efficiency by more than 20 percent compared with the current LPDDR5X generation. With a base operating speed exceeding 10.7 gigabits per second, the chip surpasses the maximum performance of existing mobile DRAM products, the company said.

The LPDDR6 chip incorporates Dynamic Voltage and Frequency Scaling (DVFS) and a sub-channel architecture that activates only the data paths needed at a given moment, letting devices maximize bandwidth during heavy workloads while lowering voltage and power consumption during routine tasks.

SK hynix plans to complete preparations for mass production in the first half of this year and begin supplying global smartphone and tablet makers in the second half.

The development underscores how aggressively SK hynix is leaning into the AI memory boom that has reshaped the semiconductor industry over the past two years.

The company has emerged as one of the biggest beneficiaries of the AI infrastructure build-out, particularly through its dominance in high-bandwidth memory (HBM) used in AI accelerators supplied to companies such as Nvidia. Demand for AI servers has tightened the supply of advanced memory, driving prices sharply higher across the industry.

Server-grade DRAM prices are expected to rise as much as 60 to 70 percent this year compared with late 2025, according to industry estimates, as hyperscale cloud providers including Microsoft and Google rush to secure memory supplies for expanding AI data centers.

The surge in AI-related demand has also spilled over into conventional DRAM markets. Even as chipmakers prioritize production of HBM for AI servers, tighter supply of standard DRAM is pushing up prices for memory used in PCs, smartphones and other consumer electronics.

Against that backdrop, SK hynix is broadening its portfolio beyond data-center memory to include mobile chips optimized for AI workloads running directly on devices.

Industry analysts say the shift toward on-device AI, where smartphones process AI tasks locally rather than through remote servers, is creating a new growth engine for mobile memory with higher bandwidth and better power efficiency.

The LPDDR6 chip is designed to support faster response times and longer battery life in AI-enabled smartphones and tablets, enabling complex tasks such as real-time language processing and image recognition without relying heavily on cloud computing.

By moving early into LPDDR6 while maintaining leadership in HBM, SK hynix is positioning itself at both ends of the AI memory spectrum — from hyperscale data centers to next-generation mobile devices — as the industry pivots toward AI-driven computing.