SK hynix unveils HBM4 roadmap at TSMC symposium, deepening AI memory partnership

By KIM NA YOON Posted : April 23, 2026, 16:19 Updated : April 23, 2026, 16:19
Ahn Hyun, SK hynix president and chief development officer, delivers a keynote address at the TSMC 2026 Technology Symposium in San Jose, California, on April 22 (local time). [Photo=SK hynix]

SK hynix used a technology event hosted by Taiwan Semiconductor Manufacturing Co., the world’s largest foundry, to lay out a detailed roadmap aimed at strengthening its lead in next-generation high-bandwidth memory, or HBM.
 
The company said it took part in the “TSMC 2026 Technology Symposium” held April 22 (local time) in San Jose, California, where it presented development progress on sixth-generation HBM and its next-generation packaging capabilities.
 
SK hynix highlighted HBM4, a key component for the artificial intelligence era, and underscored a change in how the base die will be made starting with that generation. Through the fifth generation, SK hynix produced the base die using its own process; for HBM4, it plans to manufacture the die on TSMC's leading-edge logic process.
 
The base die connects to a GPU and controls the overall memory stack. Applying an advanced process can enable a wide range of customer-specific functions, the company said.
 
The cooperation is aimed at moving beyond parts supply to jointly target the market for customized HBM. SK hynix said it plans to optimize the combination of TSMC's advanced packaging technology, CoWoS (Chip on Wafer on Substrate), with its own HBM technology to offer solutions that meet varied requirements from global big tech companies.
 
Along with HBM4, SK hynix displayed products it described as essential for building AI infrastructure, including a 256GB 3DS RDIMM memory module for high-performance servers and data centers, as well as GDDR7, its next-generation graphics memory.
 
SK hynix said it plans to begin mass production of HBM4 in 2026 and to build a close ecosystem of cooperation among customers, foundries and memory makers as it seeks to solidify its position as a total AI memory provider.
 
Ahn Hyun, SK hynix president and chief development officer, said, “A key obstacle for AI technology today is a memory bottleneck, where bandwidth limits prevent rapidly processing the growing volume of data.” He added, “We will go beyond supplying standard HBM to provide ‘custom HBM’ tailored to specific customer needs, and expand to full solutions spanning DRAM and NAND flash.”

* This article has been translated by AI.

Copyright ⓒ Aju Press All rights reserved.