SEOUL, April 20 (AJP) - South Korean chipmaker SK hynix announced Monday that it has begun mass production of its SOCAMM2 192GB memory module, a product optimized for Nvidia's next-generation "Vera Rubin" AI platform.
Based on 10-nanometer-class sixth-generation (1c) LPDDR5X DRAM, the new AI server module is designed to relieve memory bottlenecks in large AI models and maximize GPU processing speeds. The product reconfigures low-power mobile memory for server environments, combining high bandwidth, low power consumption, and easy module replacement.
According to SK hynix, SOCAMM2 delivers more than twice the bandwidth and over 75 percent better energy efficiency than conventional server RDIMMs. The module acts as a "middle memory" layer between High Bandwidth Memory (HBM) and DDR5 system memory. By adopting an LPDDR-based structure, the product significantly lowers power and cooling costs for data centers.
"Through close cooperation with Nvidia, we will resolve bottlenecks in AI infrastructure and provide optimal performance," said Kim Ju-seon, President of AI Infra at SK hynix. "With the supply of SOCAMM2 192GB, we have set a new standard for AI memory performance."
The module features a press-fit connector structure that ensures high signal integrity and makes replacement and expansion easier than traditional onboard LPDDR setups.
The deepened partnership comes as the U.S. chip giant accounts for a significant portion of SK hynix's business.
According to the company's most recent business report, sales to Nvidia reached 23.26 trillion won last year, accounting for 24 percent of SK hynix's total revenue.
Reflecting strong investor sentiment toward its AI-driven momentum, shares of SK hynix were trading at 1,170,000 won as of 11:35 a.m. on Monday, up 3.72 percent from the previous session.
Copyright ⓒ Aju Press All rights reserved.