SEOUL, December 18 (AJP) - Samsung Electronics is testing a new LPDDR-based server memory module as artificial intelligence data centers increasingly prioritize power efficiency over raw performance, signaling a potential shift in how memory is deployed in next-generation AI infrastructure.
The company said it has begun sampling its second-generation SOCAMM (Small Outline Compression Attached Memory Module), a server memory module built on low-power LPDDR technology, to global customers. The move comes as operators of AI servers grapple with rising electricity costs and thermal constraints driven by GPU-intensive workloads.
SOCAMM2 is designed to address those pressures by bringing mobile-class low-power memory into the server environment — a domain long dominated by RDIMM-based DRAM. Samsung said the module delivers more than twice the bandwidth of earlier LPDDR-based solutions while cutting power consumption by over 55 percent compared with conventional server memory configurations.
The product is not positioned as a replacement for RDIMM, which remains the industry standard for general-purpose servers. Instead, SOCAMM2 targets specific AI workloads, particularly inference-oriented systems where power efficiency and heat management increasingly outweigh peak compute performance.
The Korean tech giant said it is working with Nvidia to validate SOCAMM2 in AI server platforms, reflecting broader efforts to optimize memory architectures for GPU-driven systems. Nvidia has not disclosed details of the collaboration, but Samsung said the two companies are jointly evaluating performance and compatibility.
The development highlights a growing reassessment of memory design in AI data centers. While DRAM capacity and speed have historically been the primary focus, power consumption is emerging as a critical bottleneck as AI deployments scale.
Samsung added that discussions are underway to standardize SOCAMM through JEDEC, the industry body that defines memory specifications. Standardization would be a key step toward broader adoption, allowing system makers and cloud providers to integrate the module beyond limited pilot deployments.
Analysts at TrendForce and Omdia have noted that while LPDDR-based server memory offers clear efficiency gains, its adoption will hinge on compatibility with existing server architectures and performance trade-offs at scale.
For Samsung, the move underscores an effort to expand its memory portfolio beyond conventional DRAM and HBM products as AI infrastructure evolves. Whether SOCAMM2 becomes a niche solution or a wider industry option will depend on customer uptake and standardization progress in the coming quarters.
Copyright ⓒ Aju Press All rights reserved.