SK hynix is sole HBM supplier for Microsoft's Maia AI chip

By Seon Jae-kwan Posted : January 27, 2026, 15:55 Updated : January 27, 2026, 15:55
Microsoft’s Maia 200 AI accelerator / Courtesy of Microsoft

SEOUL, January 27 (AJP) - SK hynix is reportedly the sole supplier of high-bandwidth memory (HBM) for Microsoft’s in-house artificial intelligence accelerator, the Maia 200.

Industry sources said on Tuesday that the South Korean chipmaker will exclusively supply its fifth-generation HBM, known as HBM3E, for the Maia 200 chip, which Microsoft unveiled in the U.S. on Jan. 26 local time. The deal adds Microsoft to SK hynix’s roster of key customers alongside Nvidia, reinforcing its lead in AI memory used for advanced computing.

Maia 200 is an application-specific integrated circuit manufactured on TSMC’s 3-nanometer process. The chip integrates six 12-layer stacks of HBM3E, providing a total of 216 gigabytes of memory.
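The stated capacity is consistent with common HBM3E configurations. As a rough check (assuming 24-gigabit, i.e. 3 GB, DRAM dies per layer, a typical HBM3E density not specified in the article):

```python
# Rough arithmetic check of the Maia 200's reported HBM capacity.
# Assumption: each HBM3E layer is a 24-gigabit (3 GB) DRAM die,
# a common configuration for 12-layer (12-Hi) HBM3E stacks.
GB_PER_LAYER = 3
LAYERS_PER_STACK = 12
STACKS = 6

gb_per_stack = GB_PER_LAYER * LAYERS_PER_STACK  # 36 GB per 12-layer stack
total_gb = gb_per_stack * STACKS                # 6 stacks on the package

print(f"{gb_per_stack} GB per stack, {total_gb} GB total")  # 36 GB per stack, 216 GB total
```

Six 36 GB stacks yield the 216 GB figure Microsoft cites.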

Microsoft plans to deploy the chips in data centers in Iowa and Arizona as it seeks to reduce reliance on Nvidia’s graphics processing units.

Major technology companies are accelerating efforts to design their own AI chips to lower costs and optimize performance, challenging Nvidia’s dominance. Google has introduced its seventh-generation tensor processing unit, Ironwood, while Amazon Web Services has rolled out its third-generation Trainium chip, broadening the AI accelerator market.

That shift is creating new opportunities for HBM suppliers, as demand spreads from Nvidia’s GPUs to custom chips developed by large cloud providers, according to industry sources. High-performance HBM is essential for such chips, which are typically designed to be more power-efficient than general-purpose GPUs.

"SK hynix’s exclusive supply agreement with Microsoft, following its strong foothold in Nvidia’s supply chain, reflects advantages in advanced memory manufacturing processes and yield management," a source said.

Samsung Electronics is seeking to narrow the gap by strengthening cooperation with other big technology companies, particularly Google. Industry sources said Samsung supplies a significant portion of the HBM used in Google’s tensor processing units and Broadcom-designed chips, positioning itself within the Google-Broadcom ecosystem.

The next competitive battleground is HBM4, the sixth generation of high-bandwidth memory, which is expected to add computing functions to the memory's base die and significantly raise technical complexity. Samsung recently passed HBM4 qualification tests by Nvidia and Advanced Micro Devices and is expected to begin official deliveries as early as next month.

SK hynix is also preparing for the transition, having begun building a mass-production system for HBM4 in September and working with Nvidia on performance optimization.

* This article, published by Aju Business Daily, was translated by AI and edited by AJP.

Copyright ⓒ Aju Press All rights reserved.