SEOUL, April 01 (AJP) - A new artificial intelligence data-compression technology from Google has rattled Seoul’s main stock market as sharply as news from the Strait of Hormuz, because it implicates South Korea’s top memory chipmakers, which account for around 40 percent of both the KOSPI and the country’s exports.
After Google unveiled TurboQuant, a next-generation quantization algorithm, Samsung Electronics and SK hynix saw their shares tumble 4.7 percent and 6.2 percent, respectively, over two days late last month.
They staged a strong comeback Wednesday — Samsung Electronics up 12.5 percent and SK hynix up 10.4 percent. But Wednesday’s gains hinged on hopes for a war endgame, and any shifts in Washington or Tehran could quickly reverse sentiment.
For the chipmakers, the drag came from a research paper Google published on March 24 introducing the technology, called TurboQuant.
TurboQuant acts as a highly efficient “packing” mechanism for AI. Large language models (LLMs) like ChatGPT require vast amounts of temporary memory, known as the KV cache, to retain the context of long conversations. TurboQuant compresses this data without losing its core meaning, reducing the memory footprint to as little as one-sixth of conventional levels.
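The general idea behind this kind of KV-cache compression can be sketched in code. The snippet below is an illustrative stand-in only, not Google's actual TurboQuant algorithm: it shows generic per-row absmax quantization of a cache block from 16-bit floats to 4-bit integer codes, a common way such memory savings are achieved. The function names `quantize_kv` and `dequantize_kv` are hypothetical.

```python
import numpy as np

def quantize_kv(kv: np.ndarray, bits: int = 4):
    """Per-row absmax quantization of a KV-cache block to signed integer codes.

    Illustrative sketch only -- not Google's TurboQuant algorithm.
    """
    qmax = 2 ** (bits - 1) - 1                 # e.g. 7 for 4-bit codes
    scale = np.abs(kv).max(axis=-1, keepdims=True) / qmax
    scale = np.where(scale == 0, 1.0, scale)   # avoid division by zero
    q = np.round(kv / scale).astype(np.int8)   # codes fit within `bits` bits
    return q, scale

def dequantize_kv(q: np.ndarray, scale: np.ndarray) -> np.ndarray:
    """Recover an approximation of the original values."""
    return q * scale

rng = np.random.default_rng(0)
kv = rng.standard_normal((8, 128)).astype(np.float16)  # toy KV-cache block
q, scale = quantize_kv(kv, bits=4)
approx = dequantize_kv(q, scale)

# 16-bit values replaced by 4-bit codes plus one small scale per row:
# roughly a 4x reduction even before any further compression tricks.
print(np.abs(kv - approx).max())  # reconstruction error, bounded by scale/2
```

Real systems push further (lower bit widths, smarter groupings, error correction), which is how ratios like the one-sixth figure cited for TurboQuant become plausible; the trade-off is always compression versus how faithfully the model's context is preserved.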
Fearing this level of efficiency would drastically reduce the need for dynamic random-access memory (DRAM) and high-bandwidth memory (HBM) chips, investors aggressively dumped tech shares.
“The impact will be massive,” said Kim Deok-kee, a professor of electronic engineering at Sejong University in Seoul. “If memory requirements are reduced to one-sixth, revenue will drop drastically. With Samsung, SK hynix, Micron and emerging Chinese players in the mix, this could quickly lead to a memory oversupply.”
Kim warned that the current rush to expand capacity could backfire.
“Companies are building multiple fabs based on current high demand, but in two to three years, this could result in massive oversupply and severe financial deficits,” he said.
However, local brokerages counter that this efficiency will lower the cost of running AI and thereby expand overall usage, and they are maintaining a robust outlook for South Korean chipmakers.
“The technology simultaneously improves memory and computational efficiency, lowering AI utilization costs,” Jang Moon-young, an analyst at Hyundai Motor Securities, wrote in a recent note. “In the mid-to-long term, it is highly likely to lead to an expansion in memory demand through broader AI adoption and increased usage.”
Kim Dong-won, head of research at KB Securities, said low-cost AI technologies like TurboQuant will “lower entry barriers and explosively expand overall AI demand.”
“Despite geopolitical anxieties stemming from rising tensions in the Middle East, second-quarter memory chip orders are strengthening, exceeding previous estimates,” he said.
Analysts also highlight the coming era of “physical AI,” such as robotics and autonomous driving, which will require massive data processing.
In a report released Wednesday, Daol Investment & Securities analysts Koh Young-min and Kim Yeon-mi said the sell-off over TurboQuant was excessive.
“The best thing this KV cache compression technology does is bring unrealistic demand down to a more realistic level,” they wrote. “The potential for future HBM demand growth remains ample.”
Copyright ⓒ Aju Press All rights reserved.