Bloomberg
Memory chip stocks extended their losses yesterday after Alphabet Inc’s Google publicized research that could allow more efficient use of the storage needed for artificial intelligence (AI) development.
SK Hynix Inc and Samsung Electronics Co, South Korean leaders in the market, fell more than 6 percent and about 5 percent respectively in Seoul. In the US, Micron Technology Inc, Western Digital Corp and Sandisk Corp slid more than 2 percent in pre-market trading, after they all closed lower on Wednesday.
Memory companies have been on a tear in recent months as the rapid development of AI infrastructure triggered a spike in chip prices, driving up profit and stocks. Four hyperscalers, led by Amazon.com Inc and Google, plan to spend about US$650 billion this year to build data centers, scooping up Nvidia Corp’s AI accelerators and related memory chips.
Visitors walk near a logo of Google at an AI Impact Summit in New Delhi, India, on Feb. 17.
Photo: Bhawika Chhabra, Reuters
SK Group chairman Chey Tae-won recently said that the memory chip crunch will last until 2030.
But Google’s new technology could alleviate some of the shortages, potentially pushing down prices. The company publicized the research on X this week, although it had originally come out last year.
Google said the TurboQuant technology can cut the amount of memory required to run large language models by at least a factor of six, reducing the overall cost of training AI. Investors may worry that this could reduce hyperscalers’ need for memory, alleviating the shortages that have lifted prices of smartphones and consumer electronics.
Morgan Stanley analyst Shawn Kim wrote in a note that the impact on the industry should be more positive than feared because the technology addresses a critical bottleneck: it improves the efficiency of the key-value cache used for inference, or running AI models.
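To put the cited "factor of six" in context, a rough back-of-envelope sketch of key-value cache sizing is below. The model shape used here is hypothetical (chosen to resemble a large open-weight model), and the article does not detail how TurboQuant achieves its compression; the calculation simply shows what a sixfold reduction would mean for the memory held per long-context request.

```python
def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, bytes_per_value):
    # The key-value cache stores one K tensor and one V tensor per layer,
    # each of shape (n_kv_heads, seq_len, head_dim) -- hence the factor of 2.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * bytes_per_value

# Illustrative (hypothetical) model shape for a single 32K-token request:
fp16 = kv_cache_bytes(n_layers=80, n_kv_heads=8, head_dim=128,
                      seq_len=32_768, bytes_per_value=2)  # fp16 = 2 bytes/value
compressed = fp16 / 6  # the "at least a factor of six" reduction cited for TurboQuant

print(f"fp16 KV cache:       {fp16 / 2**30:.1f} GiB")  # 10.0 GiB
print(f"6x-compressed cache: {compressed / 2**30:.1f} GiB")  # 1.7 GiB
```

Under these assumptions, a request that would otherwise pin 10 GiB of high-bandwidth memory needs under 2 GiB, which is why analysts frame the research as lowering the cost of serving each query rather than simply cutting chip demand.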
“If models can run with materially lower memory requirements without losing performance, the cost of serving each query drops meaningfully, resulting in more profitable AI deployment,” he wrote.
Like many of the bulls in the AI industry and analyst community, he cited a theory known as the Jevons Paradox. It’s a concept from 19th-century English economist William Stanley Jevons, who observed in the context of coal consumption that as a technology becomes more efficient, demand for the underlying resource rises.
The 19th-century premise was also cited by JPMorgan Chase & Co and Citigroup Inc. JPMorgan analysts said that investors may take profits on the news, but that there is no near-term threat to memory consumption.
The tech community also brought up the Jevons Paradox last year when DeepSeek’s (深度求索) low-cost AI model sparked fears of a reduced need for more advanced technology.
TurboQuant is positive for hyperscalers given the return on investment opportunity, Morgan Stanley’s Kim wrote. It may be beneficial for memory makers in the longer term, as “a lower cost per token can also lead to higher product adoption demand.”
The Google development may make “little difference to demand given the extreme supply constraints,” Ortus Advisors Pte Ltd analyst Andrew Jackson wrote in a note on the Smartkarma platform.


