Why Google’s TurboQuant is rattling memory stocks


Investing.com — Memory stocks fell Wednesday despite broader technology sector strength, with shares dropping after Google unveiled TurboQuant, a new compression algorithm that could reduce memory requirements for AI systems.

SanDisk Corporation (NASDAQ:SNDK) fell 5.7%, Micron Technology (NASDAQ:MU) dropped 3%, Western Digital (NASDAQ:WDC) declined 4.7%, and Seagate Technology (NASDAQ:STX) slid 4%. The declines came as the Nasdaq 100 advanced.

Google introduced TurboQuant, a compression technology designed to reduce memory consumption in large language models and vector search engines. The algorithm targets a key bottleneck: the key-value (KV) cache, which stores the attention keys and values of previously processed tokens so the model does not have to recompute them, and which grows with context length.
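To see why the KV cache is a bottleneck, a back-of-the-envelope calculation helps. This sketch uses a hypothetical transformer configuration (the layer, head, and context numbers are illustrative assumptions, not any specific model's) and compares 16-bit storage against the 3-bit storage the announcement describes:

```python
# Illustrative only — a rough KV-cache size estimate for a hypothetical
# 32-layer transformer, showing why the cache dominates memory at long
# context lengths. Parameters are assumptions, not a real model config.

def kv_cache_bytes(n_layers, n_kv_heads, head_dim, seq_len, batch, bytes_per_value):
    # 2x for keys and values; one entry per layer, head, token, and channel.
    return 2 * n_layers * n_kv_heads * head_dim * seq_len * batch * bytes_per_value

# Assumed configuration: 32 layers, 8 KV heads, head dim 128,
# a 128k-token context, batch size 1, fp16 storage (2 bytes per value).
fp16 = kv_cache_bytes(32, 8, 128, 128_000, 1, 2)
three_bit = fp16 * 3 / 16  # same tensor stored at 3 bits instead of 16

print(f"fp16 KV cache:  {fp16 / 2**30:.1f} GiB")   # ~15.6 GiB
print(f"3-bit KV cache: {three_bit / 2**30:.1f} GiB")  # ~2.9 GiB
```

At these assumed settings the cache shrinks from roughly 15.6 GiB to under 3 GiB, which is the scale of saving that makes investors reassess memory demand.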

According to Google’s announcement, TurboQuant can compress the key-value cache to 3 bits per value without any training or fine-tuning while maintaining model accuracy. Testing on open-source models including Gemma and Mistral showed a 6x reduction in key-value memory size. The algorithm also delivered up to an 8x speedup over unquantized keys on H100 GPU accelerators.
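What "3-bit compression" means in practice can be illustrated with a toy uniform quantizer. This is not TurboQuant's actual scheme (which is more sophisticated), just a minimal sketch of storing values as 3-bit codes plus a scale:

```python
# Toy 3-bit uniform quantizer — an illustration of the idea, not
# Google's TurboQuant algorithm.
import numpy as np

def quantize_3bit(x):
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / 7  # 3 bits -> 8 levels -> 7 steps
    codes = np.round((x - lo) / scale).astype(np.uint8)  # codes in 0..7
    return codes, lo, scale

def dequantize(codes, lo, scale):
    return codes.astype(np.float32) * scale + lo

rng = np.random.default_rng(0)
x = rng.normal(size=1024).astype(np.float32)
codes, lo, scale = quantize_3bit(x)
x_hat = dequantize(codes, lo, scale)
# Round-trip error is bounded by half a quantization step.
print("max |error|:", np.abs(x - x_hat).max())
```

Each value now needs 3 bits instead of 16, and the round-trip error is bounded by half a quantization step — the engineering challenge, which the next paragraph touches on, is keeping accuracy at such coarse precision.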

The technology works in two stages: first, the PolarQuant method rotates the data vectors before quantizing them, enabling high-quality compression; second, a Quantized Johnson-Lindenstrauss algorithm encodes the residual errors left over from the first stage. Google said traditional vector quantization methods add 1 to 2 bits of memory overhead per stored number, partially negating the compression benefit.
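The two-stage idea described above can be sketched as follows. This is a hedged toy illustration of the general pattern (rotate, coarsely quantize, then compress the residual with a Johnson-Lindenstrauss-style random projection); the actual PolarQuant and QJL constructions differ in their details:

```python
# Hedged sketch of a rotate-then-quantize pipeline with a JL-style
# residual sketch. Illustrative of the general technique only — not
# Google's PolarQuant or Quantized-JL implementations.
import numpy as np

rng = np.random.default_rng(0)
d = 64  # assumed vector dimension

# Stage 1: random orthogonal rotation, then coarse quantization.
# Rotating spreads energy evenly across coordinates, which makes a
# simple uniform quantizer behave much better.
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # random orthogonal matrix

def coarse_quantize(x, bits=3):
    lo, hi = float(x.min()), float(x.max())
    scale = (hi - lo) / (2**bits - 1)
    return np.round((x - lo) / scale) * scale + lo

x = rng.normal(size=d)
x_rot = Q @ x
x_q = coarse_quantize(x_rot)

# Stage 2: instead of storing the full residual error, keep only a
# low-dimensional Johnson-Lindenstrauss random projection of it.
k = 16  # assumed sketch dimension
P = rng.normal(size=(k, d)) / np.sqrt(k)  # JL projection matrix
residual_sketch = P @ (x_rot - x_q)

print("coarse quantization error:", np.linalg.norm(x_rot - x_q))
print(f"residual stored as {k}-dim sketch instead of {d}-dim vector")
```

The design point is that the residual sketch is far smaller than storing the error exactly, which is how such schemes avoid the 1-to-2-bit-per-number overhead the announcement attributes to traditional vector quantization.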

TurboQuant will be presented at ICLR 2026, while PolarQuant is scheduled for presentation at AISTATS 2026. Google tested the algorithms across benchmarks including LongBench, Needle In A Haystack, ZeroSCROLLS, RULER, and L-Eval.

The technology has applications beyond AI models, including vector search capabilities that power large-scale search engines.

Memory stocks have rallied significantly year to date, making them vulnerable to developments that could reduce demand.
