Sunday, December 28, 2025

How much memory do AI Data Centers need?

Investing.com — After hosting a webinar with Gunjan Shah, a former Senior Cloud Engineer for AI and Machine Learning at Google, analysts at Bernstein shared their thoughts on how much memory AI data centers require.

In its key takeaways, the firm explained that AI data centers require dramatically different amounts of memory depending on whether they are training or running models.

Analyst Mark Newman says training demands “substantially more memory than inference,” because it requires storing model weights, activations, gradients, optimizer states and “frequent checkpoints.”
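As a rough illustration of why training is so memory-hungry, the sketch below tallies the per-parameter state that a common mixed-precision setup with the Adam optimizer keeps in memory. The byte counts and the 70B-parameter example are generic rules of thumb, not figures from the Bernstein note.

```python
# Back-of-envelope estimate of training-time memory per parameter,
# assuming mixed-precision training with the Adam optimizer.
# These byte counts are common rules of thumb, not figures from the note.

def training_state_bytes(num_params: int) -> int:
    """Memory for weights, gradients and optimizer states (activations excluded)."""
    fp16_weights   = 2 * num_params  # working copy of the weights
    fp16_gradients = 2 * num_params  # gradients from the backward pass
    fp32_master    = 4 * num_params  # full-precision master weights
    adam_momentum  = 4 * num_params  # first-moment estimate
    adam_variance  = 4 * num_params  # second-moment estimate
    return fp16_weights + fp16_gradients + fp32_master + adam_momentum + adam_variance

# Example: a hypothetical 70B-parameter model needs roughly 1.1 TB for these
# states alone, before activations and checkpoint copies are counted.
print(f"{training_state_bytes(70_000_000_000) / 1e12:.2f} TB")
```

Activations and the "frequent checkpoints" Newman mentions come on top of this, which is how a single training run can reach the terabyte scale described above.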

Bernstein, citing the expert commentary, noted that even a medium-sized model can consume “~1TB of combined memory” during training. Inference, by contrast, needs far less, with memory use largely limited to temporary tensors and KV caches.
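For comparison, the following sketch estimates the KV cache that typically dominates inference memory. The layer count, context length and batch size are illustrative assumptions, not numbers cited by the firm.

```python
# Rough size of the KV cache that dominates inference memory,
# using illustrative dimensions (not taken from the Bernstein note).

def kv_cache_bytes(layers, kv_heads, head_dim, seq_len, batch, bytes_per_elem=2):
    """Keys and values stored per layer for every token in the context window."""
    return 2 * layers * kv_heads * head_dim * seq_len * batch * bytes_per_elem

# Example: 80 layers, 8 KV heads of dimension 128, 8K context, batch of 16, fp16.
gb = kv_cache_bytes(80, 8, 128, 8192, 16) / 1e9
print(f"{gb:.1f} GB")  # ~43 GB, far below training-scale requirements
```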

Newman states that hyperscalers were caught off guard by the surge in AI adoption, triggering a sharp rise in memory demand and pricing.

The resulting imbalance is said to have pushed up the cost of key components such as HBM and DRAM.

However, the firm notes that improvements in model architectures, new quantization techniques and next-generation chips should help “manage memory demand over the long term” and support sustainability.
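As a simple illustration of the quantization point, the sketch below shows how shrinking weight precision cuts a model's memory footprint. The bit-widths and model size are hypothetical examples, not techniques named in the note.

```python
# Illustration of how quantization shrinks the memory footprint of model weights.
# The bit-widths and the 70B-parameter model are generic examples.

def weight_bytes(num_params: int, bits: int) -> float:
    """Bytes needed to store the weights at a given precision."""
    return num_params * bits / 8

params = 70_000_000_000  # hypothetical 70B-parameter model
for bits in (16, 8, 4):
    print(f"{bits:>2}-bit weights: {weight_bytes(params, bits) / 1e9:.0f} GB")
# 16-bit: 140 GB, 8-bit: 70 GB, 4-bit: 35 GB
```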

The note highlights storage as another bottleneck. A shortage of HDDs has pushed many operators toward SSDs.

Bernstein adds that SSDs are “five to ten times more expensive” than HDDs, but companies are willing to absorb the cost to continue advancing their models.

SSDs are also said to offer performance and efficiency advantages, including “lower operational costs, reduced power consumption, and minimal cooling requirements.”

Bernstein also points to purpose-built TPUs, which deliver “lower TCO, higher performance per watt and superior scalability,” though GPUs remain favored for rapid prototyping due to their mature ecosystem.

Looking ahead, the firm says High Bandwidth Flash could become a critical new tier, offering terabytes of fast, non-volatile memory and lower energy needs for future AI workloads.
