Saturday, January 24, 2026

Analyst makes major change to Micron stock price target

One of the first market truths I learned over my 30 years navigating Wall Street was that semiconductor demand is cyclical. When times are good, orders and prices surge, driving suppliers to ramp up capacity too quickly, which contributes to oversupply when demand softens. Rinse. Repeat.

Get the timing of those cycles right, and investors can make a significant amount of money, particularly in memory makers like Micron, which has historically been prone to the boom-bust cycle.

While nobody rings a bell signaling the official start and end of demand supercycles, those who track indicators like spot-market pricing are increasingly pounding the table that Micron (MU) may be on the cusp of a critical inflection point. Among them is Wall Street research firm Stifel.

This week, Stifel analyst Brian Chin put the situation at Micron bluntly in a research report shared with TheStreet.

Chin notes that surging spending on data centers, as operators outfit them with the hardware needed to handle artificial intelligence workloads, has proven a “tipping point” for the memory market, causing prices to swell. In turn, he ratcheted his Micron stock price target up by an eye-popping 54% ahead of the company’s earnings next week.

His higher target is the latest in a slate of recent price-target hikes from Wall Street peers, suggesting many are behind the eight ball and rushing to catch up.

Memory prices are surging in 2025 due to growing demand for artificial intelligence servers. Shutterstock.

Hundreds of billions have been spent over the past three years upgrading data center servers to better handle the heavy workloads associated with training and running large language models, such as ChatGPT, and agentic AI apps tasked with boosting worker productivity.

In 2025 alone, hyperscaler capital expenditure will total $405 billion, according to I/O Fund.

The rush of activity has been a boon for Nvidia, the semiconductor company behind the chips most effective at handling AI workloads, and for server makers like Dell, which are building more powerful computers packed with Nvidia’s GPUs to meet hyperscaler demand.


“Training is significantly and increasingly compute-intensive, but early LLM demands were manageable. Today, compute needs are accelerating rapidly, particularly as more models move into production,” wrote JP Morgan strategist Stephanie Aliaga in October. “Nvidia estimates that reasoning models answering challenging queries could require over 100 times more compute compared to single-shot inference.”
