In a fresh and risky move, Nvidia (NVDA) plans to use the low-power memory chips found in phones, known as LPDDR, in its next wave of AI servers, replacing the DDR5 chips used in most servers today. The shift aims to cut power use in large AI clusters, but it also puts new pressure on a memory market that is already tight.
In recent months, supply lines have come under strain as major chipmakers shifted more of their output toward high-end parts for AI gear. Firms cut production of older parts to free capacity for high-bandwidth memory, leaving supply of legacy memory thin. Now Nvidia is adding new demand for LPDDR chips, and each AI server uses far more memory than a phone. Chipmakers must therefore weigh whether to shift more factory space to LPDDR, a change that could squeeze other parts of the market and lift costs across the board.
At the same time, the plan aligns with Nvidia’s pattern of steering industries toward its own tech and making them rely on it. The push has already played out in AI and quantum computing and now extends to server memory choices.
This news also arrives on an important day for Nvidia, as the company prepares to report fiscal Q3 2026 earnings after the market close.

Rising Prices and Who Gets Hit
Counterpoint Research expects server memory prices to roughly double by late 2026, on the view that the supply chain is not ready to serve Nvidia at phone-maker scale. Chipmakers such as Samsung Electronics (SSNLF), SK Hynix, and Micron (MU) already face shortages in older dynamic memory lines. As they tilt more capacity toward LPDDR, buyers of server parts may see sharper price jumps.
The rise in costs would first fall on cloud platforms. Amazon Web Services (AMZN), Microsoft Azure (MSFT), and Google parent Alphabet (GOOGL) all build large fleets of AI servers each year, and higher memory prices would add to budgets already stretched by heavy spending on graphics processors and power-system upgrades. In time, higher costs may also reach AI start-ups and large firms that run their own AI stacks.
Furthermore, a longer squeeze in LPDDR supply could spill into the phone and tablet markets. Makers of these devices could face tighter supply or higher chip costs. The trend shows how fast AI demand is now shaping the broader memory sector. For now, the shift to LPDDR helps Nvidia lower power use. Yet the ripple effect across the supply chain may lift server memory prices through 2026.
Is Nvidia Stock a Buy?
Nvidia continues to hold the Street’s endorsement with a Strong Buy consensus rating. The average NVDA price target is $243.09, implying a 34.04% upside from the current price.
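The relationship between a price target and implied upside is simple arithmetic. As a rough check, a minimal sketch that backs out the approximate share price behind the figures above (the exact price at the time of writing is not stated in the article and is only inferred here):

```python
# Implied upside = (price_target / current_price) - 1.
# Given the reported target ($243.09) and upside (34.04%),
# derive the approximate current price they imply.
price_target = 243.09
implied_upside = 0.3404

current_price = price_target / (1 + implied_upside)
print(f"Implied current price: ${current_price:.2f}")  # ≈ $181.36

# Sanity check: recompute the upside from the derived price.
upside = price_target / current_price - 1
print(f"Recomputed upside: {upside:.2%}")
```

This only inverts the percentage the article quotes; it does not pull live market data.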


