


Nvidia has decided to use memory chips typically found in smartphones for artificial intelligence servers. According to a report published by Counterpoint Research, this decision could lead to a doubling of server memory prices by the end of 2026.
In the past two months, electronic supply chains worldwide have faced a shortage of older memory chips due to manufacturers focusing on high-performance memory chips designed for artificial intelligence applications. However, Counterpoint warns that a new problem is looming.
To cut power costs in AI servers, Nvidia has opted to switch to LPDDR, a low-power memory type typically found in phones and tablets, in place of the DDR5 commonly used in servers.
Since each AI server requires far more memory chips than a smartphone, the change is expected to create a sudden surge in demand. However, Counterpoint notes that the industry is not ready to meet it.
Server memory producers such as Samsung Electronics, SK Hynix, and Micron have already begun experiencing shortages of older dynamic random-access memory products as they concentrate on high-bandwidth memory. Counterpoint said this tightness in the lower segments of the market risks spreading upward.
Counterpoint stated that Nvidia's shift toward LPDDR could turn it into a customer on the scale of a major smartphone manufacturer, a significant change for the supply chain. The firm expects server memory prices to double by the end of 2026.
Higher server memory prices will raise costs for cloud service providers and AI developers, adding pressure to data center budgets already strained by record spending on graphics processing units and power upgrades.