


Nvidia has decided to use memory chips typically found in smartphones for AI servers. According to a report published by Counterpoint Research, this could result in server memory prices doubling by the end of 2026.
Over the past two months, global electronics supply chains have faced a shortage of older memory chips as manufacturers focus on high-performance memory designed for AI applications. However, Counterpoint indicates that a new problem is on the horizon.
Nvidia has opted to switch the type of memory used to LPDDR to reduce power costs in AI servers. This low-power memory type, typically found in phones and tablets, will replace the DDR5 commonly used in servers.
Since each AI server will require more memory chips than a smartphone, this change is expected to create a sudden surge in demand. However, Counterpoint notes that the industry is not prepared to meet this demand.
Companies such as Samsung Electronics, SK Hynix, and Micron, which produce server memory chips, have already begun facing shortages of older dynamic random-access memory products as they prioritize high-bandwidth memory production. Counterpoint said this bottleneck at the lower end of the market risks spreading upward.
The firm said Nvidia’s shift toward LPDDR could make it a major customer on the scale of a smartphone maker, a significant change for the supply chain. Counterpoint anticipates that server memory prices will double by the end of 2026.
Higher server memory prices will increase costs for cloud service providers and AI developers. This could further pressure data center budgets already strained by record spending on graphics processing units and power upgrades.