TAIPEI, July 23, 2025 /PRNewswire/ — Nvidia is preparing to open a new phase in the memory market with plans to deploy between 600,000 and 800,000 SOCAMM modules in 2025. The initiative positions SOCAMM as a potential successor to high-bandwidth memory (HBM). Although the initial deployment volumes are modest compared to HBM, industry analysts suggest the move could trigger a wider transformation in the memory and substrate sectors.
According to reports from ET News and Wccftech, Nvidia has confirmed its intent to integrate SOCAMM into its next-generation AI products, sharing projected order quantities with key memory and substrate suppliers. The company's upcoming GB300 "Blackwell" platform will be among the first to adopt SOCAMM, alongside the Digits AI PC unveiled during Nvidia's GTC 2025 conference in March.
Micron secures first-mover advantage
Nvidia initially tapped Samsung Electronics, SK Hynix, and Micron to co-develop SOCAMM. However, Micron has emerged as the first memory maker to receive approval for volume production, outpacing its South Korean rivals in the race to support Nvidia’s latest architectures.
Designed for low-power, high-bandwidth AI computing, SOCAMM is built on LPDDR DRAM and represents a significant upgrade over conventional notebook DRAM modules such as LPCAMM. The new module boosts input/output speeds and data transfer rates while maintaining a compact, upgrade-friendly form factor.
Micron claims its SOCAMM delivers more than a 2.5x increase in bandwidth at one-third the size and power consumption of the traditional RDIMM modules used in servers.
From servers to PCs
While Nvidia’s initial SOCAMM deployment will focus on AI servers and workstations, the inclusion of the module in the Digits AI PC signals broader ambitions for the consumer market. Industry players believe this crossover potential will be key to scaling adoption.
Though the projected 600,000–800,000 units are dwarfed by Nvidia's planned procurement of 9 million HBM units in 2025, analysts say SOCAMM's introduction marks an inflection point. The module's appeal lies in bridging the gap between cost-effective, scalable memory and the performance demands of AI workloads.
Substrate makers eye growth opportunity
SOCAMM’s rise is also reshaping dynamics in the substrate sector. The module requires custom-designed PCBs, creating an entirely new category of demand. With Micron already in mass production and both Samsung and SK Hynix actively negotiating supply partnerships, the stage is set for intensifying competition among the world’s top DRAM vendors.
Substrate suppliers, too, are preparing for an inflection in demand. Industry insiders note that while early volumes are limited, a wave of large-scale orders could follow if Nvidia’s SOCAMM strategy gains traction, potentially triggering a fierce scramble among PCB vendors to capture the new business.