Preparation Underway for SOCAMM 2 Sample Production as Samsung, SK Hynix, and Micron Join Forces

Nvidia's groundbreaking AI GPU memory format may expand to other systems, posing a potential threat to LPDDR6 on the horizon.

The technology landscape is abuzz with the impending arrival of SOCAMM 2, a new memory module that promises faster transfer speeds and potential game-changing implications for data centers and creative professionals alike.

Currently, the Blackwell Ultra GB300 NVL72 platform dominates GPU discussions in data center circles. SOCAMM 2 could shift that conversation, and its influence may eventually extend to workstation hardware used by video editors and other creative professionals.

Samsung Electronics, SK Hynix, and Micron are all gearing up for the second-generation SOCAMM 2 design, aiming to broaden the supplier base and create a more competitive market. This collaborative approach could attract JEDEC involvement, which would make it easier for other companies to adopt similar modules in their systems; if that happens, SOCAMM 2 could evolve into a new industry standard.

However, analysts remain cautious about the timing of SOCAMM 2's arrival, as it coincides with the acceleration of LPDDR6 development. Industry estimates suggest that SOCAMM 2 will not be available in volume until early next year, with Samsung and SK Hynix planning for mass production in the third quarter.

One of SOCAMM 2's key selling points is its higher transfer rate: 9,600 MT/s, up from 8,533 MT/s in the earlier generation. At the platform level, that translates to a rise in system bandwidth from around 14.3 TB/s to roughly 16 TB/s.
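
As a rough sanity check on those figures, the short Python sketch below scales the quoted 14.3 TB/s baseline by the ratio of the new and old transfer rates. It assumes, as a simplification, that system bandwidth grows linearly with the per-pin data rate (ignoring any changes in channel count or efficiency), and the variable names are illustrative only.

    # Back-of-the-envelope bandwidth scaling (assumption: bandwidth is linear in transfer rate)
    OLD_RATE_MTS = 8_533   # first-generation SOCAMM transfer rate, MT/s
    NEW_RATE_MTS = 9_600   # SOCAMM 2 transfer rate, MT/s
    OLD_BW_TBS = 14.3      # quoted system bandwidth at the old rate, TB/s

    scaled_bw_tbs = OLD_BW_TBS * NEW_RATE_MTS / OLD_RATE_MTS
    print(f"Estimated system bandwidth at {NEW_RATE_MTS} MT/s: {scaled_bw_tbs:.1f} TB/s")
    # Prints roughly 16.1 TB/s, in line with the ~16 TB/s figure quoted above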

Moreover, the SOCAMM 2 module is said to consume less power than standard DRAM-based RDIMMs, though that claim still needs validation under real server workloads. This power efficiency could be a significant advantage, particularly in data-intensive applications.

Notably, Nvidia, a leader in AI hardware, has integrated SOCAMM 2 technology directly into its DGX Station Gen2, and the expected large-scale availability of the memory aligns with that system's 2025 launch.

The first generation of SOCAMM modules was manufactured only by Micron, creating a single point of dependency. With the involvement of multiple manufacturers, SOCAMM 2 aims to address this issue and provide a more robust and reliable solution.

However, whether SOCAMM 2 becomes the answer to the growing demand for memory that can relieve data bottlenecks, or just another option in a crowded field, will depend on execution, standardization, and the pace of LPDDR6's rollout. Discussions about eventually adopting LPDDR6 within the format suggest that SOCAMM 2 is being designed with long-term scalability in mind.

Despite the technical challenges that led to the abandonment of the SOCAMM 1 project, the promise of SOCAMM 2 has kept the industry excited. With its detachable module form factor and 694 I/O ports, SOCAMM 2 is poised to make a significant impact on the memory technology landscape.
