AI-driven storage continues to evolve: Samsung's PIM technology nears mass production, potentially bypassing CPUs and GPUs to compute directly in memory


AI is reshaping the supply and demand landscape of the storage market with unprecedented force, while also giving rise to a new wave of technologies. Following the emergence of “black technologies” such as HBF and H³, a new direction is now emerging in the storage field.

According to media reports, Samsung Electronics plans to apply PIM technology to LPDDR5X memory. Currently, Samsung is collaborating with major clients to develop LPDDR5X PIM technology, with samples expected to be available in the second half of this year. Additionally, both parties are actively exploring the specific standards for applying PIM technology to the next-generation standard LPDDR6.

PIM, short for Processing in Memory, refers to integrating processing units directly into memory modules, placing computational units (ALUs) at the storage level. Traditional architectures must transfer data to a CPU or GPU for processing; PIM instead performs computations inside the memory itself, which is expected to break through the “memory wall.”
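The data-movement saving behind the “memory wall” argument can be sketched with a toy byte-counting model. The bank count and element size below are illustrative assumptions, not Samsung specifications:

```python
# Toy model of the "memory wall": compare bytes crossing the memory
# bus for a reduction done on the CPU vs. done in memory (PIM-style).
# Bank count and element size are illustrative assumptions only.

def bytes_moved_cpu(n_elements: int, elem_size: int = 4) -> int:
    """CPU-side sum: every element must cross the bus once."""
    return n_elements * elem_size

def bytes_moved_pim(n_banks: int, elem_size: int = 4) -> int:
    """In-memory sum: each bank reduces its data locally, so only
    one partial result per bank crosses the bus."""
    return n_banks * elem_size

n = 1_000_000   # elements to reduce
banks = 16      # hypothetical number of PIM-enabled banks

print(bytes_moved_cpu(n))      # → 4000000
print(bytes_moved_pim(banks))  # → 64
```

The point of the sketch is that the traffic saved scales with the data size, not the result size, which is why vendors frame PIM as a bandwidth and energy-efficiency play rather than a raw-compute one.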

In a recent keynote speech at SEMICON Korea 2026 held in South Korea, Samsung Electronics DRAM design team leader Sun Gyo-min emphasized the necessity of PIM technology, stating: “Currently, due to insufficient memory bandwidth, AI cannot fully leverage GPU performance.” In his view, PIM can not only significantly increase bandwidth but also greatly improve energy efficiency.

Currently, Samsung has completed proof-of-concept (PoC) testing for HBM-PIM and other products and is moving into the commercialization phase, preparing for mass production. The core product for this technology is the LPDDR series, optimized for smartphones and edge AI devices.

In addition, SK Hynix is also actively developing PIM. At this year's CES 2026 exhibition in the United States, SK Hynix showcased several innovative products and technologies, including AiMX, which is based on a PIM architecture. Shanghai Securities pointed out that as AI deployment accelerates and information flows grow, memory chips have evolved from commodity components into core value products of the AI industry; through technological breakthroughs and ecosystem collaboration, vendors aim to build core competitiveness in AI storage.

China Post Securities states that, as a new computing architecture, integrated storage and computing fully merges storage and computation, layering compute capability inside memory and performing two- and three-dimensional matrix calculations with new, efficient computing architectures. Combined with advanced packaging and new memory devices in the post-Moore era, this approach can overcome the von Neumann bottleneck and deliver an order-of-magnitude improvement in computational efficiency. PIM embeds processing units into memory chips, giving memory a degree of compute capability; it suits data-intensive tasks and significantly improves data-processing efficiency and energy efficiency.
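The matrix calculations described above can be sketched as a bank-parallel matrix-vector multiply, where each memory bank works only on the rows it stores and only small partial results leave the memory. The bank count and data layout here are assumptions for illustration, not any vendor's actual design:

```python
import numpy as np

def banked_matvec(matrix: np.ndarray, vec: np.ndarray,
                  n_banks: int = 4) -> np.ndarray:
    """Sketch of a PIM-style matvec: rows are partitioned across
    hypothetical memory banks, each bank multiplies its rows locally,
    and only the small per-bank outputs would cross the memory bus."""
    rows_per_bank = np.array_split(matrix, n_banks, axis=0)
    partials = [rows @ vec for rows in rows_per_bank]
    return np.concatenate(partials)

A = np.arange(12.0).reshape(4, 3)
x = np.ones(3)
assert np.allclose(banked_matvec(A, x), A @ x)
```

The numerical result is identical to an ordinary matvec; what changes in a real PIM design is where the multiply-accumulate happens and how much data moves.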

CITIC Securities notes that in current memory architectures for computing, DRAM performance (bandwidth and capacity) is the biggest bottleneck: the larger the model being trained, the more memory capacity is required, while in inference, the more concurrent users, the greater the bandwidth demand (training is bound mainly by capacity, inference mainly by bandwidth), so an upgrade is urgently needed. In the AI era, the necessary storage upgrade, integrated storage and computing, is an inevitable long-term trend, and near-memory computing (PNM) is the currently practical path.

(Article source: Caixin)
