We all know that RAM is an indispensable component of any computer system: it provides temporary storage where the processor keeps data and intermediate results so it can access them quickly. (Strictly speaking, a CPU's access to RAM is fast compared with storage, but far from instantaneous; a main-memory access can still cost hundreds of clock cycles.)
But what if those calculations could be done by the RAM itself, so that the CPU found the results already waiting? This concept of memory that "thinks" for itself is precisely what Samsung presented during the Hot Chips 33 event, and it is what we are going to look at next.
During that event, Samsung announced that it intends to extend its processing-in-memory technology to its DDR4, GDDR6, and LPDDR5X chips, in addition to the HBM2 chips it has been working with for some time now.
Earlier this year, the Korean firm stated that its HBM2 memory delivers 1.2 TFLOPS of compute for AI tasks, allowing the memory to offload part of the work from the CPU (or GPU, FPGA, etc.). If this were extended to gaming GPUs, it would be a watershed moment for the industry.
Samsung proposes memory capable of performing calculations
The announcement made during the event introduces the Aquabolt-XL brand and comes hand in hand with AXDIMM DDR4 modules and LPDDR5 memory, both with integrated compute capability. These chips embed a small AI engine inside each DRAM chip, enabling in-memory processing. Data no longer has to travel from memory to the processor and back: since the computation happens where the data already resides, the processor only has to fetch the finished result. We literally save a whole round trip across the memory bus, and with it, time and energy.
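To make the "saved round trip" concrete, here is a minimal, purely illustrative model (not Samsung's API or architecture) that counts the bytes crossing the memory bus for a simple reduction done the conventional way versus in memory:

```python
# Illustrative model of data movement, not a real PIM interface.
# A reduction (e.g. summing N values) is computed two ways:
#  - conventional: every operand crosses the bus to the CPU
#  - in-memory:    the DRAM-side engine reduces locally and only
#                  the final result crosses the bus

def bytes_moved_conventional(n_values: int, value_size: int = 4) -> int:
    # All N operands travel to the CPU, and the result is written back.
    return n_values * value_size + value_size

def bytes_moved_in_memory(n_values: int, value_size: int = 4) -> int:
    # Only the single reduced result leaves the memory.
    return value_size

n = 1_000_000  # one million 4-byte values
print(bytes_moved_conventional(n))  # 4000004
print(bytes_moved_in_memory(n))     # 4
```

The numbers are a toy upper bound (real systems move cache lines, not single values), but they show why keeping the computation next to the data saves bandwidth and energy.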
Samsung’s Aquabolt-XL HBM-PIM chips are already in the company’s product lineup; they work with a JEDEC-compliant HBM2 controller, making them a drop-in replacement for Samsung’s standard HBM2 memory. At the event, Samsung demonstrated the new memory replacing its existing HBM2 in a Xilinx Alveo FPGA system without any modifications, resulting in a 2.5x performance boost and a 62 percent reduction in power consumption.
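Taken together, those two published figures imply a large efficiency gain. A quick back-of-the-envelope calculation, assuming the 2.5x speedup and 62% power cut apply to the same workload:

```python
# Rough arithmetic on Samsung's published figures for the
# Alveo FPGA demo: 2.5x performance at 62% less power.
perf_gain = 2.5
power_ratio = 1 - 0.62  # new power draw is 38% of the original

# Performance per watt improves by perf / power
perf_per_watt_gain = perf_gain / power_ratio
print(round(perf_per_watt_gain, 2))  # ~6.58x
```

In other words, if both figures hold simultaneously, the demo delivered roughly 6.5 times more performance per watt than the unmodified system.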
Although this Samsung in-memory processing technology is compatible with any existing memory controller, it should be noted that as CPU manufacturers (Intel, AMD, IBM, and so on) add explicit support for it, performance should improve further, at least in specific scenarios. Samsung says it is actively working with some manufacturers to adopt it in future products, but because everything follows the standard, it should work in any system that supports HBM2 (for example, Intel’s Sapphire Rapids or AMD’s Genoa).
As usual, the in-memory processing Samsung has shown would be especially useful in data centers, largely because it is well suited to AI workloads. However, the company also envisions this technology reaching consumer platforms, since the gains are obvious, and to that end it has also demonstrated a new RAM module format: AXDIMM.
For now, AXDIMM is a prototype acceleration DIMM that performs data processing on a buffer chip integrated into the module itself. Samsung claims these modules can be installed in any server that supports DDR4 RAM in LRDIMM or UDIMM format (presumably DDR5 support is on the way), implying that the pin count and form factor are identical to DDR4.
Samsung aims to set a new standard
During the Hot Chips 33 event, Samsung showcased this new type of memory in operation: it can complete certain processing entirely on its own and, as previously stated, relieve the load on the system’s central CPU. For the time being, this processing is limited to specific kinds of AI operations, but the firm is enthusiastic about its integration and potential.
Taking advantage of the fact that these “thinking” DRAM chips are compatible with any memory controller, and that the modules fit physically into existing memory sockets, the company has exhibited versions of HBM2, DDR4, LPDDR5X, and even GDDR6 graphics memory.
In this regard, the company is especially upbeat, as CXL support may also be on the horizon. Integrating such chips into the graphics memory of a current graphics card could deliver a leap in features and performance that until now we could only dream about.
Finally, the company has said that the first Aquabolt-XL HBM2 products are already available for manufacturers to purchase today, while the rest of the lineup, already in development, will join its portfolio very soon. In the long term, then, we could start to see products integrating this Samsung memory that processes data where it is stored, which in theory means a noticeable increase in both performance and energy efficiency.
Whether this is the future, only time will tell, but Samsung is tremendously optimistic and is betting that it is. It remains to be seen whether, and how, industry manufacturers will integrate this memory into their products.