
Samsung Has Unveiled The ‘World’s Fastest’ Data Processing AI Chip To Date

Samsung’s newest memory chip, the HBM3E 12H, has raised the bar in the data-processing industry. The chip marks a significant advance in memory technology, and the company claims it is its highest-capacity offering to date for artificial intelligence (AI) applications.
The HBM3E 12H is the world’s first 12-stack HBM3E DRAM, designed to offer large data-storage capacity at a reasonable price. Yongcheol Bae, Executive Vice President of Memory Product Planning at Samsung, said the new memory solution was created in response to growing demand from AI service providers for higher-capacity HBM. Among its headline features is a bandwidth of up to 1,280 gigabytes per second (1.25 TB/s), which sets a new industry benchmark. The chip also offers 36 GB of capacity; both bandwidth and capacity are more than 50% greater than those of the previous 8-stack HBM3 8H.
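The “more than 50%” claim is easy to sanity-check with back-of-envelope arithmetic. The sketch below assumes the commonly reported HBM3 8H baseline figures of 24 GB capacity and roughly 819 GB/s bandwidth, which are not stated in the article itself:

```python
# Rough check of the ">50%" gain claim for the HBM3E 12H over the HBM3 8H.
# Baseline HBM3 8H figures (24 GB, ~819 GB/s) are assumed, not from the article.
hbm3e_12h = {"capacity_gb": 36, "bandwidth_gbps": 1280}
hbm3_8h = {"capacity_gb": 24, "bandwidth_gbps": 819}

capacity_gain = hbm3e_12h["capacity_gb"] / hbm3_8h["capacity_gb"] - 1
bandwidth_gain = hbm3e_12h["bandwidth_gbps"] / hbm3_8h["bandwidth_gbps"] - 1

print(f"Capacity gain:  +{capacity_gain:.0%}")   # +50%
print(f"Bandwidth gain: +{bandwidth_gain:.0%}")  # +56%
```

Under those assumed baseline numbers, capacity grows exactly 50% (36 GB vs 24 GB) and bandwidth roughly 56%, consistent with the article’s claim.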

Samsung’s latest technology is expected to be integrated into NVIDIA’s H200 GPUs, improving their performance by enabling faster data processing and transfer rates. The 12-layer design of the HBM3E 12H keeps the same height as 8-layer chips while meeting the requirements of existing HBM packaging. This design delivers considerable benefits, especially for taller stacks, and addresses concerns about die warping associated with thinner dies. The announcement comes shortly after Micron revealed mass production of its 24 GB 8-layer HBM3E, which also boasts high-speed capabilities. Samsung’s HBM3E 12H, however, stands out with its 12-layer design and improved non-conductive film (NCF) material, which increases vertical density by over 20% compared with its previous HBM3 8H product.

Samsung plans to significantly ramp up HBM chip production, which will intensify competition in the high-performance memory market. Micron, which currently holds a small share of the worldwide HBM market, is investing heavily in its next-generation product to compete with Samsung and SK Hynix. These advances in memory technology will continue to fuel innovation in AI, gaming, data centers, and high-performance computing systems, where high bandwidth and low power consumption are critical.
