
Samsung and Micron prep advanced HBM3E 3D chips for memory-intensive applications

by News7


Forward-looking: The AI boom is in full swing, and chip manufacturers are bringing new, advanced memory technologies to the table. Next-generation High Bandwidth Memory (HBM) is poised to deliver significant increases in both bandwidth and capacity, and Samsung is aiming to lead the industry.

Despite entering the HBM3E market somewhat late, Samsung is positioning its HBM3E 12H DRAM chips as a pioneering achievement in 3D stacked memory technology. The Korean giant's latest memory chips use a 12-layer stack, which the company says delivers a 50 percent increase in both performance and capacity over 8-layer HBM3E chips.

Samsung claims the HBM3E 12H chips can achieve up to 1,280 gigabytes per second (GB/s) of bandwidth while offering an unprecedented 36 gigabytes of capacity. Samsung managed to fit 12 layers into the same package by using advanced thermal compression non-conductive film (TC NCF), which keeps the stack at the same height as 8-layer chips, meeting current requirements for HBM memory packaging applications.
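For context, HBM bandwidth figures follow directly from the per-pin data rate and the 1,024-bit interface that every HBM stack exposes. The quick Python check below works backward from Samsung's 1,280 GB/s claim; the 10 Gb/s pin speed and the 3 GB (24 Gb) per-die capacity are inferred from the quoted totals, not figures from Samsung's announcement.

```python
# Back-of-the-envelope check of the HBM3E 12H figures quoted above.
# The 1,024-bit bus width is the standard HBM stack interface; the
# per-pin rate and per-die capacity are inferred, not vendor specs.

BUS_WIDTH_BITS = 1024   # bits per HBM stack interface
PIN_SPEED_GBPS = 10.0   # Gb/s per pin, inferred from the 1,280 GB/s claim

bandwidth_gb_s = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8
print(f"Per-stack bandwidth: {bandwidth_gb_s:.0f} GB/s")  # 1280 GB/s

LAYERS = 12
DIE_CAPACITY_GB = 3     # 24 Gb die, inferred from the 36 GB total
print(f"Stack capacity: {LAYERS * DIE_CAPACITY_GB} GB")   # 36 GB
```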

TC NCF brings additional benefits: it shrinks the gaps between stacked dies to seven micrometers, the smallest in the industry, while also reducing voids between layers. As a result, vertical DRAM density increases by 20 percent compared to HBM3E 8H chips. The manufacturing improvements also provide better thermal properties and a higher product yield, we’re told.

The Seoul-headquartered corporation anticipates that its latest generation of HBM3E (12H) chips will provide an “optimal” solution for AI accelerators with a growing appetite for DRAM. Compared to HBM3 8H chips, Samsung’s HBM3E 12H memory reportedly offers a 34 percent increase in average AI model training speed. Additionally, the company claims the number of simultaneous users of inference services can be expanded “more than 11.5 times.”

Samsung is currently providing samples of its first HBM3E 12H chips to select customers, with mass production expected in the first half of 2024. Meanwhile, Micron, another major player in the HBM3E market, has announced full-scale production of its latest 3D memory chips. The Idaho-based company is placing a significant bet on a “traditional” 8-layer HBM3E design to boost its financial performance in fiscal year 2024.

Micron will supply Nvidia with 24GB 8H HBM3E chips for the upcoming H200 Tensor Core GPU, a powerful AI accelerator set to hit the market in the second half of 2024. Similar to Samsung, Micron is positioning its HBM3E technology as a leading solution for memory-intensive applications and generative AI services.

Micron’s HBM3E chips offer pin speeds greater than 9.2 gigabits per second (Gb/s), delivering more than 1.2 terabytes per second (TB/s) of memory bandwidth. The company states that power consumption is 30 percent lower than competing products, and that the 24GB capacity lets data center operators achieve “seamless scalability” for extensive AI applications.
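The same arithmetic used for Samsung’s figures above checks out here: a 9.2 Gb/s pin speed across the standard 1,024-bit HBM interface works out to roughly 1.18 TB/s per stack, and eight of the same inferred 3 GB (24 Gb) dies give the quoted 24GB capacity. A minimal sketch, with the per-die capacity again an inference rather than a figure from Micron:

```python
# Sanity check of Micron's 8-high HBM3E figures. Only the 9.2 Gb/s pin
# speed and 24 GB total come from Micron; the rest is inferred.

PIN_SPEED_GBPS = 9.2
BUS_WIDTH_BITS = 1024   # standard HBM stack interface width

bandwidth_tb_s = PIN_SPEED_GBPS * BUS_WIDTH_BITS / 8 / 1000
print(f"Per-stack bandwidth: {bandwidth_tb_s:.2f} TB/s")  # ~1.18 TB/s

LAYERS = 8
DIE_CAPACITY_GB = 3     # 24 Gb die, inferred from the 24 GB total
print(f"Stack capacity: {LAYERS * DIE_CAPACITY_GB} GB")   # 24 GB
```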

Micron’s Executive VP and Chief Business Officer, Sumit Sadana, highlights that the company’s new HBM3E chips can support business growth amid the surging demand for AI. Looking ahead, Micron is preparing to sample its first 36GB 12-High HBM3E chips in March.

Source: TechSpot
