As the tech world gears up for CES 2025, all eyes are on SK hynix, the South Korean semiconductor giant, which is set to introduce a significant leap in memory technology with its 16-Hi HBM3E memory chips. This innovation promises to redefine the landscape of AI computing by offering unprecedented memory capacity and performance.
At the heart of SK hynix’s showcase at CES 2025 will be the 16-Hi HBM3E memory chips, a feat of engineering in which 16 DRAM dies are stacked vertically. This configuration achieves a capacity of up to 48GB per stack, positioning it as a leader in high-bandwidth memory (HBM) technology. SK hynix first teased the development in November 2024, announcing that the chips were in progress and projecting that their yield would match that of the current 12-layer HBM3E, the most advanced HBM currently in mass production.
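The 48GB figure follows directly from the stack arithmetic. A minimal sketch, assuming each layer is a 24Gb (3GB) DRAM die, the density used in today's 12-Hi HBM3E parts:

```python
# Per-stack capacity arithmetic for a 16-Hi HBM3E stack.
# Assumption (not stated in the article): 24 Gb DRAM dies per layer,
# the same density as current 12-Hi HBM3E (12 x 3 GB = 36 GB).
GBITS_PER_DIE = 24        # assumed die density in gigabits
LAYERS = 16               # "16-Hi" = 16 stacked dies

capacity_gb = LAYERS * GBITS_PER_DIE / 8   # gigabits -> gigabytes
print(f"{capacity_gb:.0f} GB per stack")   # 48 GB per stack
```

The same arithmetic explains why a 12-Hi stack tops out at 36GB: capacity scales linearly with layer count at a fixed die density, which is what makes taller stacks attractive for memory-hungry AI accelerators.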
The engineering behind these chips involves an advanced mass reflow-molded underfill (MR-MUF) process. This method not only allows for the stacking of 16 layers but does so while controlling chip warpage—a common challenge in high-density memory stacking. It also maximizes thermal performance, crucial for maintaining stability and efficiency in high-performance computing environments like AI accelerators.
Looking ahead, SK hynix has hinted at the possibility of scaling up to 20-layer stacks without the need for hybrid bonding, a testament to their confidence in the MR-MUF process’s capabilities. Hybrid bonding, while promising, represents a future step in semiconductor packaging technology that could further reduce the size and increase the efficiency of memory chips.

Justin Kim, Chief Marketing Officer at SK hynix, articulated the company’s vision at CES, stating, “We will broadly introduce solutions optimized for on-device AI and next-generation AI memories, as well as representative AI memory products such as HBM and eSSD at this CES. Through this, we will publicize our technological competitiveness to prepare for the future as a Full Stack AI Memory Provider.” This statement not only underlines the focus on AI but also positions SK hynix as a comprehensive solution provider in the memory sector.
Echoing this sentiment, CEO Kwak Noh-jung highlighted the broader implications of their technological advancements: “The changes in the world triggered by AI are expected to accelerate further this year, and SK hynix will produce 6th generation HBM (HBM4) in the second half of this year to lead the customized HBM market to meet the diverse needs of customers. We will continue to do our best to present new possibilities in the AI era through technological innovation and provide irreplaceable value to our customers.”
The introduction of 16-Hi HBM3E chips could significantly impact AI GPU designs, offering developers and manufacturers more memory in a smaller footprint, which is essential for training complex AI models and handling large datasets with speed and efficiency. With AI becoming increasingly integral across various sectors from autonomous vehicles to healthcare, the demand for such high-performance memory solutions is only expected to grow.