The Korea Herald


Nvidia’s new AI chip to offer boon for Korean chip giants’ HBM business

Samsung, SK hynix dominate nascent but promising advanced memory chip market

By Jo He-rim

Published : Aug. 10, 2023 - 14:23


Nvidia GH200 Grace Hopper platform (Nvidia)

Nvidia’s newest AI chip, built with the world’s first HBM3e memory, is expected to boost growth for SK hynix and Samsung Electronics, the two chipmakers dominating the still-nascent High Bandwidth Memory market.

Nvidia, a leading US-based producer of GPUs, unveiled its next-generation GH200 Grace Hopper platform on Tuesday at this year's SIGGRAPH conference, an annual computer graphics event held in the US from Sunday to Thursday.

The latest chip was built for the era of accelerated computing and generative AI, based on a new Grace Hopper Superchip and using the world’s first HBM3e memory to raise processing speeds, the company said.

“To meet surging demand for generative AI, data centers require accelerated computing platforms with specialized needs,” said NVIDIA founder and CEO Jensen Huang.

“The new GH200 Grace Hopper Superchip platform delivers this with exceptional memory technology and bandwidth to improve throughput, the ability to connect GPUs to aggregate performance without compromise, and a server design that can be easily deployed across the entire data center,” Huang added.

The CEO explained that the latest platform will be available in a wide range of configurations, as it was created to handle the world’s most complex generative AI workloads, spanning large language models, recommender systems and vector databases.

Leading system manufacturers are expected to deliver systems based on the platform in the second quarter of 2024, Nvidia said.

The chip has been designed to deliver up to 3.5 times more memory capacity and three times more bandwidth than the current generation offers, the company added.

SK hynix's latest HBM3 chip (SK hynix)

While the US chipmaker did not reveal which company will supply the HBM3e for its chips, the industry expects South Korean chipmakers SK hynix or Samsung to be selected, as they are the two dominant suppliers in the global HBM market.

HBM is considered an advanced memory chip with high input and output speeds, able to handle the greater computational demands that come with the rise of generative AI technology. For now, HBM accounts for only about 1.5 percent of the DRAM market, but industry watchers expect it to grow quickly on demand for massive AI servers.

Market tracker TrendForce said HBM could help overcome hardware-related bottlenecks in AI development, predicting global demand would reach 290 million gigabytes this year, up 60 percent on-year.
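As a rough illustration (assuming the 60 percent growth figure is measured against last year's total demand, which the article does not state explicitly), the forecast implies prior-year demand of roughly 181 million gigabytes:

    # Back-of-the-envelope sketch; year labels and calculation are assumptions for illustration.
    forecast_this_year_gb = 290_000_000   # TrendForce forecast for this year, in gigabytes
    growth_rate = 0.60                    # 60 percent on-year growth
    implied_prior_year_gb = forecast_this_year_gb / (1 + growth_rate)
    print(f"Implied prior-year demand: {implied_prior_year_gb / 1e6:.0f} million GB")  # about 181 million GB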

While HBM3 is the most up-to-date version of the chip on the market, the HBM3e memory to be used in Nvidia’s newest AI chip is 50 percent faster, Nvidia said. HBM3e delivers a total of 10 terabytes per second of combined bandwidth, allowing the new platform to run models 3.5 times larger than the previous version while delivering three times the memory bandwidth, the company said.

Samsung's unique HBM Processing-In-Memory chip (Samsung Electronics)

The two Korean chipmakers, which together account for almost 90 percent of the global HBM market, are ramping up efforts to expand their businesses.

SK hynix has been considered the front-runner in the race, holding almost 50 percent of the market as of 2022, according to data released by TrendForce. For this year, the market tracker expects Samsung and SK hynix to each hold 46 to 49 percent of the market, with Micron Technology taking 4 to 6 percent.

In April, SK hynix announced that it had succeeded in developing the world’s first 12-layer HBM3 chip. At present, it is the only company capable of mass-producing HBM3 chips.

Samsung currently produces HBM2 and HBM2E chips, with plans to start mass production of 8-layer HBM3 and 12-layer HBM3e in the second half of this year.

Samsung recently purchased more advanced equipment with the goal of doubling its HBM production capacity by the end of next year. It is reportedly planning to invest some 1 trillion won ($758 million) to bolster production facilities.

SK hynix is also considering an investment of some 1 trillion won to expand the production capacity of its HBM plant in Icheon, Gyeonggi Province.