Samsung Electronics once held a commanding lead in the memory semiconductor sector, positioning itself exceptionally well to benefit from the surge in artificial intelligence. However, the South Korean tech powerhouse now lags behind its longtime rival, SK Hynix, in developing the next-generation chips that are crucial to AI silicon leader Nvidia.
The consequences have been dire: Samsung’s profits have plummeted, approximately $126 billion has been erased from its market value, and an executive publicly apologized for the company’s recent financial struggles.
Memory chips, essential for data storage, are ubiquitous in devices ranging from smartphones to laptops. For years, Samsung was the undisputed front-runner in this domain, outpacing SK Hynix and U.S. rival Micron.
As AI applications such as OpenAI’s ChatGPT surged in popularity, the need for infrastructure capable of training the large-scale models behind them became increasingly urgent. Nvidia has solidified its position as the front-runner in this arena with its graphics processing units (GPUs), which have become the benchmark for AI training among tech giants.
A pivotal element of this AI hardware is high-bandwidth memory (HBM), a cutting-edge memory technology that stacks multiple dynamic random access memory (DRAM) chips. Before the AI boom, however, HBM was a relatively niche market, and Samsung missed the opportunity to invest in its development.
“HBM has long been a specialized product, and Samsung has not directed its resources toward advancing it,” remarked Kazunori Ito, director of equity research at Morningstar, in an email to CNBC. “The technological complexities involved in stacking DRAMs, along with the limited size of the target market, led many to believe that the substantial development costs were unwarranted.”
Recognizing a gap in the market, SK Hynix seized the opportunity. The company proactively launched HBM chips that gained approval for use in Nvidia’s architecture, thereby forging a strong partnership with the tech giant. Notably, Nvidia’s CEO urged SK Hynix to expedite the supply of its next-generation chips, highlighting the critical role of HBM in Nvidia’s offerings. As a result, SK Hynix achieved a record quarterly operating profit in September.
Brady Wang, associate director at Counterpoint Research, said that “With robust investments in research and development and well-established industry partnerships, SK Hynix is positioned at the forefront of HBM innovation and market engagement.”
For its part, Samsung said that total HBM sales grew more than 70% quarter-on-quarter in the third quarter. The tech giant added that its current HBM3E product is in mass production and generating sales.
The South Korean tech company noted that development of its next-generation HBM4 is “underway according to plan” and that it is targeting the start of “mass production” in the second half of 2025.
Is There Any Light at the End of the Tunnel?
Analysts attribute Samsung’s slide to several factors, including insufficient investment in HBM and the loss of any first-mover advantage. Ultimately, Samsung has struggled to close the gap with SK Hynix on the HBM development roadmap.