A man walks past the company’s logo in the lobby of SK hynix’s Bundang office in Seongnam, South Korea, on January 29, 2021.
Jung Yeon-je | AFP | Getty Images
SK Hynix, one of the world’s largest memory chip makers, said on Thursday that second-quarter profit hit its highest level in six years as it maintained its leadership in advanced memory chips critical to artificial intelligence computing.
Here’s how SK hynix’s second-quarter results compare to the LSEG SmartEstimate, which is weighted based on more consistent and accurate analyst forecasts:
- Revenue: 16.42 trillion won (approximately $11.86 billion) vs. 16.4 trillion won expected
- Operating profit: 5.47 trillion won vs. 5.4 trillion won expected
June quarter operating profit hit its highest level since the second quarter of 2018, rebounding from a 2.88 trillion won loss in the same period last year.
Revenue from April to June increased 124.7% from 7.3 trillion won a year ago. This is the highest quarterly revenue in the company’s history, according to LSEG data dating back to 2009.
SK hynix said the overall price of its memory products continued to rise on strong demand for AI memory, including high-bandwidth memory (HBM), driving revenue up 32% from the previous quarter.
The South Korean giant supplies companies such as Nvidia with high-bandwidth memory chips that meet the needs of artificial intelligence chipsets.
SK Hynix shares fell 7.81% on Thursday morning.
The decline came as South Korea’s Kospi fell 1.91% after U.S. tech stocks sold off overnight on disappointing earnings from Alphabet and Tesla. The reports marked investors’ first look at how large tech companies performed in the second quarter.
“Strong demand for AI servers is expected to continue in the second half of this year with the launch of AI-enabled PCs and mobile devices, and traditional markets will gradually recover,” the company said in an earnings call on Thursday.
Taking advantage of the strong demand for artificial intelligence, SK hynix plans to “continue to maintain its leading position in the HBM market through mass production of 12-layer HBM3E products.”
The company will begin mass production of the 12-layer HBM3E after providing samples to major customers this quarter, and is expected to ship to customers in the fourth quarter.
Supply is tight
Memory leaders such as SK Hynix have been actively expanding HBM production capacity to meet the growing demand for artificial intelligence processors.
HBM requires more wafer capacity than conventional DRAM (dynamic random access memory, a type of computer memory used to store data), which SK hynix said is also facing supply constraints.
“Investment demand is also rising in order to meet the demand for conventional DRAM as well as HBM. HBM requires more wafer production capacity than ordinary DRAM. Therefore, the capital expenditure level this year is expected to be higher than what we expected at the beginning of the year,” SK hynix said.
“While industry capacity is expected to increase next year due to increased investment, a large part of it will be used to increase HBM production. Therefore, the tight supply situation of conventional DRAM is likely to continue.”
SK Kim of Daiwa Capital Markets said in a June 12 report that he expects “tight supply of HBM and memory to continue until 2025 due to bottlenecks in HBM production.”
“As such, we expect the favorable price environment to continue and SK hynix to benefit from its competitiveness in HBM for AI graphics processing units and high-density enterprise SSDs (eSSDs) for AI servers, recording strong earnings in 2024-2025 and driving a re-rating of the stock,” Kim said.
As large language models such as ChatGPT drive the explosive adoption of artificial intelligence, the supply of high-bandwidth memory chips is already stretched thin.
Analysts warn that the boom in artificial intelligence is expected to strain the supply of high-end memory chips this year. SK Hynix and Micron said in May that they had sold out of their high-bandwidth memory chips for 2024 and that their 2025 supply was nearly sold out as well.
Large language models require large numbers of high-performance memory chips because such memory allows the models to remember details of past conversations and user preferences in order to generate human-like responses.
SK Hynix dominates the high-bandwidth memory market and was the sole supplier of Nvidia’s HBM3 chips until rival Samsung reportedly passed qualification tests for its HBM3 chips to be used in Nvidia processors for the Chinese market.
The company said it expects to begin shipping the next-generation 12-layer HBM4 in the second half of 2025.