SK Hynix Inc. reported a 62% jump in profit and revealed it’s sold its entire memory chip lineup for next year, illustrating how a global AI infrastructure buildout is ratcheting up sector-wide demand.
The top supplier of high-bandwidth memory to Nvidia Corp. plans to devote far more capital toward ramping up capacity next year, racing to expand output to ride the unprecedented spending led by industry leaders from OpenAI to Meta Platforms Inc. SK Hynix will begin supplying next-generation HBM4 components to customers this quarter, shifting to full-fledged sales in 2026.
SK Hynix offers investors an early glimpse into the rapidly expanding AI infrastructure sphere, ahead of earnings this week by sector linchpins including Meta and Amazon.com Inc. The Korean company’s partnership with Nvidia is central to that ecosystem, and its rapid growth underscores how big tech firms are hoovering up the chips, servers and other gear necessary to train and operate AI services. Shares in SK Hynix have roughly tripled in 2025.
“The insatiable HBM demand is likely to persist into next year, accelerated by massive projects like OpenAI’s Stargate,” on top of demand from hyperscalers and countries pursuing their own sovereign AI, said Greg Roh, an analyst with Hyundai Motor Securities Co.
In the September quarter, the company logged a record operating profit of 11.4 trillion won ($8 billion), slightly above the average of analyst estimates. Sales climbed to 24.5 trillion won, and its shares rose as much as 5% Wednesday morning in Seoul.
The results push back against warnings from some investors that market valuations have run too high, given the lack of mainstream AI applications and the circular nature of AI infrastructure financing and supply deals. They argue that OpenAI and Nvidia are fueling an increasingly complex web of business transactions that's artificially propping up the trillion-dollar artificial intelligence boom. Apart from Meta and Amazon, OpenAI-backer Microsoft Corp., Alphabet Inc. and Apple Inc. are also set to report results in coming days.
“I don’t believe we’re in an AI bubble,” Nvidia chief Jensen Huang said during a Bloomberg Television interview hours before SK Hynix’s results. “All of these different AI models we’re using — we’re using plenty of services and paying happily to do it.”
HBM has been selling out since 2023 and supply will remain tight in 2027, SK Hynix executives said on an earnings call.
In the longer run, many investors and tech firms are betting the advent of AI will trigger a “super-cycle” in the memory market, particularly for the HBMs required to make AI accelerators and power services like ChatGPT. The belief is that new applications for AI in areas like autonomous driving and robotics will emerge, boosting new AI chip entrants such as Qualcomm Inc. and further pushing up demand for high-end memory chips.
OpenAI alone has struck deals for data centers and chips that could easily top $1 trillion. SK Hynix and rival Samsung Electronics Co. have secured initial agreements to supply the US startup’s Stargate project and estimate that project will need more than twice the world’s current HBM capacity.
The fierce race to build AI capabilities is also constraining supplies of conventional memory, which is needed in AI data centers alongside higher-end chips. The global semiconductor market could grow at double-digit percentages for three consecutive years, a streak the industry hasn't seen in three decades, Eugene Investment & Securities Co. analyst Lee Seung-woo said.
Both SK Hynix and Samsung raised traditional memory chip prices by as much as 30% in the fourth quarter, according to a local media report. Several producers of older types of memory including Nanya Technology Corp. and Kioxia Holdings Corp. have seen their share prices more than double since the beginning of this year.
The impact is being felt across the industry. Xiaomi Corp. executives said last week that the company raised prices on its latest smartphones in part because of the rising cost of memory chips.