Beyond Nvidia: Why Lam Research, Oracle, and Micron Technology Are Prime AI Investments in 2024

Motley Fool - Wed Mar 20, 5:30AM CDT

While Nvidia (NASDAQ: NVDA) might be the poster child for the business side of artificial intelligence (AI), the true depth of this technological revolution lies in its vast ecosystem. In this arena, many players contribute to AI's expanding horizon from very different angles.

Three seasoned Motley Fool contributors have spotted opportunities that venture beyond Nvidia's glamorous, but perhaps overpriced, AI success. They spotlight Lam Research (NASDAQ: LRCX), a titan in semiconductor equipment, Oracle (NYSE: ORCL), a cloud-computing giant pivoting to AI, and Micron Technology (NASDAQ: MU), a key player in the indispensable memory sector. Each business is uniquely positioned to thrive in the AI era, promising robust stock returns along the way.

In the quest for AI dominance, there's more to the story than just one company. Let's see what savvy investors can learn from this diverse collection of AI insights and promising investment ideas.

This equipment maker is set to dominate key technology inflections that enable AI chips

Billy Duberstein (Lam Research): While Nvidia dominates the AI training chipset today, will it be as dominant in the future? It's certainly possible but not assured.

But no matter which AI chip wins out, some things are nearly guaranteed. One is that the AI revolution will require a slew of leading-edge logic, DRAM memory, and NAND flash storage, based on emerging chipmaking technologies.

Semiconductor capital-equipment player Lam Research makes the critical equipment that enables foundries to make all of these advanced chips, in some cases winning 100% market share of specific process steps. The processes Lam tends to dominate are in the etch and deposition of chip components in stacked or vertical structures. These verticalization steps will be increasingly crucial in new AI technologies.

The technologies include gate-all-around transistors, in which transistors are "stacked" and surrounded on all sides by gate nanosheets; backside power, in which power-delivery components are etched on the back of chips to allow for greater transistor content on the front; and advanced packaging, in which holes and connectors are drilled vertically through 3D-stacked chips and high-bandwidth memory. In addition, Lam management highlighted its new dry-resist technology, which better absorbs photons from extreme ultraviolet (EUV) lithography machines, as a major opportunity. Currently, dry resist is in lab testing at all major foundry customers.

These are each incremental billion-dollar opportunities, according to Lam's management. It means that whether Nvidia or one of its competitors gains the most share of AI chips or memory, Lam stands to benefit from these broader chipmaking-technology transitions.

Moreover, Lam's prior "core" business in NAND flash is in its worst downturn ever. Despite that, Lam still made nearly $26 per share over the past 12 months, in just about the worst environment imaginable for NAND investment.

Not only should NAND investment recover, but it will probably eventually exceed its prior highs, CFO Doug Bettinger said at a recent investor conference. That assertion seems likely in the context of AI growth as well. AI is already a known boon to both leading-edge logic and DRAM chips, but AI servers also require about three times the NAND flash memory of a traditional server. And if PCs and other edge devices eventually run AI models locally, those devices will need more "fast storage" -- i.e., NAND flash -- per device, too.

So when one thinks about a simultaneous AI-driven tech inflection in logic and DRAM chips, along with the prospect of an eventual NAND flash recovery later this year or next, it's pretty apparent Lam is in for some powerful earnings growth. Combine that with robust free-cash-flow generation, even in last year's trough, and a growing dividend, and Lam looks like an AI winner to hold over the long term.

Don't forget how many memory chips the AI boom requires

Anders Bylund (Micron Technology): Powerful AI systems require a lot of memory chips. And I mean a lot of memory chips. You know the Nvidia A100 AI accelerators that power the supercomputer OpenAI used to train ChatGPT's large language model (LLM)? That computing beast runs 3,456 computing nodes, each equipped with four A100 chips, and each A100 holds 64 gigabytes (GB) of DRAM memory right on the card. The nodes also hold 512 GB of shared DRAM managed by an Intel processor. That's 2.6 million gigabytes of random access memory, or 2.6 petabytes.

But that's not all. The Leonardo supercomputer also contains nearly 11 petabytes of flash memory in high-speed, solid-state drive (SSD) storage modules. Altogether, the ChatGPT-training supercomputer needs about 14 petabytes of high-end memory.
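Those figures do multiply out to the totals quoted above. A quick back-of-the-envelope sketch (node counts and capacities are the ones cited in the text, not independently verified):

```python
# Sanity-check the article's memory math for the ChatGPT-training
# supercomputer, using the figures as quoted in the text.
nodes = 3456           # computing nodes
gpus_per_node = 4      # Nvidia A100 accelerators per node
dram_per_gpu_gb = 64   # on-card DRAM per A100, in GB
shared_dram_gb = 512   # shared DRAM per node, in GB

gpu_dram_gb = nodes * gpus_per_node * dram_per_gpu_gb      # 884,736 GB
node_dram_gb = nodes * shared_dram_gb                      # 1,769,472 GB
total_dram_pb = (gpu_dram_gb + node_dram_gb) / 1_000_000   # ~2.65 PB

flash_pb = 11                                              # SSD storage, in PB
total_pb = total_dram_pb + flash_pb                        # ~13.7 PB, "about 14"

print(f"DRAM: {total_dram_pb:.2f} PB, total: {total_pb:.1f} PB")
```

The DRAM alone comes to roughly 2.65 petabytes, and adding the 11 petabytes of flash lands near the "about 14 petabytes" figure in the text.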

That's just one system, already belonging to an older generation of supercomputers. The next wave of LLM training platforms uses even more AI accelerators, more DRAM, and more flash-based storage. And since every tech giant worth its salt wants to take advantage of the AI boom, demand for these computing monsters runs high.

Micron Technology sees enormous value in the AI frenzy. And beyond the central data-crunching, AI workloads are also driving beefier -- and more memory-hungry -- computing systems in many other places.

"Generative AI use cases are expanding from the data center to the edge with several recent announcements of AI-enabled PCs, smartphones with on-device AI capabilities as well as embedded AI in the auto and industrial end markets," Micron CEO Sanjay Mehrotra said in December's first-quarter 2024 earnings call. "From the data center to the edge, AI has emerged as a significant secular driver that will further bolster the industry toward record revenue [total addressable market] in 2025 and drive growth for years to come."

AI computing will be a serious growth driver for Micron in the next few years. Unit prices are also on the rebound after a couple of weak years -- a common pattern in the highly cyclical memory-chip market. And amid these bullish signs, Micron shares trade at the affordable valuation ratios of 6 times sales and 13.5 times forward-earnings estimates. This leading provider of world-class memory chips is poised to soar as the AI mania plays out, and the stock is a bargain-bin find next to Nvidia's lofty valuation ratios.

A new cloud giant, getting lots of Nvidia help

Nicholas Rossolillo (Oracle): In the autumn of 2022, in the midst of the bear market and before all the Nvidia AI frenzy got rolling, an old and forgotten software company called Oracle quietly struck a deal with Nvidia.

You see, Oracle was still in the early stages of building its cloud-computing infrastructure service, in which it rents out the use of data-center computing to customers, who can access those services on demand via an internet connection. In a bid to catch up to the cloud-infrastructure giants Amazon's AWS, Microsoft's Azure, and Alphabet's Google Cloud, Oracle Cloud inked a deal to be the first to purchase and install Nvidia's AI-accelerated computing servers.

The rest, as they say, is history. Nvidia stock has gone off like a rocket since then.

But in all reality, Oracle, the old software giant of yesteryear, hasn't done so badly either. It's been putting up its best growth rates in years thanks to its head start plugging Nvidia chips and hardware into its cloud. The last quarter (which ended in February 2024) was another case in point. Revenue increased 7% year over year, but earnings per share jumped 16% year over year. Talk about positive operating leverage. A little additional revenue is trickling almost entirely down to Oracle's bottom line right now.
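That operating-leverage point checks out arithmetically: with the quoted growth rates, earnings grew more than twice as fast as sales. A minimal illustration (figures normalized to a base of 100, not Oracle's actual reported numbers):

```python
# Illustrate the operating leverage described above, using the quoted
# year-over-year growth rates (7% revenue, 16% EPS). Base figures are
# normalized to 100 for clarity; they are not Oracle's financials.
rev_growth = 0.07
eps_growth = 0.16

base_rev, base_eps = 100.0, 100.0
new_rev = base_rev * (1 + rev_growth)   # ~107
new_eps = base_eps * (1 + eps_growth)   # ~116

# EPS growth outpacing revenue growth is the hallmark of positive
# operating leverage: costs grew more slowly than sales.
leverage = eps_growth / rev_growth      # ~2.3x
print(f"EPS grew {leverage:.1f}x faster than revenue")
```

When costs are largely fixed, each incremental dollar of revenue carries a higher margin, which is why the bottom line compounds faster than the top line.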

Oracle's cloud-infrastructure segment (versus the slower-growing and larger software segments) is responsible for much of this expansion, increasing 49% year over year to $1.8 billion in sales. For investors feeling like they missed the boat on Nvidia stock, Oracle offers a way to access Nvidia's success via proxy -- and a relative value at that. Oracle shares are currently valued at just 22 times expected current fiscal-year earnings (Oracle's current fiscal year wraps up in May) and 20 times next fiscal year's expected earnings. If you like what Nvidia is cooking up but are worried about valuation, give Oracle a serious look.


John Mackey, former CEO of Whole Foods Market, an Amazon subsidiary, is a member of The Motley Fool’s board of directors. Suzanne Frey, an executive at Alphabet, is a member of The Motley Fool’s board of directors. Anders Bylund has positions in Alphabet, Amazon, Intel, Micron Technology, and Nvidia. Billy Duberstein has positions in Alphabet, Amazon, Lam Research, Micron Technology, and Microsoft. Nicholas Rossolillo has positions in Alphabet, Amazon, Lam Research, Micron Technology, and Nvidia. The Motley Fool has positions in and recommends Alphabet, Amazon, Lam Research, Microsoft, Nvidia, and Oracle. The Motley Fool recommends Intel and recommends the following options: long January 2023 $57.50 calls on Intel, long January 2025 $45 calls on Intel, long January 2026 $395 calls on Microsoft, short January 2026 $405 calls on Microsoft, and short May 2024 $47 calls on Intel. The Motley Fool has a disclosure policy.

Paid Post: Content produced by Motley Fool. The Globe and Mail was not involved, and material was not reviewed prior to publication.
