

Better Artificial Intelligence Stock: Nvidia vs. Super Micro Computer

Motley Fool - Mon Oct 2, 2023

The current market is torn between the believers in artificial intelligence and the skeptics. For the believers, this year's outsize move in AI stocks is just a taste of things to come, given that we are still in the early innings of this technological sea change. For skeptics, today's huge growth in AI-related names amounts to a pull-forward of demand and a near-term bubble.

But if you're in the former camp, which I lean toward, then the August-to-September pullback presents an opportunity to get in on long-term AI winners after short-term traders have taken profits. Meanwhile, two of the biggest winners year to date are Nvidia (NASDAQ: NVDA) and Super Micro Computer (NASDAQ: SMCI). Having roughly tripled this year, both stocks have also pulled back 15.5% and 27.9%, respectively, from their recent all-time highs.

For those looking to get in on the AI megatrend, which is the better buy right now?

Nvidia's and Super Micro's AI advantages

Why have these two stocks been such AI winners? Because each has been building a business model optimized for AI computing not just this year, but actually over the course of the last couple of decades. So when the AI trend started taking off in earnest, both were in pole position to capitalize.

Nvidia's background was in high-end graphics chips for gamers. But in 2006, Jensen Huang and his team realized that GPUs could be programmed for general-purpose computing, making them ideal for massive workloads that need to be processed in parallel. Thus, the company developed its CUDA software platform, allowing developers to use GPUs not just for graphics but also for high-performance computing.

So this year's gains are far from an "overnight success"; they are the payoff of nearly 20 years of groundwork.

Similarly, Super Micro Computer has long been pegged as a "commodity" server provider in a highly competitive, low-margin industry; however, it's more differentiated than many think. In fact, SMCI has been profitable every year since its founding in 1993.

As CEO Charles Liang explained in an interview just last week, Super Micro was founded and run in Silicon Valley, where costs are higher than for Asian original design manufacturers (ODMs). Therefore, SMCI had to differentiate, and it does so in three main ways.

First, it developed a "building block" architecture for its servers rather than mass-producing standardized models. By optimizing each server component at a more granular level, SMCI can mass-customize servers for specific applications and exacting customer requirements. Second, the company has focused on energy-efficient designs and cooling systems for about two decades, long before it was fashionable. With AI systems being incredibly power-hungry and heat-generating, SMCI's energy-efficient designs are in high demand for AI as well. And finally, SMCI's Silicon Valley presence enables a closer working relationship with Nvidia and other chipmakers, often allowing Super Micro to be first to market with the latest designs and solutions.

A server room in blue light.

Image source: Getty Images.

How strong is the moat?

Nvidia has a strong head start in AI, but it's unclear whether the company will run away with the vast majority of AI chip spending. Certainly, its lead looks safe for the next few quarters. However, Advanced Micro Devices (NASDAQ: AMD) will soon be ramping up its MI300 AI chip, and Intel's (NASDAQ: INTC) Gaudi accelerators are already in use, with a leading-edge version called Falcon Shores set to hit the market in 2025.

Nvidia's CUDA software will make its lead difficult to overcome. But AMD, Intel, and the rest of Big Tech are also collaborating on building out open-source AI programming languages that can program not only Nvidia GPUs but also others. So while its lead is formidable, Nvidia has a lot of well-funded companies gunning for its hardware and software moats.

Meanwhile, for a higher-end standard-model server original equipment manufacturer such as Dell or HP Enterprise, copying Super Micro's building-block modular solutions would probably be difficult, as these high-end server providers would have to overhaul their entire manufacturing architecture to do so. Asian ODMs, which sell individual server components in "white box" solutions, may be able to compete in modular designs in the future, but that will be difficult as well, since most component ODMs don't do the kind of optimized integration Super Micro does.

So looking two or three years out, it's possible competitors can catch up to Nvidia and Super Micro, if they invest and innovate aggressively. But both companies seem to have formidable moats for the next couple of years.

Valuation based on AI growth

Obviously, Nvidia looks much more expensive than Super Micro, at 105 times earnings versus Super Micro's P/E ratio of 23. But these trailing ratios don't tell us much; value will hinge on each company's forward growth.

Of note, AMD's Lisa Su recently forecast that AI accelerators will grow at a 50% annualized rate, from a $30 billion market recently to a $150 billion market by 2027. Meanwhile, back in May, research firm Trendforce estimated that AI server shipments would grow nearly 40% this year, then at a high-20% rate for the next three years through 2026. However, this is for shipments, not revenue. Given that AI-related servers are much more expensive than regular servers, revenue growth associated with shipments could be higher.
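Su's forecast can be sanity-checked with a quick compound-growth calculation. The inputs below are the figures cited above; the 50% rate is her forecast, not mine:

```python
# Compound-growth check on the AI accelerator forecast cited above.
# All figures come from the article; nothing here is an official projection.
market_2023 = 30e9   # ~$30 billion AI accelerator market today
cagr = 0.50          # forecast annualized growth rate
years = 4            # 2023 -> 2027

market_2027 = market_2023 * (1 + cagr) ** years
print(f"Implied 2027 market: ${market_2027 / 1e9:.0f}B")  # ≈ $152B, close to the ~$150B cited
```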

Assuming a 50% compound annual growth rate for 2024 through 2026, both Nvidia and Super Micro look much cheaper than their trailing valuations suggest.

Nvidia made over $10 billion in data center revenue last quarter out of $13.5 billion in total revenue, and it has guided for $16 billion in total revenue next quarter. That seems to put it on pace for about $40 billion in data-center revenue in 2023. Assuming a 50% growth rate from 2024 through 2026, Nvidia's data-center revenue would reach $135 billion, with total revenue probably closer to $150 billion or so.

Nvidia's non-GAAP net profit margin was a stunning 50% last quarter; assuming it can maintain that pace, Nvidia would be earning about $75 billion in 2026. If achieved, Nvidia trades at just 14.3 times that 2026 earnings projection today.
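The Nvidia arithmetic above can be laid out explicitly. All inputs are the article's rough figures, and holding the 50% net margin flat through 2026 is an assumption, not a forecast:

```python
# Nvidia projection using the article's rough figures (illustrative only).
dc_run_rate_2023 = 40e9    # ~$10B/quarter data-center revenue, annualized
growth_rate = 0.50         # assumed AI CAGR for 2024-2026
years = 3

dc_revenue_2026 = dc_run_rate_2023 * (1 + growth_rate) ** years
print(f"2026 data-center revenue: ${dc_revenue_2026 / 1e9:.0f}B")   # $135B

total_revenue_2026 = 150e9  # article's rough total, including non-data-center segments
net_margin = 0.50           # last quarter's non-GAAP margin, held constant (assumption)
earnings_2026 = total_revenue_2026 * net_margin
print(f"2026 earnings: ${earnings_2026 / 1e9:.0f}B")                # $75B
```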

Meanwhile, Super Micro earned 52% of revenue from AI systems last quarter, equating to $1.13 billion. Assuming continued growth, Super Micro is probably on track to make $4 billion in AI-related revenue in 2023 out of about $8 billion in total revenue. Assuming a 50% growth rate in AI servers and modest growth in non-AI servers, AI-related revenue would reach $13.5 billion in 2026 and total revenue would be around $18 billion. At a 10% net profit margin -- margins were 9% over the past 12 months -- Super Micro would be earning $1.8 billion in 2026. Today, Super Micro trades at just eight times that profit outlook.
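The same arithmetic with the Super Micro inputs; again, the 50% AI growth rate and the 10% margin are assumptions from the text, not forecasts:

```python
# Super Micro projection using the article's rough figures (illustrative only).
ai_revenue_2023 = 4e9       # estimated 2023 AI-related revenue
growth_rate = 0.50          # assumed AI server CAGR for 2024-2026
years = 3

ai_revenue_2026 = ai_revenue_2023 * (1 + growth_rate) ** years
total_revenue_2026 = 18e9   # ~$13.5B AI plus modest growth in non-AI servers
net_margin = 0.10           # assumed, vs. ~9% over the trailing 12 months

earnings_2026 = total_revenue_2026 * net_margin
print(f"2026 AI revenue: ${ai_revenue_2026 / 1e9:.1f}B")   # $13.5B
print(f"2026 earnings: ${earnings_2026 / 1e9:.1f}B")       # $1.8B
```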

Can Nvidia and Super Micro do better than these figures?

Of course, Nvidia gets a big premium today by virtue of its near-monopoly on general-purpose AI GPUs, with an estimated 80% to 95% market share.

Meanwhile, Super Micro has only an estimated 7% market share of the global server market, according to analysts at Barclays. However, within the AI server market, analysts at Northland Capital recently estimated that Super Micro's market share jumped from 7% to 17% over the course of the June quarter -- a rather stunning improvement in a short time period. The 50% of revenue from AI systems was also much higher than at its peers, who generally reported about 20% of revenue from AI systems last quarter.

So while Nvidia looks strong today, it may be easier for Super Micro to expand on its 17% AI market share than for Nvidia to expand on its roughly 90% share. And given that Super Micro can make servers for Nvidia, AMD, or Intel, it can still grow even if these challengers eventually make inroads.

The verdict

I own Super Micro and not Nvidia. But if I had to buy one today, it would be a difficult choice. After all, Nvidia isn't that much more expensive based on these very rough 2026 projections.

Still, I think it will be easier for Super Micro to improve on its lower market share and expand its lower margins today than it will be for Nvidia to improve on its already-high market share and margins. That's why, even though Nvidia may have more proprietary AI technology and thus be the more essential AI company, Super Micro may actually be the better AI stock, given its lower valuation and prospects for improvement.

After all, SMCI has actually outperformed Nvidia's stock this year. While both should continue to do well, I'd expect SMCI's stock to continue doing a tad better going forward. But I also wouldn't scoff at anyone picking Nvidia or owning both stocks for their high-growth AI plays.


Billy Duberstein has positions in Super Micro Computer and has the following options: short January 2025 $110 puts on Super Micro Computer, short January 2025 $125 puts on Super Micro Computer, short January 2025 $130 puts on Super Micro Computer, short January 2025 $280 calls on Super Micro Computer, short January 2025 $380 calls on Super Micro Computer, and short January 2025 $85 puts on Super Micro Computer. His clients may own shares of the companies mentioned. The Motley Fool has positions in and recommends Advanced Micro Devices and Nvidia. The Motley Fool recommends Barclays Plc, Intel, and Super Micro Computer and recommends the following options: long January 2023 $57.50 calls on Intel and long January 2025 $45 calls on Intel. The Motley Fool has a disclosure policy.

Paid Post: Content produced by Motley Fool. The Globe and Mail was not involved, and material was not reviewed prior to publication.