
NVIDIA JOINS THE TRILLION DOLLAR CLUB: THE UNSEEN POWERHOUSE DRIVING THE AI REVOLUTION

Only four corporations worldwide are valued at more than $2 trillion: Apple, Microsoft, the oil giant Saudi Aramco and, as of 2024, Nvidia. It’s understandable if the name doesn’t seem familiar. The company does not produce a dazzling product that sits in your hand all day, every day, as Apple does. Nvidia makes a chip concealed deep within the complex internals of a computer, a specialised product that more and more people rely on every day. You’d be hard-pressed to find an investor who does not regard Nvidia as the apex of artificial intelligence (AI) businesses. Its graphics processing units (GPUs), which crunch AI workloads, are far and away the best in class and have been employed by nearly every AI-focused enterprise.

Rewind the clock to 2019, when Nvidia’s market capitalisation was just over $100 billion. The AI mania has since propelled the company on an astounding speedrun to roughly 20 times that already impressive size, making Nvidia arguably the biggest winner of the artificial intelligence boom. OpenAI, the company that launched this fixation, is presently valued at roughly $85 billion, while Grand View Research estimated the global AI market at less than $200 billion in 2023; both figures are only a fraction of Nvidia’s worth. With all eyes on the company’s incredible progress, the real question is whether Nvidia can maintain its lofty position. But first, let’s look at how it got here.

FROM VIDEO GAMES TO AI

In 1993, long before creepy AI-generated art and humorous AI chatbot conversations dominated our social media feeds, three Silicon Valley electrical engineers, Jensen Huang, Chris Malachowsky, and Curtis Priem, founded a company focused on an exciting, rapidly developing niche of personal computing: video games.

Nvidia was created to produce a particular type of chip known as a graphics card, also called a GPU (graphics processing unit), which enables the display of sophisticated 3D images on computer screens. The more powerful the graphics card, the faster high-quality visuals can be rendered, which is critical for gaming and video editing. In the prospectus released before its initial public offering in 1999, Nvidia said its future success would depend on the continued growth of computer applications that use 3D graphics. For the majority of Nvidia’s life, gaming visuals were its primary business.

Nvidia became a powerhouse selling graphics cards for video games, an entertainment industry that generated over $180 billion in sales last year, but decided it would be wise to diversify beyond cards for gaming alone. Not all of its experiments worked out. Over a decade ago, Nvidia tried and failed to establish itself as a key player in the mobile chip business: to this day, Android phones employ a variety of non-Nvidia processors, while iPhones use Apple-designed ones.

Another gamble, however, paid off, and it is the reason we’re all talking about Nvidia today. In 2006, the company introduced CUDA, a parallel computing platform and programming model that, in essence, unlocked its graphics cards for general-purpose computing. Its processors could now do heavy lifting for tasks other than rendering gorgeous gaming visuals, and it turned out that graphics cards, which run thousands of simple calculations in parallel, could handle such workloads far better than the CPU (central processing unit), the computer’s primary “brain”. This made Nvidia’s GPUs ideal for computationally intensive operations such as machine learning (and cryptocurrency mining). 2006 was also the year Amazon established its cloud computing business; Nvidia’s push into general computing coincided with the rise of massive data centres.
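For a flavour of what this unlocking looks like in practice, below is a minimal CUDA C sketch (illustrative only; the file name, array sizes, and values are assumptions, not anything from Nvidia’s materials) that adds two million-element arrays, with each GPU thread handling a single element, the same massively parallel pattern that machine learning workloads exploit.

// vector_add.cu: minimal CUDA sketch. Compile with: nvcc vector_add.cu -o vector_add
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Kernel: each GPU thread adds one pair of elements.
__global__ void vectorAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements
    const size_t bytes = n * sizeof(float);

    // Host (CPU) buffers.
    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

    // Device (GPU) buffers, with the inputs copied across.
    float *da, *db, *dc;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover all n elements at once.
    const int threadsPerBlock = 256;
    const int blocks = (n + threadsPerBlock - 1) / threadsPerBlock;
    vectorAdd<<<blocks, threadsPerBlock>>>(da, db, dc, n);

    // Copy the result back and spot-check one value.
    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %.1f (expected 3.0)\n", hc[0]);

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

A CPU would loop over those elements a few at a time; the kernel launch above spreads them across thousands of GPU threads at once, which is why the same silicon that renders game frames also excels at crunching neural networks.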

Nvidia’s current dominance is particularly noteworthy because, for most of Silicon Valley’s existence, there was already a chip-making behemoth: Intel. Intel designs CPUs, GPUs, and other products, and manufactures its own semiconductors, but after a series of blunders, including failing to invest promptly in AI processors, the rival chipmaker’s dominance has waned. In 2019, when Nvidia’s market worth was just over $100 billion, Intel’s value was double that; now, Nvidia has entered the ranks of corporate heavyweights known as the “Magnificent Seven”, a group of tech firms whose combined value exceeds the entire stock market of several wealthy G20 countries.

Today, Nvidia’s four primary businesses are gaming, professional visualisation (such as 3D modelling), data centres, and the automotive sector, where it produces processors for self-driving cars. A few years ago, the gaming segment still accounted for the majority of sales, at over $5.5 billion, compared with approximately $2.9 billion for the data centre segment. Then the pandemic broke out. People were spending more time at home, and demand for computer hardware, notably GPUs, skyrocketed; the company’s gaming revenue in fiscal year 2021 increased by 41%. There were already clues of the coming AI tsunami, however, as Nvidia’s data centre revenue increased by an even more remarkable 124%. Nvidia’s shares then rose 237% in 2023, adding $858 billion to its market worth, after the firm provided “jaw-dropping” revenue projections for its AI-focused GPUs in both its first-quarter and second-quarter earnings reports.

With the AI boom, however, the firm has become increasingly embedded in the technology that people use daily. Tesla’s self-driving systems use Nvidia chips, as do all the big tech companies’ cloud computing services, which serve as the foundation for many of our everyday online activities, from Netflix streaming to office and productivity applications. To train ChatGPT, OpenAI harnessed tens of thousands of Nvidia AI processors. People underestimate how much AI they use daily because they are unaware of how much of the automation they rely on is driven by it. Popular applications and social media platforms appear to add new AI features daily: TikTok, Instagram, X (previously Twitter), and even Pinterest all offer some AI capability to experiment with. Slack, a popular workplace messaging application, has added the ability to produce AI-generated thread summaries and channel recaps.

NVIDIA CONTINUES TO SELL OUT

Jensen Huang, Nvidia’s co-founder and CEO, has consistently raised the bar. To preserve its dominant position, his company has given customers access to specialist computers, computing services, and other tools for their developing trade; Nvidia has effectively become a one-stop shop for AI development. Competition is building, though: Google, Amazon, Meta, and IBM have all built AI processors of their own. Even so, according to Omdia, Nvidia accounts for more than 70% of AI processor sales and holds an even larger lead in chips for training generative AI models.

The company’s status as the biggest winner of the AI revolution became apparent when it forecast a 64% increase in quarterly revenue, significantly above Wall Street’s expectations. For Nvidia’s customers, the downside of such strong demand is that the company can charge exorbitant prices. The processors used in AI data centres cost tens of thousands of dollars, with top-of-the-line products occasionally selling for more than $40,000 on sites such as Amazon and eBay. Last year, some clients clamouring for Nvidia’s AI processors had to wait up to 11 months.

Think of Nvidia as the “Birkin bag” of AI processors. A comparable chip from rival chipmaker AMD is reportedly being sold to clients such as Microsoft for $10,000 to $15,000, a fraction of Nvidia’s prices. And it is not just AI chips. Nvidia’s gaming business continues to grow, and the price gap between its high-end gaming cards and comparably performing AMD cards has widened. Nvidia recorded a 76% gross margin in its most recent financial quarter, meaning it cost the company only 24 cents to generate each dollar of sales. AMD’s most recent gross margin was just 47%.
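As a quick worked check of that claim (a stylised calculation, not figures from Nvidia’s accounts), gross margin is revenue minus the cost of goods sold, divided by revenue:

\[
\text{gross margin} = \frac{\text{revenue} - \text{cost of goods sold}}{\text{revenue}}, \qquad \frac{\$1.00 - \$0.24}{\$1.00} = 0.76 = 76\%.
\]

So for every dollar of sales, roughly 76 cents is left over after the direct cost of producing the chip.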

HOW DID INTEL FALL BEHIND NVIDIA?

First, some context: Intel and Nvidia only began competing directly in 2022. Previously, there was little overlap in their product portfolios, and when Nvidia attempted to create devices that would compete with Intel’s, Intel hit it with a lawsuit. The main issue was Nvidia’s right to make chipsets compatible with Intel CPUs, and the case turned on complicated technical and legal interpretations of the companies’ licensing agreement. Intel attempted to block Nvidia from producing chipsets for its new CPUs, which Nvidia argued was an attempt to suppress competition in the semiconductor industry.

AI workloads used to be prototyped on the CPU, since almost everything that ends up being processed on a GPU is first evaluated on a CPU, and Intel built several products that combined the best of CPU and GPU: the Xeon Phi cards. To use the earlier “brain” comparison, each Phi core was nearly as bright as a standard CPU core, and a card carried 50-90 of them, allowing it to crunch data in volume. These were most likely the first AI inferencing processors used for image recognition at data centre scale. However, they were not suitable for general LLM usage due to their high power consumption.

In 2022, Intel’s Arc series of GPUs became its first offering to compete fully with Nvidia. It took Intel around 15 years to develop and refine the technology to the point where it could be commercialised, and it still lagged well behind Nvidia, which had been producing GPUs for nearly three decades. Intel did not fall behind in this race so much as it was never in it in the first place. The Arc GPUs do well in AI for their price, although they compete with mid-range Nvidia RTX cards, not the likes of the RTX 4090 or the H100.

Intel, which manufactures its own chips, began experiencing serious challenges with miniaturisation, forcing it to stick with its existing process while the rest of the industry moved to more modern ones. Its 10-nanometre manufacturing process was planned for release in 2017-19, but owing to production problems it had to continue with 14 nanometres. The result was CPUs that required more power and produced more heat for the same performance. In essence, Intel got stuck.

Meanwhile, Nvidia was generating a lot of money selling crypto-mining hardware, a bus that Intel could not get on. A GPU shortage developed, driving prices up further. This was also around the time that GPUs became powerful enough to handle real-world AI tasks, and Nvidia invested money and time in refining its software to build an advantage in artificial intelligence. Nvidia also offered AI data centre products (the Hopper H100) at roughly $30,000 per unit, earning around $25,000 in profit on each. Finally, the perception spread that you need CUDA, an Nvidia-proprietary technology, to do AI at all. Nvidia came to account for 80% of general-purpose GPU (GPGPU) sales for AI, while Intel did not even have a product to market.

NVIDIA AND AI: A MATCH MADE IN HEAVEN?

It is clear that Nvidia invested heavily in wooing the AI business before others took notice, but its market dominance is not unshakeable. An army of competitors, from tiny startups to large corporations, is on the march, including Amazon, Meta, Microsoft, and Google, all of which currently rely on Nvidia processors. Whether Nvidia can hold its $2 trillion valuation, or possibly reach new heights, depends on whether consumer and investor interest in AI can be sustained. Silicon Valley is becoming saturated with freshly minted AI businesses, but what percentage will succeed, and how long will investors continue to pour money into them?

ChatGPT raised AI awareness because it was a simple novelty the general public could get excited about. At present, however, much of the AI sector is focused on AI training rather than AI inference, the use of trained models to solve a problem, such as how ChatGPT responds to a user’s question or how facial recognition technology identifies people. Though the AI inference market is growing (perhaps faster than expected), most of the sector will continue to invest substantial time and money in training, and Nvidia’s first-class GPUs are expected to remain the most popular choice for training for the foreseeable future. As AI inference goes mainstream, however, demand for such high-performance machines may decline, and Nvidia’s supremacy may fade with it.

Despite these potential difficulties, the safe bet is that Nvidia will soon become as well-known a technology business as Apple and Google. AI fever is why Nvidia is in the exclusive club of trillion-dollar corporations — but it could also be argued that AI is so popular because of Nvidia.

Author: Amar Chowdhury
