Nvidia’s stock surge ranks the company among the world’s largest


Nvidia, the well-known chip maker, has emerged as one of the early winners of the artificial intelligence boom, and the recent rise in its share price has taken the company to the next level.

On 30 May, the company’s market capitalization passed the remarkable milestone of $1 trillion. 

This achievement is all the more significant given that Nvidia is a fabless company: it designs chips but has no manufacturing capacity of its own.

As of 1 June, the company’s market capitalization was around $970 billion, placing Nvidia in an elite club shared by only five other companies: Apple, Microsoft, Alphabet, Amazon and Saudi Aramco. 

Previously, only three other companies, Tesla, Meta and PetroChina, had crossed the $1 trillion threshold.

Nvidia stock boom since the beginning of 2023

Since the beginning of the year, Nvidia’s stock price has seen a staggering increase of about 170%, outpacing other members of the S&P 500 index. 

This growth is directly related to the growing awareness and use of AI tools, highlighting the potential impact on businesses and consumers.

Nvidia’s success is linked to its dominant position in the graphics processing unit (GPU) market. 

The ChatGPT language model, a fixture of boardroom and casual conversation alike since its release in November 2022, had already reached 100 million users as of January 2023.

ChatGPT, built on a huge language model comprising 175 billion parameters, has been trained using about 10,000 Nvidia A100 GPUs. 
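To put those figures in perspective, a rough training-compute estimate can be sketched with the common ~6 × parameters × tokens FLOPs heuristic. The token count and GPU utilization below are illustrative assumptions, not figures reported by Nvidia or OpenAI; the parameter and GPU counts come from the article.

```python
# Back-of-envelope training-compute estimate for a 175-billion-parameter model,
# using the ~6 * parameters * tokens FLOPs rule of thumb.
params = 175e9          # model parameters (cited in the article)
tokens = 300e9          # ASSUMED number of training tokens
total_flops = 6 * params * tokens          # ~3.15e23 FLOPs

gpus = 10_000           # A100 count cited in the article
peak_flops = 312e12     # A100 peak BF16 throughput in FLOP/s (spec-sheet figure)
utilization = 0.3       # ASSUMED sustained fraction of peak

seconds = total_flops / (gpus * peak_flops * utilization)
print(f"~{total_flops:.2e} FLOPs, roughly {seconds / 86400:.1f} days on the cluster")
```

Under these assumptions the cluster would need on the order of days of continuous compute, which illustrates why training frontier models is only practical on large GPU fleets.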

Nvidia currently holds about 80% of the global GPU market. Thanks to their parallel processing capabilities, GPUs are used not only for AI workloads but also for data mining and cryptocurrency mining. 

Market research firm IDTechEx recently released a report predicting Nvidia’s continued dominance, not only in the GPU arena but especially as a leader in AI hardware. 

The report predicts that Nvidia will capture a significant portion of the estimated $257 billion in AI chip revenue by 2033.

Nvidia’s revenue comes mainly from the data center and networking market segment, which includes data center platforms, autonomous vehicle solutions, and cryptocurrency mining processors. 

Nvidia revenues in 2023

In the fiscal year 2023, Nvidia generated as much as $15.01 billion in revenue from data centers, accounting for 55.6% of the company’s total revenue for that year. 

That represents a substantial 41% increase in data center revenue over 2022 and continues a streak of more than 40% year-on-year growth since 2020. 
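The quoted figures can be cross-checked with simple arithmetic; the implied totals below are derived from the article's numbers, not separately reported values.

```python
# Cross-checking the revenue figures quoted above.
data_center_fy23 = 15.01e9        # USD, data center revenue from the article
share_of_total = 0.556            # 55.6% of total revenue, from the article

# Implied total FY2023 revenue: 15.01B / 0.556 ~= $27.0B
total_fy23 = data_center_fy23 / share_of_total

growth = 0.41                     # 41% year-on-year growth, from the article
# Implied FY2022 data center revenue: 15.01B / 1.41 ~= $10.6B
data_center_fy22 = data_center_fy23 / (1 + growth)

print(f"Implied total FY2023 revenue: ${total_fy23 / 1e9:.1f}B")
print(f"Implied FY2022 data center revenue: ${data_center_fy22 / 1e9:.1f}B")
```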

In contrast, other AI chip designers such as AMD, which recently acquired Xilinx, as well as Qualcomm are lagging behind Nvidia in the data center AI space.

Despite its current performance, Nvidia continues to push forward with the H100 GPU, based on its new Hopper architecture.

The Hopper architecture, produced with TSMC’s 4N process (an improved version of the 5nm node), features 80 billion transistors, surpassing the 54.2 billion transistors of the A100 produced with a 7nm process. 

With speed improvements ranging from 7 to 30 times for training and inference over the A100, along with comparable thermal design power in the PCIe form factor, Nvidia is poised to provide the essential hardware required to support increasingly complex AI algorithms in the future.
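The generational jump can be summarized numerically. The transistor counts and process nodes below are those cited above; the speed-up range is Nvidia's claimed 7-30x figure.

```python
# Quick comparison of the two GPU generations discussed above.
a100 = {"transistors": 54.2e9, "process": "7nm"}
h100 = {"transistors": 80e9, "process": "TSMC 4N (refined 5nm)"}

ratio = h100["transistors"] / a100["transistors"]
print(f"H100 has {ratio:.2f}x the transistors of the A100 "
      f"({h100['process']} vs {a100['process']})")
```

A ~1.5x increase in transistor budget alone does not explain a 7-30x speed-up; most of the claimed gain comes from architectural changes such as new tensor-core datapaths, so the figure should be read as workload-dependent.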

While Nvidia continues to dominate the data center computing market, there are ample opportunities for chip designers in the edge computing space. 

According to IDTechEx’s latest report on AI chips, the edge market is expected to experience a higher compound annual growth rate than the cloud AI market over the next decade. Edge AI poses different requirements, particularly with regard to power consumption due to the thermal limits of embedded devices. 

Since edge chips typically consume only a few watts, the complexity of the models they run must be greatly simplified. 

As a result, state-of-the-art chips such as the A100, with its large footprint and high transistor density, would not be suitable for edge applications. 

Instead, companies may choose to design chips on more mature process nodes, which offer a lower price point and a lower barrier to entry than leading-edge nodes.

The exponential growth of artificial intelligence

It is difficult to say exactly when, or how far in the future, artificial intelligence will reach its inflection point. 

However, there is no denying the ongoing AI boom and the transformative potential of AI tools in various industries. Nvidia’s success testifies to the growing importance of AI in shaping the technological landscape.

For those interested in learning more about the global AI chip market, including technology developments, key players, and market prospects for AI-capable hardware, IDTechEx’s “AI Chips 2023-2033” report offers valuable insights.

The fact that Nvidia’s market capitalization has crossed the $1 trillion mark signifies not only the company’s triumph, but also the immense potential of AI technology. 

With its dominant position in the GPU market and a strong position in the data center AI space, Nvidia is poised to lead the future of AI hardware. 

The company’s continued innovation, exemplified by the H100 GPU based on the Hopper architecture, further solidifies its position as an industry leader.

However, Nvidia’s success does not reduce the opportunities available to other chip designers. The burgeoning edge computing market has significant growth potential, with AI applications requiring custom chips optimized for power consumption and embedded devices. 

As demand for AI increases in edge computing, chip designers can explore manufacturing at more mature nodes, enabling cost-effective solutions without sacrificing performance.