What and how many processors are used to power ChatGPT?

For over 25 years, Nvidia has spearheaded advancements in computer graphics, earning a special place in the hearts of gamers worldwide. Dominating the graphics processing unit (GPU) market since the introduction of the GeForce 256 in 1999, Nvidia generated over $9 billion in revenue from gaming last year despite a recent downturn.

However, Nvidia's latest earnings success highlights a shift in the GPU industry. Artificial intelligence (AI) is now driving the GPU boom. Nvidia's CEO, Jensen Huang, said in a CNBC interview last month that the company recognized early on the potential of AI to transform software development. As a result, it reoriented the entire organization around AI, designing every chip to support it.

Nvidia's early investment in AI is finally paying off, as its technology powers large language models (LLMs) like ChatGPT. This has helped offset challenges facing the broader semiconductor industry, such as U.S.-China trade tensions and global chip shortages.

Nevertheless, Nvidia is not immune to geopolitical issues. In October, the U.S. introduced new regulations prohibiting the export of cutting-edge AI chips to China. With China accounting for roughly one-quarter of Nvidia's revenue, including sales of the popular A100 AI chip, the company had to adapt quickly. Huang said Nvidia has been able to keep serving its Chinese customers with components that comply with the new rules.

The annual GTC developer conference, hosted by Nvidia from March 20-23, will focus heavily on AI. In an interview with CNBC at Nvidia's headquarters in Santa Clara, California, Huang discussed the company's central role in the burgeoning generative AI field. He attributed Nvidia's success to a combination of luck and foresight, primarily in the area of accelerated computing.

GPUs are the core of Nvidia's business, contributing over 80% of its revenue. Often sold as cards that plug into a PC's motherboard, they add computing power alongside the central processing units (CPUs) made by companies like AMD and Intel.

Today, tech firms building rivals to ChatGPT are touting how many of Nvidia's A100 chips, which cost around $10,000 each, they have on hand. Microsoft revealed that the supercomputer it built for OpenAI used 10,000 A100 GPUs.
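For a rough sense of the scale those figures imply, here is a back-of-the-envelope calculation in Python using only the numbers quoted above; the roughly $10,000 per A100 price and the 10,000-GPU count come from the article, and the total is an illustrative estimate, not a reported cost.

    # Back-of-the-envelope estimate using the figures quoted above.
    # Assumes ~$10,000 per A100 and 10,000 GPUs; actual pricing varies.
    price_per_a100_usd = 10_000
    gpu_count = 10_000

    total_usd = price_per_a100_usd * gpu_count
    print(f"Estimated A100 hardware cost: ${total_usd:,}")  # -> $100,000,000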
