The rise of OpenAI’s ChatGPT has sparked global conversations about the future of artificial intelligence (AI) and its impact on society. But discourse isn’t the only trend the revolutionary large language model (LLM) has set in motion. Demand for AI chips is soaring thanks to the surge of attention on artificial intelligence technologies.
Currently, Nvidia controls the majority of the AI chip sector—as much as 75% according to many analysts. This dominance is the result of decades of preparation, research, and experience. Now, though, other players in the chip industry are looking to catch up and grab their own share of the booming AI semiconductor market. Analysts predict the total AI chip market will reach a staggering $60 billion as soon as 2027.
Although competition is heating up, it remains unknown whether anyone can come close to catching Nvidia. If there’s one ray of hope, it’s the growing demand for low-power chips used in AI applications that don’t need the most powerful GPUs on the market.
Though AI generally requires powerful chips, not all artificial intelligence algorithms are created equal. They don’t all require top-of-the-line silicon to function effectively. While applications like ChatGPT and Google’s Bard certainly need as much processing power as they can get, companies around the world are using AI in more ways than ever before.
Many of these applications don’t need a high-end GPU. Thanks to neural processing unit (NPU) technology, many AI applications can be run on more affordable chips. This creates a balance between performance and cost-effectiveness that’s difficult to ignore.
According to industry experts, the performance requirements of edge AI chips currently resemble those of mid- to high-end smartphones. So, while other manufacturers can’t match the designs or processes Nvidia uses to produce its AI chips, that may not be necessary moving forward.
Indeed, we’ve already seen products rolling off the lines. STMicroelectronics recently unveiled its first microcontroller with built-in NPUs for edge AI-based systems. Despite closing its chip design unit earlier this month, Chinese smartphone maker Oppo also dabbled in making its own NPU-enabled chips using 6nm tech from TSMC.
So as demand for edge AI chips continues to grow, innovations like these will allow smaller manufacturers to not only compete but thrive. The many use cases of edge AI mean there is more than enough demand to go around. Unsurprisingly, industry experts are watching major design houses MediaTek, Novatek, and Realtek, among others, ramp up their AI chip efforts. These developments will be crucial to follow in the coming months.
While NPUs and low-power AI applications will allow other players to compete in the space, there’s no denying the dominance of Nvidia. The company’s H100 chip is one of the most sought-after in the market, currently fetching prices of $45,000 per unit on the secondary market.
With so little time to respond, few manufacturers are equipped to meet the current demand. Nvidia, however, has been planning for years to be the “picks and shovels seller” of the AI gold rush.
A recent Bank of America report discussing the company’s future reads, “We remind investors that success in AI requires full-stack computing and scale/experience across silicon, software, application libraries, developers, plus enterprise and public cloud incumbency.”
The report goes on to call Nvidia’s process a “10+ year well-honed turnkey model” and describes its competitors’ offerings as “piecemeal silicon-only solutions.”
Ultimately, the AI market is Nvidia’s to lose. If the giant can meet demand—not only in top-end chips but also in affordable midrange silicon—it’s hard to see any others catching up, though the likes of AMD and Intel are certainly trying. Considering the scale and potential growth of the AI sector, producing the necessary chips will be a top priority across the industry in the coming years.