"Nvidia would love to have 100% of it, but customers would not love for Nvidia to have 100% of it," said Sid Sheth, co-founder of aspiring rival D-Matrix. "It’s just too big of an opportunity. It would be too unhealthy if any one company took all of it."
Founded in 2019, D-Matrix plans to release a semiconductor card for servers later this year that aims to reduce the cost and latency of running AI models. The company raised $110 million in September.
In addition to D-Matrix, companies ranging from multinational corporations to nascent startups are fighting for a slice of the AI chip market that could reach $400 billion in annual sales in the next five years, according to market analysts and AMD. Nvidia has generated about $80 billion in revenue over the past four quarters, and Bank of America estimates the company sold $34.5 billion in AI chips last year.
Many companies taking on Nvidia’s GPUs are betting that a different architecture or certain trade-offs could produce a better chip for particular tasks. Device makers are also developing technology that could end up doing a lot of the computing for AI that’s currently taking place in large GPU-based clusters in the cloud.
"Nobody can deny that today Nvidia is the hardware you want to train and run AI models," Fernando Vidal, co-founder of 3Fourteen Research, told CNBC. "But there’s been incremental progress in leveling the playing field, from hyperscalers working on their own chips to even little startups designing their own silicon."