“I think this is an opportunity for us to write the next chapter of the AMD growth story,” Su told Fortune in a mid-September interview. “There are so few companies in the world that have access to the [intellectual property] that we have and the customer set that we have, and the opportunity frankly to really shape how AI is adopted across the world. I feel like we have that opportunity.”
“There will certainly be scenarios in 2024 when we would imagine Nvidia GPUs are sold out and customers only have access to AMD, and AMD can win some business that way, just based on availability,” says Morningstar tech sector director Brian Colello.
Gregory Diamos, cofounder of the AI startup Lamini and a former CUDA architect at Nvidia, says he believes AMD is closing the gap. “AMD has been putting hundreds of engineers behind their general-purpose AI initiative,” he says.
The forthcoming MI300-series data center chip combines a CPU with a GPU. “We actually think we will be the industry leader for inference solutions, because of some of the choices that we’ve made in our architecture,” says Su.
But many analysts believe the bigger part of the AI market lies not in training LLMs, but in deploying them: setting up systems to answer the billions of queries that are expected as AI becomes part of everyday existence. This is known as “inference” (because it involves the AI model using its training to infer things about the fresh data it is presented), and whether GPUs remain the go-to chips for inference is an open question.
“It has become abundantly clear, certainly with the adoption of generative AI in the last year, that this [industry] has the space to grow at an incredibly fast pace,” Su said. “We’re looking at 50% compound annual growth rate for the next five-plus years, and there are few markets that do that at this size when you’re talking about tens of billions of dollars.”