AMD focuses on significantly improving AI workload capabilities with MI300X

2023-11-29

AMD predicts that the MI300X will generate $2 billion in revenue in 2024. The processor will be officially unveiled at the AMD Advancing AI event on December 6.

"We have seen tremendous demand for the MI300X," said Mark Papermaster, Chief Technology Officer at AMD, pointing to companies like Lamini, Moreh, and Databricks that have announced successes training their generative AI models on the MI250X. "In fact, CEO Lisa Su has already said that the MI300X will be the fastest-ramping product in AMD's history," Papermaster added.

Papermaster attributed another part of AMD's hardware success to its software stack, ROCm (Radeon Open Compute), which he said has reached production-level maturity for high-performance computing. He added that the next version, ROCm 6.0, will be released soon and will be production-ready for AI workloads.

"We are bringing a leadership product"

Microsoft announced at Ignite 2023 that it will use the MI300X to handle its AI workloads. Interestingly, at the same conference, Microsoft also announced that it will use the NVIDIA GH200 superchip for the same purpose. "Competition is a good thing. It brings innovation," said Papermaster.

"We are not only bringing competition, but also a leadership product for these applications," he added. "The upcoming GPU will be recognized for its leadership in training and inference, solidifying AMD's position in the AI hardware field."

"The MI300X is a highly performant GPU that can scale to very large cluster sizes," he said. Papermaster also emphasized how AI is evolving, noting that the company is focused on edge computing as well, including smaller form factors for laptops.

AMD also highlighted the growing adoption of Ryzen AI, the first dedicated AI accelerator available on an x86 processor. More than 50 systems now ship with Ryzen 7000 series processors featuring Ryzen AI, and millions of AMD AI PCs are already on the market.

"Historically, if you look back, most AI inference has run on CPUs, and today we still fully support inference on AMD CPUs. But generative AI applications in particular demand far more compute, so both training and inference need acceleration. That's where we bring competition with our AMD Instinct GPU roadmap," he said.

"What I want to emphasize is that AMD has a broader strategy for AI than just GPUs. That's where we differ from NVIDIA," said Gilles Garcia, Senior Director of Business Operations at AMD's Data Center Communications Group.

AMD believes that many AI workloads can be handled by CPUs alone. "CPUs are best suited to most current edge processing needs, given constraints such as thermal limits, cost-effectiveness, and a small footprint at the edge," said Garcia.