Microsoft announces the launch of new Maia 100 and Cobalt 100 chips.

2023-11-17

Microsoft to Launch Two New Chips Next Year

At the Microsoft Ignite conference on November 15th, Microsoft announced that it will release two new chips next year. The Microsoft Azure Maia 100 is designed for AI workloads, while the Microsoft Cobalt 100 CPU is designed for general-purpose computing workloads on the Microsoft cloud.

Customized Chips Based on Internal Specifications

Both the Maia 100 and the Cobalt 100 are built in-house by Microsoft. The company says this allows customization at every stage, from the silicon, software, and servers to the racks and cooling systems, tailored to the workloads it expects customers to run.

The Microsoft Azure Maia 100 AI Accelerator is optimized for AI tasks and generative AI (Figure 1). Microsoft has shared the Maia 100 design with OpenAI to ensure it is optimized for large language model workloads.

Figure 1: Microsoft Azure Maia 100 AI Accelerator.

The Microsoft Cobalt 100 CPU is an Arm-based processor designed for the Microsoft cloud (Figure 2).

Figure 2: Microsoft Cobalt 100 CPU.

"Microsoft is building the infrastructure to support AI innovation. We are reimagining every aspect of our data centers to meet the needs of our customers," said Scott Guthrie, Executive Vice President of Microsoft's Cloud + AI Group. "Optimizing and integrating every layer of the infrastructure stack is crucial for us at our scale to maximize performance, diversify our supply chain, and provide infrastructure choices for our customers."

Sidekick Server Racks with Liquid Cooling

To make room for the Maia 100 AI Accelerator in its data centers, Microsoft has developed custom server racks. These "sidekick" racks are wider than Microsoft's typical server formats and sit next to the Maia 100 racks. Cooling liquid flows from the sidekick racks to the Maia 100 racks and back, keeping the chips cool. The custom racks can also be used with silicon from industry partners.

New Chips Designed for Cloud Workloads

Microsoft expects customers to use the new chips for AI and cloud computing, including running Microsoft Copilot or Azure OpenAI Service. The Maia 100 and Cobalt 100 chips are made for the custom racks inside Microsoft data centers. Microsoft has been steadily working toward producing more and more of the components for its cloud itself; silicon is the final piece of the puzzle. "We can see the entire stack, and silicon is just one component," said Rani Borkar, Corporate Vice President of Azure Hardware Systems and Infrastructure (AHSI).

Microsoft's Relationship with Silicon Competitors

Building its own chips reduces Microsoft's reliance on competitors to run large AI workloads. In particular, the Maia chip may compete with NVIDIA's AI-focused GPUs. AMD, Arm, AWS, Intel, Meta, Google, SambaNova, and Qualcomm also produce chips designed for AI workloads.

Borkar told The Verge that she does not see the AI chip field as a competition; rather, Microsoft's chips can be "complementary" to its partner relationships, including with other companies in the AI chip field. "Everything we build, whether it's infrastructure, software, or firmware, we can leverage, whether it's deploying our chips or chips from our industry partners," said Pat Stemen, Partner Program Manager on the AHSI team. "It's about the choices customers can make, and we're trying to provide them with the best set of options, whether it's performance, cost, or other aspects they care about."

Microsoft does not plan to replace any existing hardware from AMD, Intel, or NVIDIA. Instead, the company aims to give customers more choice by adding its own silicon options.
Microsoft plans to produce second-generation versions of the Maia and Cobalt chips, though it has not said when.