Microsoft Creates Custom Chips, Focusing on AI Applications

Microsoft has developed its own AI chip, named Azure Maia, and an Arm-based CPU, Azure Cobalt, to enhance its Azure data centers and meet the growing AI demands of its business clients.

These chips, launching in 2024, are a response to the surging demand for Nvidia’s H100 GPUs, commonly used in AI and image-generation applications. Rani Borkar, head of Azure hardware systems at Microsoft, highlighted the company’s history in silicon development, including past collaborations on Xbox and Surface devices, and said that the new chips build on that experience.

The Azure Maia AI chip and Azure Cobalt CPU are entirely Microsoft’s own designs, focused on improving performance, power, and cost efficiency in cloud servers. Borkar emphasized rethinking cloud infrastructure for AI and optimizing every layer of the stack for better results. The Azure Cobalt CPU, a 128-core Arm processor, is tailored for general cloud services, with careful attention to performance and power management. Microsoft is testing the Cobalt CPU with workloads such as Microsoft Teams and SQL Server, and plans to offer it to customers next year.

Microsoft’s Partnership with OpenAI

The Maia 100 AI accelerator, designed for cloud AI tasks like language model training, is a key component in Microsoft’s partnership with OpenAI. Sam Altman, OpenAI’s CEO, expressed excitement about working with Microsoft on refining Maia. This chip, made using a 5-nanometer process, has fewer transistors than AMD’s MI300X AI GPU but supports advanced data types for faster training and inference times.

Maia is also notable for being Microsoft’s first completely liquid-cooled server processor, enabling higher server density and efficiency. The Maia 100 server rack includes a custom cooling system. Microsoft is currently testing Maia 100 with GPT-3.5 Turbo, the model behind ChatGPT, Bing AI, and GitHub Copilot.

While specifics on Maia’s performance and comparisons with Nvidia’s H100 GPU and AMD’s MI300X have not yet been disclosed, Microsoft is continuing its partnerships with Nvidia and AMD, focusing on optimizing its cloud AI infrastructure and offering customers a diverse range of hardware. Unlike Nvidia, AMD, Intel, and Qualcomm, the company does not plan to sell these chips; it will use them for its own Azure cloud workloads.

Microsoft’s development of these chips suggests a series of future generations, although detailed roadmaps have not been shared. The company’s current focus is on deploying Maia to accelerate its AI initiatives and influence the pricing of AI cloud services. Microsoft has also launched Copilot for Microsoft 365, an AI-powered Office assistant, currently available to its major customers.

Alex Patel

An accomplished editor specializing in AI articles. Born and raised in NYC, USA, he graduated from Columbia University and has worked at tech companies in the USA and Canada.
