Microsoft's Latest AI Chip Competes Directly with Amazon and Google
Microsoft is rolling out its new Maia 200 AI chip to its data centers, directly challenging the AI chip offerings from competitors like Amazon and Google.
The Maia 200 chip is starting to roll out to Microsoft’s data centers today.
By Tom Warren
Microsoft is announcing a successor to its first in-house AI chip today, the Maia 200. Built on TSMC’s 3nm process, Microsoft says its Maia 200 AI accelerator “delivers 3 times the FP4 performance of the third generation Amazon Trainium, and FP8 performance above Google’s seventh generation TPU.”
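FP4 and FP8 here refer to 4- and 8-bit floating-point formats used for low-precision AI inference; lower precision means more operations per second and per watt at some accuracy cost. As an illustration only (not Microsoft's implementation), here is a minimal NumPy sketch of rounding weights to the 16 representable FP4 (E2M1) values defined in the OCP Microscaling spec:

```python
import numpy as np

# The positive FP4 (E2M1) values per the OCP Microscaling (MX) spec;
# mirroring them gives all 16 representable bit patterns.
FP4_POS = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0])
FP4_VALUES = np.concatenate([-FP4_POS[::-1], FP4_POS])

def quantize_fp4(x: np.ndarray) -> np.ndarray:
    """Round each element to the nearest representable FP4 value."""
    idx = np.abs(x[..., None] - FP4_VALUES).argmin(axis=-1)
    return FP4_VALUES[idx]

weights = np.array([0.7, -1.2, 2.6, 5.1])
q = quantize_fp4(weights)  # rounds to [0.5, -1.0, 3.0, 6.0]
```

With only 16 values, FP4 is viable for large models mainly because weights are quantized in small blocks with a shared scale factor, which is why hardware FP4 throughput (as in the Trainium comparison above) has become a headline benchmark.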
Each Maia 200 chip contains more than 100 billion transistors and is designed for large-scale AI workloads. “Maia 200 can effortlessly run today’s largest models, with plenty of headroom for even bigger models in the future,” says Scott Guthrie, executive vice president of Microsoft’s Cloud and AI division.
Microsoft will use Maia 200 to host OpenAI’s GPT-5.2 model and others for Microsoft Foundry and Microsoft 365 Copilot. “Maia 200 is also the most efficient inference system Microsoft has ever deployed, with 30 percent better performance per dollar than the latest generation hardware in our fleet today,” says Guthrie.
Microsoft’s willingness to flex its performance numbers at its close Big Tech competitors is a departure from 2023, when it launched the first Maia 100 and avoided direct comparisons with Amazon’s and Google’s AI cloud capabilities. Both Google and Amazon are working on next-generation AI chips, though. Amazon is even working with Nvidia to integrate its upcoming Trainium4 chip with NVLink 6 and Nvidia’s MGX rack architecture.
Microsoft’s Superintelligence team will be the first to use its Maia 200 chips, and the company is also inviting academics, developers, AI labs, and open-source model project contributors to an early preview of the Maia 200 software development kit. Microsoft is starting to deploy these new chips today in its Azure US Central data center region, with additional regions to follow.
© 2026 Vox Media, LLC. All Rights Reserved