
Microsoft Launches Maia 200 AI Chip for Faster Model Inference

Hacker News

Microsoft has unveiled the Maia 200, a new AI accelerator chip designed to significantly boost inference performance on its cloud and enterprise platforms, featuring 140 billion transistors and advanced memory.



Yahoo Finance


This article first appeared on GuruFocus.

Microsoft (MSFT, Financials) announced the launch of Maia 200, a next-generation AI accelerator chip designed to enhance inference performance across its cloud and enterprise platforms. The chip packs 140 billion transistors and 216GB of HBM3e memory, delivers up to 10 petaFLOPS of compute, and is fabricated on Taiwan Semiconductor Manufacturing Co.'s (TSMC) 3-nanometer process.



Scott Guthrie, the company's Executive Vice President, said that Maia 200 delivers "30% better performance per dollar" than current-generation hardware, making it Microsoft's most cost-efficient AI processor to date. The chip will run OpenAI's GPT-5.2 models, as well as Microsoft 365 Copilot and Foundry applications.
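To put the quoted figure in concrete terms: "30% better performance per dollar" implies the same inference throughput costs roughly 1/1.3 of what it did before. A minimal arithmetic sketch (the baseline value is a normalized assumption, not a figure from the article):

```python
# Illustrative arithmetic only: what "30% better performance per dollar"
# means for the cost of a fixed amount of inference throughput.
baseline_perf_per_dollar = 1.0  # normalized baseline (assumed)
maia_perf_per_dollar = baseline_perf_per_dollar * 1.30  # the quoted +30%

# Cost of buying the same performance, relative to the baseline:
relative_cost = baseline_perf_per_dollar / maia_perf_per_dollar
savings_pct = (1 - relative_cost) * 100

print(f"relative cost: {relative_cost:.3f}")  # → relative cost: 0.769
print(f"cost savings: {savings_pct:.1f}%")    # → cost savings: 23.1%
```

In other words, a 30% gain in performance per dollar translates to roughly a 23% reduction in cost for a fixed inference workload, not a 30% one.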

The first deployments are under way at Microsoft's data centers in Iowa and Arizona, with more planned. Microsoft also released a Maia software development kit, with support for PyTorch and the Triton compiler, to help developers optimize models for the chip.
