Microsoft has introduced the second generation of its in-house artificial intelligence processor, Maia 200, alongside a new suite of software tools aimed at weakening Nvidia’s long-held dominance among AI developers.
The Maia 200 chip went live this week at a Microsoft data centre in Iowa, with a second deployment planned in Arizona, the company said. The launch builds on Microsoft’s first Maia chip, which debuted in 2023 as part of efforts to reduce dependence on external AI hardware suppliers.
The announcement comes as major cloud computing players — including Microsoft, Google and Amazon Web Services — increasingly design custom AI chips that compete directly with Nvidia, whose processors power a large share of today’s generative AI workloads.
Google has made notable progress on this front, drawing interest from large Nvidia customers such as Meta Platforms, which is working closely with Google to narrow software compatibility gaps between their respective AI chip platforms.
Beyond hardware, Microsoft said it will provide developers with a full software stack for Maia 200. A key component is Triton, an open-source programming framework with significant contributions from OpenAI. Triton performs similar functions to Nvidia’s CUDA platform, which many analysts view as Nvidia’s most powerful competitive advantage.
Like Nvidia’s newly announced “Vera Rubin” processors, Maia 200 is manufactured by Taiwan Semiconductor Manufacturing Co using advanced 3-nanometre process technology. The chip also uses high-bandwidth memory, although Microsoft confirmed it relies on an older and slower generation than Nvidia’s upcoming products.
Microsoft has, however, adopted a design strategy used by several emerging AI chipmakers by integrating a substantial amount of SRAM, a fast memory type that can improve performance for chatbots and other AI systems handling large volumes of user requests.
That approach is also used by firms such as Cerebras Systems, which recently signed a $10 billion agreement with OpenAI to supply computing capacity, and Groq, whose technology Nvidia reportedly licensed in a deal valued at around $20 billion.