Microsoft has deployed its first homegrown AI chips in one of its data centers but will continue purchasing chips from Nvidia and AMD, CEO Satya Nadella confirmed this week. The move underscores the company’s strategy to combine in-house innovation with strong partnerships in the rapidly evolving AI hardware market.
The new chip, named Maia 200, is designed as an “AI inference powerhouse,” optimised for running compute-intensive AI models in production. Microsoft claims Maia 200 surpasses Amazon’s latest Trainium chips and Google’s newest Tensor Processing Units in processing speed.
Vertical Integration Doesn’t Mean Exclusivity
Despite developing its own advanced hardware, Nadella emphasised that Microsoft would continue sourcing AI chips from other leading vendors. “We have a great partnership with Nvidia, with AMD. They are innovating. We are innovating,” he said. “Because we can vertically integrate doesn’t mean we just only vertically integrate.”
The company’s approach highlights the challenges in securing high-performance AI chips, as supply shortages continue to affect cloud providers globally. Many cloud giants have begun designing proprietary chips to reduce reliance on third-party suppliers, but Nadella stressed that collaboration remains essential to maintain cutting-edge capabilities.
Maia 200 to Power Microsoft’s AI Frontier Models
Maia 200 will primarily support Microsoft’s Superintelligence team, led by Mustafa Suleyman, a co-founder of DeepMind, which is developing the company’s next-generation AI models. This initiative could eventually reduce Microsoft’s dependence on external AI model providers such as OpenAI and Anthropic.
The chip will also be used to run OpenAI’s models on Microsoft’s Azure cloud platform, offering high-performance access to paying customers. Suleyman shared the milestone on social media, noting that his team would be the first to use Maia 200 in developing frontier AI models.
Microsoft’s launch signals the growing trend of cloud providers designing custom AI hardware while maintaining partnerships with traditional chip makers to address global demand and supply constraints. The company plans to roll out more Maia 200 chips across additional data centers in the coming months.