Microsoft Maia AI Chip Delayed Until 2026, Will Lag Behind Nvidia Blackwell: Report

Microsoft’s custom AI silicon initiatives are facing severe challenges. The company’s next-generation Maia AI chip, internally codenamed Braga, is now expected to enter mass production a year later than previously indicated, which is now set for 2026 instead of 2025, per The Information. This puts Microsoft further behind Nvidia, Google, and Amazon in the AI hardware race.
Here’s a detailed look at what we know so far.
Maia AI Chip Delay: What Happened?
- Initial Plans: Microsoft originally targeted 2025 for mass production of the Braga chip, intending to deploy it across Azure data centers to accelerate AI workloads and reduce its reliance on Nvidia GPUs.
- Current Status: Mass production has now been pushed back to 2026 due to:
  - Unanticipated design changes
  - Staffing challenges and high employee turnover
  - Internal constraints in development teams
Performance Concerns vs Nvidia’s Blackwell
- Microsoft’s Maia chip is reportedly expected to fall short of Nvidia’s Blackwell GPU in performance.
- Nvidia Blackwell, released in late 2024, is designed to handle the most demanding AI training and inference tasks with top-tier efficiency.
- This performance gap could limit Microsoft’s ability to reduce its dependency on Nvidia chips in the near term.
Why Custom AI Chips Matter
Big Tech companies are now focusing on custom-made AI processors to:
- Decrease reliance on Nvidia’s popular and pricey chips
- Optimize hardware for their own AI workloads
- Control cloud infrastructure costs through strategic hardware investments
Here’s how Microsoft compares with its cloud rivals:
| Company | AI Chip Name | Launch Status | Performance Target |
| --- | --- | --- | --- |
| Microsoft | Maia (Braga) | Mass production in 2026 (delayed) | Likely behind Nvidia Blackwell |
| Google | TPU v7 | Unveiled in April 2025 | Designed for large-scale AI workloads |
| Amazon | Trainium3 | Set to launch late 2025 | Focused on cost-efficient AI training |
| Nvidia | Blackwell | Released late 2024 | Industry-leading performance |
Strategic Implications for Microsoft
- Cloud Competition: Google and Amazon's stronger positions in in-house chip development put Microsoft at risk of falling behind in AI-powered cloud services.
- Azure Impact: Delays could hinder Azure's ability to improve returns by deploying its own optimized AI infrastructure while containing operational costs.
- AI Push: Rapid adoption of Microsoft's AI software (e.g., Copilot and OpenAI integrations) keeps driving up demand for compute, making the hardware holdup more costly.
Final Thoughts
Microsoft's struggle to bring the Braga chip to mass production underscores how difficult it is to break into custom silicon while competing with Nvidia, Google, Amazon, and others. The chip is still in development, but the reported performance gap and slipping timeline raise doubts about whether Microsoft can gain control of its AI hardware ecosystem anytime soon. Until it does, the company will remain dependent on Nvidia to keep pace with the rapidly growing demand for compute-intensive AI workloads.