Meta partners with Broadcom to co-develop multiple generations of MTIA AI chips with 1GW initial deployment
Meta and Broadcom announced a strategic partnership to co-develop multiple generations of Meta's MTIA (Meta Training and Inference Accelerator) AI chips, built on Broadcom's custom AI accelerator (XPU) platform. The collaboration spans chip design, advanced packaging, and networking integration with Broadcom's Ethernet technologies to support high-bandwidth networking across Meta's AI compute clusters. The initial deployment phase targets over 1GW of MTIA chips, with plans for a multi-gigawatt rollout over time.

Meta also said it unveiled the next four generations of MTIA chips (MTIA 300, 400, 450, and 500) in March 2026, designed for generative AI workloads with improvements in compute, memory bandwidth, and efficiency. Broadcom CEO Hock Tan will transition from Meta's board to an advisory role, providing guidance on Meta's custom silicon roadmap and future infrastructure investments.