
Meta and Broadcom Partner to Build World's First 2nm AI Chips for 1GW Data Centers.

Meta and Broadcom Ink Major Deal to Develop 2nm AI Chips: A New Era for MTIA

Meta has announced a significant expansion of its partnership with Broadcom to co-develop the next generations of the Meta Training and Inference Accelerator (MTIA). This strategic collaboration is set to run through at least 2029 and will extend beyond inference accelerators to include various custom silicon solutions for Meta’s global data center infrastructure.

The Race to 2nm Technology

According to Broadcom, the upcoming MTIA iteration is poised to become the world’s first AI chip manufactured using cutting-edge 2-nanometer (2nm) process technology. This move signals Meta's ambition to decrease its reliance on third-party GPU providers like NVIDIA by developing highly efficient, specialized hardware tailored for its massive social media and Metaverse workloads.

Massive Scale Infrastructure

The partnership also involves a colossal infrastructure commitment. The initial computing power within the collaborative data centers is set at 1 gigawatt (GW), with plans for further expansion. To put this in perspective, 1 GW is enough to power approximately 750,000 homes, highlighting the immense energy and processing requirements of next-generation AI.
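As a rough sanity check of the article's comparison, the 750,000-home figure implies an average continuous household draw of roughly 1.3 kW, which is in line with common US estimates (that per-home figure is an assumption, not from the article):

```python
# Sanity check: how many homes could 1 GW supply, assuming an average
# continuous household load of ~1.33 kW (a typical US estimate; assumed here)?
data_center_watts = 1e9       # 1 gigawatt of data center capacity
avg_home_watts = 1.33e3       # assumed average household draw in watts

homes_powered = data_center_watts / avg_home_watts
print(f"{homes_powered:,.0f} homes")  # ≈ 750,000, matching the article's figure
```

The result lands within a few percent of the 750,000 homes cited, so the comparison holds under that assumed household load.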

Strategic Leadership Shift

In a notable governance change, Hock Tan, the CEO of Broadcom who has served on Meta’s Board of Directors, will step down from his board seat. However, he will remain closely tied to the company, transitioning into a Strategic Advisor role for Meta’s silicon development. This shift allows Tan to focus more directly on the technical and strategic execution of the chip partnership while avoiding potential conflicts of interest.

Meta is following in the footsteps of Google (TPU) and Amazon (Trainium/Inferentia). Partnering with Broadcom, a leading designer of ASICs (Application-Specific Integrated Circuits), should allow Meta to build chips that are far more energy-efficient than general-purpose GPUs, a key requirement for running future models such as Llama 4 or Llama 5.

The 1-gigawatt commitment is no small feat. The central bottleneck in AI is no longer just the chip but power itself. By planning infrastructure at this scale, Meta is preparing for highly complex Artificial General Intelligence (AGI) systems that demand enormous amounts of energy.

Bringing Hock Tan in as a direct advisor signals Meta's push for vertical integration, tying its software stack (PyTorch) directly to its hardware (MTIA) to maximize processing speed and reduce long-term costs.

The gamble on 2-nanometer technology (expected to be manufactured by TSMC or Samsung) is ultimately a test of performance per watt. If Meta succeeds, MTIA could become the most powerful and energy-efficient inference chip in the world.

 


 

Source: Meta 


