Elon Musk Says xAI’s ‘Colossus 1’ Supercluster With 230K GPUs, Including 30K GB200s, Is Operational for Training Grok; Plans To Scale AI Compute Further With ‘Colossus 2’ Over Next 5 Years

Elon Musk announced that xAI’s ‘Colossus 1’ supercluster, comprising 230,000 GPUs including 30,000 GB200s, is operational for training its AI model Grok. He also revealed plans to more than double that compute with ‘Colossus 2’, whose first batch of 550,000 GB200s and GB300s begins coming online in a few weeks, toward a five-year goal of 50 million H100-equivalent units of AI compute.

xAI Logo (Photo Credits: X/@xAI)

Elon Musk announced that a massive 230K GPUs, including 30K GB200s, are operational for training Grok at xAI. The tech billionaire said the GPUs run in a single supercluster called "Colossus 1", with inference handled by cloud providers. Musk said, "At Colossus 2, the first batch of 550k GB200s & GB300s, also for training, start going online in a few weeks. As Jensen Huang has stated, @xAI is unmatched in speed. It’s not even close". Elon Musk also shared xAI's goal: 50 million units of H100-equivalent AI compute (but with much better power efficiency) online within five years.

Elon Musk Announces Massive Plan to Introduce 'Colossus 2' With 550K NVIDIA GB200s

xAI's Goal To Reach 50 Million H100 Equivalents: Elon Musk

(SocialLY brings you all the latest breaking news, fact checks and information from the social media world, including Twitter (X), Instagram and YouTube. The above post contains publicly available embedded media, directly from the user's social media account, and the views appearing in the social media post do not reflect the opinions of LatestLY.)
