Elon Musk's xAI built Colossus, "the most powerful AI training system in the world," in just 122 days. Could its 100,000 NVIDIA H100 GPUs make Grok 'the most powerful AI by every metric'?

Elon Musk (Image credit: Tesla)

What you need to know

  • Elon Musk's xAI team launched Colossus, powered by 100,000 NVIDIA H100 GPUs.
  • The company plans to double the system's capacity to 200,000 GPUs with an additional 50,000 NVIDIA H100s and 50,000 H200s in the coming months.
  • Colossus could help train Grok 3 to become the most powerful AI by every metric.

A while ago, Elon Musk announced that his team at xAI planned to begin training Grok on "the most powerful AI training cluster in the world," with the goal of making the AI-powered chatbot "the world's most powerful AI by every metric."

Recently, Elon Musk announced that the world's most powerful AI training system is officially online. The system, dubbed Colossus, is powered by 100,000 NVIDIA H100 GPUs and will be used to train xAI's models. The company plans to double the cluster to 200,000 GPUs with an additional 50,000 NVIDIA H100s and 50,000 H200s in the next few months.

According to Elon Musk:

"This weekend, the team brought our Colossus 100k H100 training cluster online. From start to finish, it was done in 122 days. Colossus is the most powerful AI training system in the world. Moreover, it will double in size to 200k (50k H200s) in a few months. Excellent work by the team, Nvidia and our many partners/suppliers."

While details on the cost of bringing the Memphis, Tennessee-based project to completion remain slim at best, NVIDIA's H100 sells for around $30,000 per unit, which puts the bill for 100,000 of them at roughly $3 billion. That closely aligns with the tech mogul's estimated $3–4 billion expenditure on sourcing GPUs. And the GPUs are only part of the story: AI ventures at this scale also consume enormous amounts of electricity and cooling water.
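For a rough sense of scale, here is a minimal back-of-envelope sketch of that figure. It uses only the numbers cited above (100,000 H100s at roughly $30,000 apiece); the per-unit price is an approximate market figure, not a confirmed xAI invoice, and the total ignores networking, storage, power, and cooling infrastructure.

```python
# Back-of-envelope estimate of Colossus' GPU bill, using figures cited in this article.
# Assumption: ~$30,000 per NVIDIA H100 (approximate market price, not xAI's actual cost).

h100_count = 100_000       # GPUs reported online in the initial cluster
h100_unit_price = 30_000   # approximate price per NVIDIA H100, in USD

gpu_spend = h100_count * h100_unit_price
print(f"Estimated GPU spend: ${gpu_spend / 1e9:.1f} billion")  # -> Estimated GPU spend: $3.0 billion
```

That ~$3 billion result lands at the low end of the reported $3–4 billion estimate, before accounting for the planned expansion or any of the surrounding data center infrastructure.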

Grok 2 recently shipped exclusively to X Premium and X Premium+ subscribers with image and text generation capabilities. Lauded for its uncensored nature, the LLM was trained on 15,000 GPUs. With Colossus' launch, the trajectory for Grok 2's successors is promising, since they'll have access to 100,000 NVIDIA H100 GPUs for training, and potentially 200,000 once the planned expansion lands.

RELATED: Grok 2 outperforms Anthropic's Claude 3.5 Sonnet and even GPT-4-Turbo

Elon Musk plans to ship Grok 2's successor by December, which he claims will be the most powerful AI by every metric. The push comes amid rising privacy concerns after users discovered that their X data was being used to train the AI model by default, without their consent. Regulators are investigating the matter, and if X can't establish a legal basis for the practice, it could face fines of up to 4% of its global annual turnover.

Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with extensive experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.