Elon Musk flaunts 'the most powerful training cluster in the world' that will transform Grok into the 'most powerful AI' by December to take on Microsoft and OpenAI

Elon Musk during an interview (Image credit: BBC News)

What you need to know

  • Elon Musk recently announced that xAI has begun training Grok on "the most powerful AI training cluster in the world."
  • The cluster is powered by 100,000 liquid-cooled NVIDIA H100 graphics processing units (GPUs) connected over a single RDMA fabric.
  • Musk says the AI company will ship Grok 3, which he claims will be "the world's most powerful AI by every metric," in December 2024.

Microsoft and Google have seemingly dominated the AI landscape thanks to their early investment in and adoption of the technology, but new players are entering the space. Apple has joined in with its own AI strategy, Apple Intelligence.

Billionaire and X (formerly Twitter) owner Elon Musk has joined the fray, too. In a post on X, Musk recently announced that his xAI has officially begun training its LLM, Grok, using "the most powerful AI training cluster in the world," dubbed the Memphis Supercluster. The cluster leverages the power of 100,000 liquid-cooled NVIDIA H100 graphics processing units (GPUs) connected over a single RDMA fabric.

"This is a significant advantage in training the world's most powerful AI by every metric by December this year," added Musk. As you may know, there's been a high demand for NVIDIA's H100 GPUs for training AI models. This has heavily influenced NVIDIA's rise to become the most profitable chip brand and, for a few days, the world's most valuable company ahead of Microsoft and Apple.

On paper, Musk seems to have every piece of the AI puzzle in place to give Microsoft and Google a run for their money. He refers to xAI's Grok 3 AI system as "the world's most powerful AI by every metric."

Per Musk's post, xAI could ship the AI system to broad availability as early as December 2024. Demand for AI chips far outstrips the available supply. As such, Microsoft and OpenAI are reportedly investing $100 billion in a project dubbed Stargate to free themselves from overreliance on NVIDIA for AI chips.

xAI could potentially follow Microsoft and OpenAI's route. According to a separate report, xAI and Oracle recently ended discussions on a $10 billion server deal. Musk later announced plans to build a data center for xAI and said he would source the AI chips for the company's projects himself.

It's worth noting the xAI facility is expected to draw up to 150 megawatts of electricity. AI's exorbitant power demands have raised concerns, especially after a recent report highlighted that Microsoft and Google's electricity consumption surpasses the power usage of over 100 countries.
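For a sense of scale, here is a rough back-of-envelope sketch of where a figure like 150 megawatts could come from. The per-GPU wattage and the overhead factor below are assumptions based on NVIDIA's published H100 SXM specifications and typical data center efficiency, not numbers from Musk or xAI:

```python
# Rough, hedged estimate of the Memphis Supercluster's power draw.
# Assumptions (not from the article): ~700 W per H100 SXM GPU under load,
# and a power usage effectiveness (PUE) of ~1.4 to cover cooling, host CPUs,
# networking, and power-delivery losses.

NUM_GPUS = 100_000        # GPU count reported for the cluster
WATTS_PER_GPU = 700       # assumed peak draw of one H100 SXM module
PUE = 1.4                 # assumed facility overhead factor

gpu_power_mw = NUM_GPUS * WATTS_PER_GPU / 1_000_000
facility_power_mw = gpu_power_mw * PUE

print(f"GPU draw alone: ~{gpu_power_mw:.0f} MW")       # ~70 MW
print(f"With overhead:  ~{facility_power_mw:.0f} MW")  # ~98 MW
```

The GPUs alone would account for roughly 70 MW, so a facility-level figure in the low hundreds of megawatts, like the reported 150 MW, is in a plausible range once servers, storage, and cooling are included.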

In a recent interview, Musk announced that xAI's Grok 2 will ship in August and that it will be "on par or close" to OpenAI's GPT-4.


Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with extensive experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.