OpenAI's ChatGPT is 'only possible' with Microsoft Azure, says Microsoft

Azure Cloud (Image credit: Microsoft)

In a new blog post, Microsoft waxed lyrical about its massive Azure cloud, scaled up to meet the specific demands of OpenAI. 

A few years ago, Microsoft struck a massive deal with OpenAI, a team at the cutting edge of artificial intelligence research, drawing the company (whose name now reads as somewhat ironic) into something of a closed partnership with the tech giant. The collaboration has sparked an AI arms race, one that has seen Google's own rushed efforts become the subject of some ridicule in recent weeks. The threat to Google's search dominance is unlike anything the company has faced in decades, and Microsoft isn't taking this opportunity lying down. 

To that end, Microsoft today claimed that ChatGPT and other OpenAI models are only possible with its own Azure cloud, which has been rapidly and massively scaled up to meet OpenAI's needs. 

"Only Microsoft Azure provides the GPUs, the InfiniBand networking, and the unique AI infrastructure necessary to build these types of transformational AI models at scale, which is why OpenAI chose to partner with Microsoft,” said Scott Guthrie, EVP of AI and Cloud at Microsoft. "Azure really is the place now to develop and run large transformational AI workloads.

An aerial shot of one of Microsoft's massive data centers in Washington state.  (Image credit: Microsoft)

Microsoft discussed the challenges of industrializing AI, which has gone from lab work and niche use cases to commercialization at scale remarkably quickly over the past few months. To meet that demand, Microsoft's Nidhi Chappell, head of product for Azure high-performance computing, described the need to build infrastructure that could train ever-larger AI models, and do so dependably. "There was definitely a strong push to get bigger models trained for a longer period of time, which means not only do you need to have the biggest infrastructure, you have to be able to run it reliably for a long period of time."

To build that infrastructure, Microsoft teamed up with NVIDIA, linking together thousands of NVIDIA A100 GPUs over the firm's Quantum-2 400 Gb/s InfiniBand networking technology. Working together, these GPUs can train on and serve inference over absolutely vast amounts of data, the kind of scale needed to take the dumb chatbots of yesteryear to the eerily natural-sounding responses we see from Bing Chat and ChatGPT today.  
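To give a rough sense of how workloads like this are wired together, here's a minimal, hypothetical sketch of multi-GPU training using PyTorch's DistributedDataParallel with the NCCL backend, which can route its GPU-to-GPU communication over an InfiniBand fabric such as Quantum-2 when one is available. This is illustrative only, not OpenAI's or Microsoft's actual training code, and the tiny model here is a stand-in for a real large language model.

```python
# Minimal multi-GPU training sketch (illustrative; not OpenAI/Microsoft code).
# Assumes PyTorch with the NCCL backend, which uses fast interconnects like
# InfiniBand for inter-GPU communication when the hardware supports it.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    # A launcher such as torchrun sets RANK, LOCAL_RANK, and WORLD_SIZE
    # for each process (one process per GPU).
    dist.init_process_group(backend="nccl")
    local_rank = int(os.environ["LOCAL_RANK"])
    torch.cuda.set_device(local_rank)

    # A toy linear layer stands in for a large transformer model.
    model = torch.nn.Linear(1024, 1024).cuda(local_rank)
    model = DDP(model, device_ids=[local_rank])
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

    for step in range(100):
        x = torch.randn(32, 1024, device=local_rank)  # stand-in training batch
        loss = model(x).pow(2).mean()
        optimizer.zero_grad()
        loss.backward()   # gradients are all-reduced across GPUs via NCCL
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with something like `torchrun --nproc_per_node=8 train.py`, this runs one process per GPU on a single machine; at the scale Microsoft describes, the same basic pattern is stretched across thousands of GPUs and many nodes, which is exactly where high-bandwidth interconnects like InfiniBand earn their keep.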

Microsoft is speeding up its transformation into an AI-first company, integrating OpenAI tech into virtually all of its products. Word, GitHub, Visual Studio, and many more of the company's tools are getting at least some form of AI integration to speed up workflows and make life easier. 

The excitement around AI and its potential to enhance civilization has been flanked by concerns about how it could lead to job losses, improve the quality and accelerate the spread of fake news, and aid nefarious actors in committing all sorts of cybercrime. As with any technological revolution, there will be winners and losers, but it remains to be seen just how many winners, and how many losers, Microsoft's AI tech will create. Will AI enhance the lives of ordinary people, or simply deepen inequality? Either way, the latest experiment in industrial technology continues unabated. 

Jez Corden
Executive Editor

Jez Corden is the Executive Editor at Windows Central, focusing primarily on all things Xbox and gaming. Jez is known for breaking exclusive news and analysis as it relates to the Microsoft ecosystem, all while powered by tea. Follow him on Twitter (X) and Threads, and listen to his XB2 Podcast, all about, you guessed it, Xbox!