OpenAI announces GPT-4o, its new flagship model, promising GPT-4-level intelligence to everyone, including free users
OpenAI promises GPT-4-level intelligence from its new flagship model, even for free users.
Recent updates
The OpenAI Spring Update event just ended, so we'll update this piece as more information becomes available.
What you need to know
- OpenAI announced GPT-4o at its live stream today.
- GPT-4o is faster than GPT-4 and improves upon its predecessor in several areas.
- GPT-4o promises "GPT-4-level intelligence" while being available to free users.
- Paid users will continue to have five times the capacity of free users.
OpenAI had three pieces of news to share at its spring event. The company opened by unveiling GPT-4o, its latest flagship model. GPT-4o delivers "GPT-4-level intelligence" to everyone, including free users, and improves on its predecessor in several areas, including speed.
GPT-4o is half the price of GPT-4 Turbo but twice as fast and has five times the rate limit, as highlighted by OpenAI CEO Sam Altman.
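For developers curious what using the new model might look like, here is a minimal sketch assuming the "gpt-4o" model identifier and the official OpenAI Python SDK; exact details may differ from OpenAI's developer documentation.

```python
# Minimal sketch: calling the newly announced model through the OpenAI Python SDK.
# Assumes the "gpt-4o" model identifier and an API key in the OPENAI_API_KEY
# environment variable; pricing and rate limits are set on OpenAI's side.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o",  # the new flagship model announced at the event
    messages=[
        {"role": "user", "content": "Summarize OpenAI's spring announcement in one sentence."},
    ],
)

print(response.choices[0].message.content)
```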
The live stream was relatively short, lasting only around 25 minutes. During that time, OpenAI showcased real-time voice communication, video capabilities, and ChatGPT's ability to detect and convey emotion.
OpenAI will roll out the new features unveiled for ChatGPT over the coming weeks. The new voice mode in ChatGPT will first roll out to ChatGPT Plus users.
Real-time audio capabilities
OpenAI also demonstrated GPT-4o's real-time audio capabilities. The demo showed ChatGPT giving feedback on how to deliver a live presentation. The tool was conversational and responded with little delay, though the demo was short. One thing that differentiates this version of ChatGPT is that you can interrupt it. The tool can also sense emotion to some extent, such as identifying, when asked, that someone's breathing indicated stress.
A second demo highlighted how ChatGPT can add emotion to its speech by sharing an over-the-top bedtime story on request.
Later in the presentation, OpenAI had ChatGPT translate back and forth between Italian and English in real time.
Visual capabilities
You can also interact with ChatGPT using video. A demo during the event showed a user filming himself working through a math equation while ChatGPT walked him through the problem. The tool also correctly read a handwritten doodle that said "I 💗 ChatGPT."
The presenters of the live stream also had ChatGPT look at a person's face and identify his emotions. ChatGPT said the presenter had a "big smile and a touch of excitement." When told the reason for the happiness was that he was presenting ChatGPT, the chatbot replied that he was making it blush.
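The live demo used the ChatGPT app's camera feed, but a rough idea of how image understanding could work programmatically is sketched below, assuming the same OpenAI Python SDK and "gpt-4o" identifier as above; the image URL is a placeholder.

```python
# Rough sketch: sending a still image to GPT-4o through the API.
# The on-stage demo used live video in the ChatGPT app, not this call;
# the image URL below is purely illustrative.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text", "text": "What does this handwritten note say?"},
                {"type": "image_url", "image_url": {"url": "https://example.com/doodle.png"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```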
ChatGPT in action
Altman live-tweeted the live stream on X. Here are some of his posts:
and with video mode!! pic.twitter.com/cpjKokEGVd (May 13, 2024)
especially at coding pic.twitter.com/EYC3dMCmNp (May 13, 2024)
audience request to act as a translator pic.twitter.com/E2qbfhyVmX (May 13, 2024)
Sean Endicott is a tech journalist at Windows Central, specializing in Windows, Microsoft software, AI, and PCs. He's covered major launches, from Windows 10 and 11 to the rise of AI tools like ChatGPT. Sean's journey began with the Lumia 740, leading to strong ties with app developers. Outside writing, he coaches American football, utilizing Microsoft services to manage his team. He studied broadcast journalism at Nottingham Trent University and is active on X @SeanEndicott_ and Threads @sean_endicott_.