$2,000 per month for next-gen AI models? OpenAI could reportedly hike subscription prices amid bankruptcy claims: "That's a price point for an employee, not a chatbot. The only way it would make any sense is if it was legit AGI"

ChatGPT and Microsoft Logo (Image credit: Daniel Rubino)

What you need to know

  • OpenAI is reportedly considering raising the price of its subscription-based services to as much as $2,000 per month.
  • The ChatGPT maker recently surpassed 1 million paid business users amid claims that it's on the cusp of bankruptcy.
  • OpenAI will reportedly receive another round of funding from Microsoft, NVIDIA, and Apple to remain afloat, pushing its valuation to over $100 billion.

In an odd but not entirely surprising turn of events, OpenAI is reportedly deliberating on increasing the subscription charge for its AI chatbots. As you may know, the startup charges $20 monthly for its ChatGPT Plus service, which gives users priority access to new features, custom GPTs, DALL-E 3 image generation technology, and more. 

According to a report by The Information, the price hike for OpenAI's services could reach as high as $2,000 per month. The reasoning behind the potential change remains unclear, as does whether it would apply to the existing ChatGPT Plus service or be limited to new and more sophisticated models like Strawberry. 

For context, the ChatGPT maker is reportedly developing a new AI model under the codename Strawberry. While details about the project remain slim, Strawberry will reportedly sport enhanced reasoning capabilities, allowing it to handle more complex tasks. 

OpenAI is reportedly leveraging a technique dubbed post-training to develop its new LLMs. The approach lets a model's performance and capabilities be fine-tuned even after the initial training phase, drawing on signals such as human feedback that rates the quality of its responses to queries. 
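The report doesn't spell out how that feedback loop works, but a minimal, hypothetical sketch of the general idea (collect human ratings of a model's answers, then turn them into preference data a post-training step could learn from) might look something like the Python below. The names and data are purely illustrative assumptions, not OpenAI's actual pipeline.

# Hypothetical sketch: turn human ratings of a model's responses into
# preference pairs that a post-training (fine-tuning) step could learn from.
# All names and data here are illustrative, not OpenAI's actual pipeline.

from dataclasses import dataclass

@dataclass
class RatedResponse:
    prompt: str
    response: str
    rating: float  # e.g., a human score from 1 (poor) to 5 (excellent)

def build_preference_pairs(samples: list[RatedResponse]) -> list[tuple[str, str, str]]:
    """Group rated responses by prompt and emit (prompt, preferred, rejected) pairs."""
    by_prompt: dict[str, list[RatedResponse]] = {}
    for sample in samples:
        by_prompt.setdefault(sample.prompt, []).append(sample)

    pairs = []
    for prompt, candidates in by_prompt.items():
        ranked = sorted(candidates, key=lambda s: s.rating, reverse=True)
        best = ranked[0]
        # Pair the best-rated answer against each weaker candidate.
        for weaker in ranked[1:]:
            pairs.append((prompt, best.response, weaker.response))
    return pairs

if __name__ == "__main__":
    feedback = [
        RatedResponse("Summarize the quarterly report", "A concise, accurate summary.", 4.8),
        RatedResponse("Summarize the quarterly report", "An off-topic ramble.", 1.2),
    ]
    for prompt, chosen, rejected in build_preference_pairs(feedback):
        print(f"{prompt!r}: prefer {chosen!r} over {rejected!r}")

In practice, pairs like these would typically feed a reward model or a preference-based fine-tuning run; the broader point is simply that the model keeps improving after its base training ends.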

The news of OpenAI potentially hiking the price of its subscription-based services comes after the company hit a major milestone: surpassing 1 million paid business users (via Reuters). Users across social media have received the news with mixed feelings, with some noting that the rumored price rivals an employee's paycheck. "The only way it would make any sense is if it was legit AGI," one Reddit user concluded.

Last week, ChatGPT surpassed 200 million weekly active users. The firm attributed much of that success to GPT-4o's launch, which also drove ChatGPT's "biggest spike ever" in revenue and downloads and its continued reign on mobile as Microsoft, Google, and Anthropic scramble to catch up.

AI is an expensive venture digging deeper into investors' pockets

OpenAI logo (Image credit: OpenAI)

It's no secret that generative AI is on every major tech corporation's agenda for obvious reasons. Microsoft, NVIDIA, and Apple have traded places as the world's most valuable company in the past few months, a success experts and market analysts attribute to early adoption and integration of the technology into their products and services. The exception is Apple, which only recently dabbled in the space with Apple Intelligence.

Ironically, OpenAI, debatably the face of AI, is reportedly on the cusp of bankruptcy within the next 12 months, with projections indicating it could incur losses amounting to $5 billion. However, Microsoft, NVIDIA, and Apple will reportedly extend the ChatGPT maker's lifeline with another round of funding, a cash infusion that could push OpenAI's valuation to over $100 billion. 

As you may know, OpenAI reportedly spends up to $700,000 per day to keep ChatGPT running. That figure doesn't include AI's high demand for electricity and cooling water. For context, Google and Microsoft reportedly consume more electricity than 100 countries.

Investors in the AI landscape have recently questioned Microsoft's heavy spending on AI projects that have yet to turn a profit. The criticism comes amid reports that AI may have peaked and could be a fad, with an estimated 30% of generative AI projects speculated to be abandoned after proof of concept by the end of 2025. 


Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with lots of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. You'll also catch him occasionally contributing at iMore about Apple and AI. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.

  • wojtek
    No, I wouldn't, but I don't even pay now (nor actually use ChatGPT)... AI bubbles should burst - the sooner the better...
    Reply
  • fjtorres5591
    wojtek said:
    No, I wouldn't, but I don't even pay now (nor actually use ChatGPT)... AI bubbles should burst - the sooner the better...
    See, here's the thing: everybody focuses on and pearl-clutches over chatbots, when the *real* money is in enterprise data processing.

    Remember, big companies have been collecting and mining data for over a generation. Big data is mission critical to them and anything that is faster, more accurate, and more insightful (yes, insightful) in identifying chokepoints and underserved markets is worth big bucks. Way more than even $50k a year. And it's not something that needs to go to every single Data Processing employee anyway.

    That is the biggest market but far from the only one.

    Even "trivialities" like image generation have value for big business. In hollywood they fret over digital actors when the value lies in giving directors a tool for trying different staging options before filming. Instead of a dozen takes, the director can play with a hundred options before the actors hit the set.

    Or look at the glasshouse corporate publishers: these days they rely on outsourced stock images for their covers, with absurdly comical or, worse, generic results. Look up some of the bad-cover websites online. Often the cover has no relation to the story inside, because they no longer have an in-house art department or even hire freelancers for pennies. A good LLM might let the overworked "art director" with two unpaid interns distill a book into a prompt for a quality image-generation app, so the cover might actually bear a resemblance to the book's contents.

    Now, the big publishing houses are run by luddites, but small publishers and author/publishers aren't. They'll use anything that will reduce costs and boost revenues.

    Like it or not, LLMs and focused SLMs are here to stay. For now, they need big expensive datacenters to run. That will not last. The tech is already migrating to PCs, laptops, and even smartphones. Give it two more years, three at the outside, and it will be thoroughly embedded in most local software, and the AI hype will be as dated as the GUI vs CLI wars of a past generation.

    Railing against AI, dreaming of OpenAI magically vanishing in a puff of smoke and brimstone, is as futile as similar FUD campaigns against Microsoft in the '90s, Amazon in the aughts, and ebooks in the teens. All succeeded and the world didn't end. LLMs are just the latest focus of people on soapboxes who are scared of change.

    Time passes, things change, and today's demonic new tech that is "going to destroy the world as we know it" is tomorrow's everywhere tool we can't live without. Most humans adapt. The rest just froth at the mouth impotently because the world goes as it will, not as they want.

    Fighting the future is futile. Find better things to do.
    Reply