Report claims that OpenAI has burned through $8.5 billion on AI training and staffing, and could be on track to make a $5 billion loss
Generative AI brings in billions of dollars of revenue, but not enough to cover the operational costs.
The average PC user will probably think 'ChatGPT' whenever generative AI is mentioned, as it's arguably the world's most well-known and widely used LLM system. An investigative report into the financial status of its creator, OpenAI, suggests that despite its popularity, the cost of training the AI models and running the servers that host them is so high that the company is on track to make an operational loss of $5 billion.
That's according to The Information (via Windows Central), which based its projection on figures gleaned from unreleased internal financial statements and various industry sources. Founded in 2015, OpenAI has grown in size and scope thanks to multiple investments, with Microsoft pouring billions of dollars into the AI firm over the years.
The most recent injection of cash, $10 billion in early 2023, was rumoured to include a 75% slice of OpenAI's profits and a 49% stake in the company, as well as the integration of ChatGPT into Bing and other Microsoft systems. In return, as well as the investment itself, OpenAI is thought to get access to Azure cloud servers at a substantially reduced rate.
However, The Information's report suggests that OpenAI just isn't making anywhere near enough money and could be on track to post an operational loss of $5 billion by the end of the financial year.
It's claimed that OpenAI has spent roughly $7 billion on LLM (large language model) training and inference, and as much as $1.5 billion on staffing. Other analysts have estimated that it costs on the order of $700,000 per day to run ChatGPT, due to the expense of Nvidia's AI servers, though pinning down the exact figure isn't realistically feasible. Combine all of that and you're looking at well over $8 billion in annual costs, and while generative AI does bring in billions of dollars of revenue, it's clearly not enough to cover the outgoings. So why is the shortfall so large?
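For a rough sense of how those numbers stack up, here's a quick back-of-the-envelope sum in Python. It assumes the $700,000-per-day ChatGPT figure applies for a full year and isn't already folded into the $7 billion training and inference estimate, so treat it as an illustration of the reported figures rather than anything official.

```python
# Rough estimate of OpenAI's reported annual costs, using the third-party
# figures quoted above. These are assumptions, not official numbers.
training_and_inference = 7.0e9   # ~$7 billion on LLM training and inference
staffing = 1.5e9                 # up to ~$1.5 billion on staffing
chatgpt_per_day = 700_000        # estimated daily cost of running ChatGPT
chatgpt_per_year = chatgpt_per_day * 365

total = training_and_inference + staffing + chatgpt_per_year
print(f"Estimated annual costs: ${total / 1e9:.2f} billion")
# Prints roughly $8.76 billion, in line with the ~$8.5 billion headline figure
# once the estimated ChatGPT running costs are added on top.
```

Whichever way you slice it, the reported costs comfortably outstrip the billions in revenue coming in.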
The problem appears to be a multi-faceted one. OpenAI is heavily invested in being the first company to make AGI (artificial general intelligence) a reality, and that will certainly be eating into profits. It's also not the only company developing generative AI systems and faces competition from Anthropic, Amazon, Google, Nvidia, Meta, xAI, and numerous others—while ChatGPT is the most well-known, it's getting an increasingly small slice of the total pot of revenue that's up for grabs.
While The Information claims that OpenAI could potentially go bankrupt, this isn't the first time such a thing has been said. Dire warnings about the AI company's future were raised last year, though other analysts pointed out that the relative costs of AI training, inference, and chip manufacturing should decrease over time, as more companies get involved in the market.
Given the size of Microsoft's total investment so far, I don't think OpenAI is going to bring in the administrators just yet. Earlier this year, it was estimated to be worth $80 billion, and while that's hardly chump change, Microsoft could afford to buy it outright if things really did look like they were heading down the drain.
Then again, such a purchase would undoubtedly be investigated by antitrust regulators around the world. OpenAI could raise what it charges for ChatGPT queries to improve revenue, but that would likely reduce the number of daily users.
Perhaps OpenAI's main hope for the future is that cheaper AI servers become available to help reduce operational costs, because with sales of much-hyped AI PCs apparently being driven by interest in better battery life rather than their AI features, the processing load isn't going to shift onto users' own machines any time soon.
Nick, gaming, and computers all first met in 1981, with the love affair starting on a Sinclair ZX81 in kit form and a book on ZX Basic. He ended up becoming a physics and IT teacher, but by the late 1990s decided it was time to cut his teeth writing for a long-defunct UK tech site. He went on to do the same at MadOnion, helping to write the help files for 3DMark and PCMark. After a short stint working at Beyond3D.com, Nick joined Futuremark (MadOnion rebranded) full-time, as editor-in-chief for its gaming and hardware section, YouGamers. After that site shut down, he became an engineering and computing lecturer for many years, but missed the writing bug. Cue four years at TechSpot.com and over 100 long articles on anything and everything. He freely admits to being far too obsessed with GPUs and open world grindy RPGs, but who isn't these days?