OpenAI Is A Bad Business

Edward Zitron

OpenAI, a non-profit AI company that will lose anywhere from $4 billion to $5 billion this year, will at some point in the next six or so months convert into a for-profit AI company, at which point it will continue to lose money in exactly the same way. Shortly after this news broke, Chief Technology Officer Mira Murati resigned, followed by Chief Research Officer Bob McGrew and VP of Research (Post-Training) Barret Zoph, leaving OpenAI with exactly three of its eleven cofounders remaining.

This coincides suspiciously with OpenAI's increasingly-absurd fundraising efforts, where (as I predicted in late July) OpenAI has raised the largest venture-backed fundraise of all time: $6.6 billion, at a valuation of $157 billion.

EDITOR'S NOTE: Not long after this newsletter published, OpenAI's funding round closed. As a result, this newsletter reflects the round as pending, when it has now closed.

Yet despite the high likelihood of the round's success, there are quite a few things to be worried about. The Wall Street Journal reported last week that Apple is no longer in talks to join the round, and while one can only speculate about its reasoning, it's fair to assume that Apple (AAPL), on signing a non-disclosure agreement, was able to see exactly what OpenAI had (or had not) got behind the curtain, as well as its likely-grim financial picture, and decided to walk away. Nevertheless, both NVIDIA (NVDA) and Microsoft (MSFT) are investing, with Microsoft, according to the Wall Street Journal, pushing another $1 billion into the company — though it's unclear whether that's in real money or in "cloud credits" that allow OpenAI to continue using Microsoft's cloud.

Yet arguably the most worrying sign is that SoftBank's Vision Fund will be investing $500 million in OpenAI. While it might seem a little weird to be worried about a half-billion dollar check, SoftBank — best-known for sinking $16 billion or more into WeWork and getting swindled by its founder, and dumping a further €900m into Wirecard, which turned out to be an outright fraud, with one of its executives now a fugitive from justice in Russia — is some of the dumbest money in the market, and a sign that any company taking it is likely a little desperate.

While SoftBank has had a number of hits — NVIDIA, ARM and Alibaba, to name a few — it is famous for piling cash into terrible businesses, like Katerra (a construction company that collapsed in 2021 despite roughly $2 billion of SoftBank investment) and Zume Pizza (a robotic pizza company whose product never worked, and which shut down after raising more than $400 million, with $375 million coming from SoftBank).

No, really, the SoftBank Vision Fund is in a bad way. Last year SoftBank's Vision Fund posted a record loss of $32 billion after years of ridiculous investments, a year after CEO Masayoshi Son promised investors that there would be a "stricter selection of investments." You might think that three years of straight losses would humble Son, and you would be wrong. He said in June that he was "born to realize artificial superintelligence," adding that he was "super serious about it."

One of Son's greatest inspirations (a man Son flew to meet when he was 16 years old, begging simply to see his face) is Den Fujita, the thankfully-dead founder of McDonald's Japan, and the author of a book called "The Jewish Way of Doing Business," which suggested that Jews had taken over the business world and implored businesspeople to copy them, while also suggesting that Jews had settled in Osaka 1000 years ago, making the people there "craftier," a comment that McDonald's had to issue a public apology for.

In any case, OpenAI will likely prevail and raise this round from a cadre of investors that will have to invest a minimum of $250 million to put money behind a company that has never turned a profit, that has no path to profitability, and has yet to create a truly meaningful product outside of Sam Altman's marketing expertise. This round is a farce — a group delusion, one borne of one man's uncanny ability to convince clueless idiots that he has some unique insight, despite the fact that all signs point to him knowing about as much as they do, allowing him to prop up an unsustainable, unprofitable and directionless blob of a company as a means of getting billions of dollars of equity in the company — and no, I don't care what he says to the contrary.

Last week, the New York Times reported that OpenAI would lose $5 billion in 2024 (which The Information had estimated back in July), and that the company intended to increase the price of ChatGPT's premium product to $22-a-month by the end of 2024, and to $44-a-month over the next five years, a pale horse I've warned you of in the past.

Interestingly (and worryingly), the article also confirms another hypothesis of mine — that "fund-raising material also signaled that OpenAI would need to continue raising money over the next year because its expenses grew in tandem with the number of people using its products." In simpler terms: OpenAI will likely raise $6.5 billion in funding, and then have to do so again in short order, likely in perpetuity.

The Times also reports that OpenAI is making estimates that I would describe as "fucking ridiculous." OpenAI's monthly revenue hit $300 million in August, and the company expects to make $3.7 billion in revenue this year (the company will, as mentioned, lose $5 billion anyway), yet the company says that it expects to make $11.6 billion in 2025 and $100 billion by 2029, a statement so egregious that I am surprised it's not some kind of financial crime to say it out loud.

For some context, Microsoft makes about $250 billion a year, Google about $300 billion a year, and Apple about $400 billion a year.

To be abundantly clear, as it stands, OpenAI currently spends $2.35 to make $1.

OpenAI loses money every single time that somebody uses its product, and while it might make money selling premium subscriptions, I severely doubt it's turning a profit on these customers, and it's certainly losing money on any and all power users. As I've said before, I believe there's also a subprime AI crisis brewing, because OpenAI's API services — which let people integrate its various models into external products — are currently priced at a loss, and increasing prices will likely make this product unsustainable for many businesses currently relying on these discounted rates.

As I've said before, OpenAI is unprofitable, unsustainable and untenable in its current form, but I think it's important to explain exactly how untenable it is, and I'm going to start with a few statements:

  • For OpenAI to hit $11.6 billion of revenue by the end of 2025, it will have to more than triple its revenue.
  • At the current cost of revenue, it will cost OpenAI more than $27 billion to hit that revenue target. Even if it somehow halves its costs, OpenAI will still lose $2 billion (the arithmetic is sketched after this list).
  • OpenAI has not shipped anything truly important since the launch of GPT-3.5, and its recent o1 model has not been particularly impressive. It's also going to be much, much more expensive to run, as the "chain-of-thought" "reasoning" that it does requires a bunch of extra calculations (an indeterminate amount that OpenAI is deliberately hiding), and OpenAI can't even seem to come up with a meaningful use case.
  • OpenAI's products are increasingly-commoditized, with Google, Meta, Amazon and even Microsoft building generative AI models to compete. Worse still, these models are all using effectively-identical training data (and they're running out!), which makes their outputs (and by extension their underlying technology) increasingly similar.
  • OpenAI's cloud business — meaning other companies connecting their services to OpenAI's API — is remarkably small, to the point that it suggests there are weaknesses in the generative AI industry writ large. It's extremely worrying that the biggest player in the game only makes $1 billion (less than 30% of its revenue) from providing access to its models.
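
To make the arithmetic behind the first two bullets explicit, here's a minimal sketch in Python using the article's own figures (the $11.6 billion 2025 target, the projected $3.7 billion of 2024 revenue, and the $2.35-spent-per-$1-earned ratio); it's an illustration, not a forecast:

```python
# Rough arithmetic behind the revenue/cost bullets above, using the
# article's own figures. These are estimates, not OpenAI's actual books.

revenue_2024 = 3.7          # billions, projected 2024 revenue
revenue_target_2025 = 11.6  # billions, reported 2025 target
cost_per_dollar = 2.35      # OpenAI reportedly spends $2.35 to make $1

growth_multiple = revenue_target_2025 / revenue_2024
cost_at_current_ratio = revenue_target_2025 * cost_per_dollar
loss_if_costs_halved = cost_at_current_ratio / 2 - revenue_target_2025

print(f"Required growth: {growth_multiple:.2f}x")                                  # ~3.14x
print(f"Cost to earn $11.6B at the current ratio: ${cost_at_current_ratio:.1f}B")  # ~$27.3B
print(f"Loss even if costs halve: ${loss_if_costs_halved:.1f}B")                   # ~$2.0B
```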

And, fundamentally, I can find no compelling evidence that suggests that OpenAI will be able to sustain this growth. In fact, I can find no historical comparison, and believe that OpenAI's growth is already stumbling.

Let's take a look, shall we?

How Does OpenAI Make Money?

To do this right, we have to lay out exactly how OpenAI makes money.

According to the New York Times, OpenAI expects ChatGPT to make about $2.7 billion in revenue in 2024, with an additional $1 billion coming from "other businesses using its technology."

Let's break this down.

ChatGPT Plus, Teams, and Enterprise — 73% of revenue (approximately $2.7 billion).

  • OpenAI sells access to ChatGPT Plus to consumers for $20 a month, offering faster response times, "priority access to new features," and 24/7 access to OpenAI's models, with "5x more messages for GPT-4o," plus access to image generation, data analysis and web browsing. Importantly, OpenAI can use anything you do as training data, unless you explicitly opt out.
  • OpenAI sells access to a "Teams" version of ChatGPT Plus, a self-service product that allows you to share chatbots between team users, costing $25-a-user-a-month if paid annually (so $300 a year per user), and $30-a-user-a-month if paid monthly. At this tier and above, your data is excluded by default from training OpenAI's models.
  • OpenAI sells "enterprise" subscriptions that include an expanded context window for longer prompts (meaning you can give more detailed instructions), admin controls, and "enhanced support and ongoing account management."
    • It isn't clear how much this costs, but a Reddit thread from a year ago suggests it's $60-a-user-a-month, with a minimum of 150 seats on an annual contract.
    • I don’t know for certain, but it’s likely OpenAI offers some kind of bulk discount for large customers that buy in volume, as is the case with pretty much every enterprise SaaS business. I’ll explain my reasoning later in this piece. 
    • Assuming this is the case, that’s bad for OpenAI, as generative AI isn’t like any other SaaS product. Economies of scale don’t really work here, as servicing each user has its own cost (namely, the cloud computing power used to answer queries). That cost-per-user doesn’t decrease as you add more customers. You need more servers. More GPUs. 
    • Cutting prices, therefore, only serves to slash whatever meager margins exist on those customers, or to turn those potentially-profitable customers into a loss center.     

Licensing Access To Models And Services — 27% of revenue (approximately $1 billion).

  • OpenAI makes the rest of its money by licensing access to its models and services via its API. One thing you notice, when looking at its pricing page, is the variety of models and APIs available, and the variation in pricing that exists.
  • OpenAI offers a lot of options: its most powerful GPT-4o model; the less-powerful-yet-cheaper GPT-4o-mini model; the "reasoning" model o1 (and its "mini" counterpart); a "text embeddings" API that is used primarily for tasks where you want to identify anomalies or relationships in text, or classify stuff in text; an "assistants API" for building assistants into an application (which in turn connect to one of the other models, which includes things like interpreting code or searching for files); three different image generation models; three different audio models; and a bunch of older legacy APIs and models.
  • In many cases, customers can get a 50% discount by using the Batch API. This delays completion by as much as 24 hours and requires all tasks to be submitted in one batch, rather than as-and-when. This might be useful for using GPT to dig through masses of data.
    • For example, when using the batch API, the cost of using GPT-4o drops from $5 per 1M input tokens to $2.50, and from $15 per 1M output tokens to $7.50 (a rough cost comparison follows this list).
    • Batch pricing is not available for o1-preview.
    • Additionally, this discount is not available when buying training tokens for fine-tuning models (although you still get the same discount for input and output tokens).
    • Batch pricing is not available for DALL-E, the Assistants API, or the audio models.
    • It’s also not available for GPT-3.5-turbo-instruct and the latest 4-o model.
  • The pricing of these products gets a little messy, much like it does with basically every cloud company.
  • OpenAI also makes around $200 million a year selling access to its models through Microsoft, according to Bloomberg.
  • In conclusion, this means that OpenAI makes roughly $800 million a year by directly selling access to its API, with a further $200 million coming from an external channel.
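
To make the batch discount concrete, here's a back-of-the-envelope comparison for a hypothetical workload. The token volumes below are invented for illustration; the per-million-token prices are the GPT-4o figures quoted above:

```python
# Hypothetical workload: the token counts below are assumed for illustration;
# the per-1M-token prices are the GPT-4o figures quoted in the list above.

input_tokens = 50_000_000    # 50M input tokens (assumed workload)
output_tokens = 10_000_000   # 10M output tokens (assumed workload)

# Standard GPT-4o pricing (per 1M tokens)
standard_input_price, standard_output_price = 5.00, 15.00
# Batch API pricing: 50% discount, but results can take up to 24 hours
batch_input_price, batch_output_price = 2.50, 7.50

def cost(in_tok, out_tok, in_price, out_price):
    return (in_tok / 1_000_000) * in_price + (out_tok / 1_000_000) * out_price

standard = cost(input_tokens, output_tokens, standard_input_price, standard_output_price)
batch = cost(input_tokens, output_tokens, batch_input_price, batch_output_price)

print(f"Standard: ${standard:,.2f}")  # $400.00
print(f"Batch:    ${batch:,.2f}")     # $200.00 — same work, half the price, 24-hour latency
```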

As a result of these numbers, I have major concerns about the viability of OpenAI's business, and the generative AI market at large. If OpenAI — the most prominent name in all of generative AI — is only making a billion dollars a year from this, what does that say about the larger growth trajectory of this company, or actual usage of generative AI products?

I'll get to that in a bit.

First, we've gotta talk about the dollars.

The Revenue Problem

So, as it stands, OpenAI makes the majority — more than 70% — of its revenue from selling premium access to ChatGPT.

A few weeks ago, The Information reported that ChatGPT Plus had "more than 10 million paying subscribers," and that it had 1 million more who were paying for "higher-priced plans for business teams." As I've laid out above, this means that OpenAI is making about $200 million a month from consumer subscribers, but "business teams" is an indeterminate split between Teams ($25-a-user-a-month paid annually) and Enterprise (at least $60-a-user-a-month, paid annually, with a minimum of 150 seats).

One important detail: 100,000 of the 1 million business customers are workers at management consultancy PwC, which has also become OpenAI's "first partner for selling enterprise offerings to other businesses." It isn't clear whether these are Enterprise accounts or Teams accounts, or whether PwC is paying full price (I'd wager it isn't).

Here's how this would play out in revenue terms across several assumed divisions of the customer base, with the assumption that every Teams customer is paying $27.50 (that plan costs either $25 or $30 a month, depending on whether you pay annually or monthly, but for the sake of fairness, I went with the middle ground). From there, we can run some hypothetical monthly revenue numbers based on a million "higher-priced plans for business teams" (a quick sketch of the same arithmetic follows the list).

  • 25% Enterprise, 75% Teams: $35,625,000
  • 50% Enterprise, 50% Teams: $43,750,000
  • 75% Enterprise, 25% Teams: $51,875,000
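
These figures are easy to reproduce; here's a minimal sketch of the same arithmetic, using the assumed $27.50 Teams and $60 Enterprise per-seat prices:

```python
# Reproducing the hypothetical monthly revenue scenarios above. The prices are
# the article's assumptions: $27.50/month per Teams seat (the midpoint of the
# $25 annual / $30 monthly price) and $60/month per Enterprise seat (a guess
# sourced from a year-old Reddit thread).

business_seats = 1_000_000
teams_price, enterprise_price = 27.50, 60.00

for enterprise_share in (0.25, 0.50, 0.75):
    enterprise_seats = business_seats * enterprise_share
    teams_seats = business_seats - enterprise_seats
    monthly_revenue = enterprise_seats * enterprise_price + teams_seats * teams_price
    print(f"{enterprise_share:.0%} Enterprise / {1 - enterprise_share:.0%} Teams: ${monthly_revenue:,.0f}")

# 25% Enterprise / 75% Teams: $35,625,000
# 50% Enterprise / 50% Teams: $43,750,000
# 75% Enterprise / 25% Teams: $51,875,000
```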

Sadly, I don't think things are that good, and I honestly don't think these would be particularly-impressive numbers to begin with.

We can actually make a more-precise estimate by working backwards from the New York Times' estimates. ChatGPT Plus has 10 million customers, making OpenAI around $2.4 billion a year (ten million users spending $20 each month equates to $200 million; multiply that by 12 and you get $2.4 billion). This means that business users make up about $300 million a year in revenue, or $25 million a month.

That is, to be frank, extremely bad. These are estimates, but even if they were doubled, these would not be particularly exciting numbers.

For all the excitement about OpenAI's revenue — putting aside the fact that it spends $2.35 to make $1 — the majority of the money it makes is from subscriptions to ChatGPT Plus for consumers, though one can fairly say there are professionals that use it under the consumer version too.

While 10 million paying subscribers might seem like a lot, "ChatGPT" is effectively to generative AI what "Google" is to search. Ten million people paying for this is table stakes. 

OpenAI has been covered by effectively every single media outlet, is mentioned in almost every single conversation about AI (even when it's not about generative AI!), and has the backing and marketing push of Microsoft and the entirety of Silicon Valley behind it. ChatGPT has over 200 million weekly users, and the New York Times reports that "350 million people use [OpenAI's] services each month as of June" (though it's unclear if that includes those using the API). Collectively, this means that OpenAI — the most popular company in the industry — can only convert about 3% of its users.

This might be because it's not obvious why anyone should pay for a premium subscription. Paying for ChatGPT Plus doesn't dramatically change the product, nor does it offer a particularly-compelling new use case for anyone other than power users. As a company, OpenAI is flat-out terrible at product. While it may be able to attract hundreds of millions of people to dick around with ChatGPT (losing money with every prompt), it's hard to convert them because you have to, on some level, show the user what ChatGPT can do to get them to pay for it… and there isn't really much you can charge for, other than limiting how many times they can use it.

And, if we're honest, it still isn't obvious why anyone should use ChatGPT in the first place, other than the fact everybody is talking about it. You can ask it to generate something — a picture, a few paragraphs, perhaps a question — and at that point say "cool" and move on. I can absolutely see how there are people who regularly use ChatGPT's natural language prompts to answer questions they can't quite articulate (a word that's on the tip of their tongue, a question they're not sure how to phrase), or to brainstorm something, but beyond that, there really is no "sticky" part of this product beyond "a search engine that talks back to you."

That product is extremely commoditized. The free version of ChatGPT is effectively identical to the free version of Anthropic's Claude, Meta's AI assistant, Microsoft's Copilot, and even Twitter's "Grok." They all use similar training data, all give similar outputs, and are all free. Why would you pay for ChatGPT Plus when Meta or Microsoft will give you their own spin on the same flavor? Other than pure brand recognition, what is it that ChatGPT does that Copilot (powered by ChatGPT) doesn't? And does that matter to the average user?

I'd argue it doesn't. I'd also argue that those willing to pay for a "Plus" subscription are likely to use the platform way, way more than free users, which in turn may (as one Redditor hypothesized regarding Anthropic's "Claude Pro" subscription) lose OpenAI money on said premium subscriber. While there's a chance that OpenAI could have a chunk of users that aren't particularly active, one cannot run a business based on selling stuff you hope that people won't use.

A note on “free” products: Some of you may suggest that OpenAI having 350 million free users may be a good sign, likely comparing it to the early days of Facebook, or Google. It’s really important to note how different ChatGPT is to those products. While Facebook and Google had cloud infrastructure costs, they were dramatically lower than OpenAI’s, and both Facebook and Google had (and have) immediate ways to monetize free users.

Both Meta and Google monetize free users through advertising that is informed by their actions on the platform, which involves the user continually feeding the company information about their preferences based on their browsing habits across their platforms. As a result, a “free” user is quite valuable to these companies, and becomes more so as they interact with the platform more.

This isn't really the case with OpenAI. Each free user of ChatGPT is, at best, a person who can be converted into a paying user. While OpenAI can use their inputs as potential training data, that value is infinitesimal compared to its operating costs. Unlike Facebook and Google, ChatGPT's most frequent free users actually become less valuable over time, and become a burden on a system that already burns money.

I'll touch on customer churn later, but one more note about ChatGPT Plus users: as with any other consumer-centric subscription product, these customers are far more likely to cut their spending when they no longer feel like they're getting value from the product, or when their household budgets demand it. Netflix — the biggest name in streaming — lost a million customers in 2022, around the time of the cost-of-living crisis (and, from 2025, it plans to stop reporting subscriber numbers altogether).

ChatGPT Plus is likely, for many people, a “lifestyle product.” And the problem is that, when people lose their jobs or inflation hikes, these products are the first to get slashed from the household budget. 

OpenAI also has a unique problem that makes it entirely different to most SaaS solutions — the cost of delivering the solution. While a 3% conversion of free customers to paying customers might normally be on the low side of "good," most SaaS products are nowhere near as expensive to deliver as software built on generative AI.

There's also another wrinkle.

If the majority of OpenAI's revenue — over 70% — comes from people paying for ChatGPT Plus, then that heavily suggests the majority of its compute costs come from what is arguably its least-profitable product. The only alternative is that OpenAI's compute costs are so high that, despite generating most of its revenue, ChatGPT creates so much overhead that it sours the rest of the business.

You see, ChatGPT Plus is not a great business. It's remarkable that OpenAI found 10 million people to pay for it, but how do you grow that to 20 million, or 40 million?

These aren't idle questions, either. At present, OpenAI makes $225 million a month — $2.7 billion a year — by selling premium subscriptions to ChatGPT. To hit a revenue target of $11.6 billion in 2025, OpenAI would need to grow revenue from ChatGPT customers to more than three times its current level.

If we consider the current ratio of Plus subscriptions to Teams and Enterprise subscriptions — about 88.89% to 11.11% — OpenAI would need to find roughly 18.29 million new paying users (assuming a price increase of $2 a month), while also retaining every single one of its current ChatGPT Plus users at the new price point, for a total of around $7.4 billion, or $616 million or so a month. It would also have to make $933 million in revenue from its business and enterprise clients, which, again, would require OpenAI to more-than-triple its current users. I've sketched this arithmetic below.
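
For anyone who wants to check that arithmetic, here's a rough reconstruction in Python. It simply re-runs the article's assumptions (the $11.6 billion target, the 88.89/11.11 revenue split, and a $22 monthly price), and it lands within rounding distance of the figures above rather than exactly on them, since the original rounding isn't specified:

```python
# Reconstructing the subscriber arithmetic above from the article's own assumptions.
# This is a back-of-the-envelope check, not a forecast.

current_total_revenue = 3.7e9    # projected 2024 revenue
target_revenue_2025 = 11.6e9     # reported 2025 target
chatgpt_revenue = 2.7e9          # current annual ChatGPT subscription revenue
plus_share, business_share = 0.8889, 0.1111   # current Plus vs. Teams/Enterprise split
plus_price_per_year = 22 * 12    # assumed $22/month after a $2 price increase
current_plus_subscribers = 10_000_000

growth_multiple = target_revenue_2025 / current_total_revenue   # ~3.14x
chatgpt_target = chatgpt_revenue * growth_multiple               # ~$8.5B of subscription revenue
plus_target = chatgpt_target * plus_share                        # ~$7.5B from Plus
business_target = chatgpt_target * business_share                # ~$0.9B from Teams/Enterprise

subscribers_needed = plus_target / plus_price_per_year
new_subscribers_needed = subscribers_needed - current_plus_subscribers

print(f"Plus revenue needed:     ${plus_target / 1e9:.1f}B/year")
print(f"Business revenue needed: ${business_target / 1e9:.2f}B/year")
print(f"Plus subscribers at $22/month: {subscribers_needed / 1e6:.1f}M "
      f"({new_subscribers_needed / 1e6:.1f}M new)")
```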

OpenAI's primary revenue source is one of the most easily-commoditized things in the world — a Large Language Model in a web browser — and its competitor is Mark Zuckerberg, a petty king with a huge war chest who can never, ever be fired, even with significant investor pressure. Even if that wasn't the case, the premium product that OpenAI sells is far from endearing, still looking for a killer app a year-and-a-half into its existence, with its biggest competitor being the free version of ChatGPT.

There are ways that OpenAI could potentially turn this around, but even a battalion of experienced salespeople will still need paying, and will have the immediate job of "increase revenue by 300%" for a product that most people have trouble explaining.

No, really. What is ChatGPT? Can you give me an answer that actually explains what the product does? What is the compelling use case that makes this a must-have?

I am hammering this point because this is the majority of OpenAI's revenue. OpenAI lives and dies on the revenue gained from ChatGPT, a product that hasn't meaningfully changed since it launched beyond adding new models that do, for most users, exactly the same thing. While some might find ChatGPT's voice mode interesting, "interesting" just isn't good enough today.

And to drill down further, the majority of OpenAI's revenue is from ChatGPT Plus, not its Enterprise or Teams product, meaning that hiring a sales team is far from practical. How do you sell this to consumers, or professionals? Even Microsoft, which has a vast marketing apparatus and deep pockets, struggled to sell Copilot — which is based on OpenAI's GPT models — with its weird (and presumably expensive) Super Bowl ads, or the countless commercials that dotted the 2024 Olympic Games.

To triple its users, ChatGPT must meaningfully change, and do so immediately, or demonstrate multiple meaningful, powerful use cases that are so impressive that 18 million new people agree to pay $22 a month. That is an incredible — and some might say insane — goal, and one that I do not think this company is capable of achieving.

Yet this is far from the most worrying part of the current OpenAI story.

The Cloud Services Problem

What's astounded me about this whole story is how little of OpenAI's revenue comes from providing other companies the means to integrate generative AI into their systems, for a few big reasons:

  1. Assuming that OpenAI makes $1 billion a year selling API access (and thus letting you integrate its models into your products), it suggests that even the biggest company in generative AI can't find enough customers to make its cloud services viable.
  2. This in turn suggests that there is a remarkably small amount of demand for generative AI integrations, or, considered another way, that the companies connecting to OpenAI aren't making it very much money.
    1. This is when things get confusing. According to The Information, OpenAI is projected to get an annualized $200 million from selling access to its models via Microsoft's Azure OpenAI business, where it gets a 20% cut of all revenue. That suggests that Microsoft was, at the time (in June), on course to make a billion dollars in revenue a year from OpenAI's models. Yet a story from two weeks later suggested that OpenAI was "exceeding what Microsoft makes from an equivalent business." 
    2. Compare that to the story from the New York Times, which suggested that OpenAI is still only on course to make one billion dollars from selling access to its own models — which highly suggests that growth has plateaued for its cloud services, and that Microsoft is making about as much from selling access to OpenAI's models as OpenAI itself is.
    3. It also highlights how ineffective OpenAI’s own sales channels are. If a reseller — which Microsoft effectively is — can match OpenAI’s own sales figures, it suggests that OpenAI isn’t really good at selling its own stuff. Or, perhaps, that the people most likely to use OpenAI’s models and APIs would rather buy access through their existing cloud infrastructure provider than directly from the developer itself. 
    4. And so, it has two options: Either it relies on partnerships and external sales channels, allowing it to potentially increase the gross number of customers, but at the expense of the money it makes, or it can build a proper sales and marketing team.
    5. Both options kinda suck. The latter option also promises to be expensive, and comes with no guarantee of success.
  3. There's very little information about how many developers actively use OpenAI's models and APIs in their code. At the 2024 OpenAI DevDay event — its developer conference, which took place on October 1 — the company said that over 3,000,000 developers are building apps using OpenAI's infrastructure. That works out to about $333 of API revenue per developer.
    1. To be fair, any SaaS company that offers API integrations — think Twilio — has a large chunk of users that are either hobbyists, or people just experimenting with a technology out of curiosity. That demographic inevitably skews the average revenue per developer. 
    2. I imagine that OpenAI — by virtue of being the biggest name in generative AI — has a disproportionate number of those non-paying or low-paying users. People who either made an account, but never actually used it, or spent $10 on building some passion project that they can brag about on Twitter or Hacker News, but never actually commercialize.
    3. I also imagine that some companies are using GPT internally, and that these probably account for a huge proportion of its API/model revenue, especially considering that there aren’t many hugely popular consumer apps using GPT. 
    4. PwC — which, as mentioned earlier, recently bought 100,000 ChatGPT Enterprise seats for its own internal use — also has a custom GPT model that it uses in-house called (and I swear I'm not making this up) ChatPwC. According to its 2023 financial review, PwC has around 364,000 employees.
    5. I can easily imagine this company being one of OpenAI's "whales" — even though the reviews of ChatPwC on the PwC subreddit are mixed at best, with some finding the service actually useful, and others describing it as "absolutely shite."
    6. Let’s go back to Twilio — a company that makes it easy to send SMS messages and push notifications. Over the past quarter, it made around $1bn in revenue. That’s what OpenAI made from renting out its models/APIs over the past year.
    7. Twilio also made roughly $4bn over the past four quarters — which is more than OpenAI’s projected revenue for the entirety of 2024. OpenAI, I remind you, is the most hyped company in tech right now, and it’s aiming for a $150bn valuation. Twilio’s market cap is, at the time of writing, just under $10bn.
    8. Does this sound like an in-demand technology to you? Does this sound like something with a vast, untapped market?
  4. One last thing: At the latest OpenAI DevDay event, the company said it had reduced the cost of accessing its APIs by 99 percent over the past two years — although TechCrunch noted that this was likely due to competition from Meta and Google. 
    1. Remember when I said that generative AI was an incredibly commoditized product? 

In the event that OpenAI and Microsoft are each making about a billion dollars in annualized revenue — again, these are estimates based on current growth trajectories — it heavily suggests that there is...two billion dollars of revenue? For Microsoft and OpenAI combined?

That's god damn pathetic! That's really, really bad! And the fact that these "annualized" figures have changed so little since June means that growth has slowed, because otherwise said numbers would have increased (as "annualized" revenue extrapolates from the most recent month).

Let me explain my reasoning a bit. "Annualized" revenue is a somewhat-vague term that can mean many things, but taking The Information's definition from mid-June (when annualized revenue was $3.4 billion), annualized revenue is "a measure of the past month's revenue multiplied by 12." That suggests that OpenAI's revenue in that month was around $283 million. The New York Times updated that number in its piece, saying that OpenAI made $300 million of revenue in the month of August — which works out to $3.6 billion in annualized revenue — and that OpenAI expects to hit $3.7 billion this year, which works out to a monthly rate of about $308 million.
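
Since these numbers can blur together, here's the same annualization arithmetic as a quick sketch, using The Information's definition (the past month's revenue multiplied by 12):

```python
# "Annualized" revenue, per The Information's definition: last month's revenue x 12.

def annualize(monthly_revenue):
    return monthly_revenue * 12

def implied_monthly(annualized_revenue):
    return annualized_revenue / 12

print(f"${implied_monthly(3.4e9) / 1e6:.0f}M")  # ~$283M/month implied by June's $3.4B annualized figure
print(f"${annualize(300e6) / 1e9:.1f}B")        # $3.6B annualized from August's $300M month
print(f"${implied_monthly(3.7e9) / 1e6:.0f}M")  # ~$308M/month implied by the $3.7B full-year projection
```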

Just so we're abundantly clear — this means that OpenAI, unquestionably the leader, the most-prominent brand, and the first name anybody will think of when integrating generative AI, is only making about $80 million a month selling access to its models?

This heavily suggests that generative AI, as a technology, doesn't necessarily have product-market fit. According to a survey by Andreessen Horowitz earlier in the year, "the 2023 market share of closed-source models [was estimated at] 80 to 90%, with the majority of share going to OpenAI." Another survey from IoT Analytics, published late last year, suggests that number might look a little bit different, with 39% of the market share going to OpenAI and 30% going to Microsoft.

Assuming that the latter numbers are true, this suggests that the generative AI market is incredibly small. If OpenAI — which dominates with, I'd wager, at least a 30% market share — is only making $1 billion a year from selling access to its models at this stage of a massively-inflated hype bubble, there may not even be $10 billion of annual revenue from companies integrating generative AI into their products. And let's be honest, this is where all of the money should be. If this stuff is the future of everything, why is the revenue stream so painfully weak? Why isn't generative AI in everything — not just big apps and services hoping to ride a wave as it crests, oblivious to the imminent collapse — but all apps, and present in a big way, where it's core to the app's functionality?

OpenAI making so little selling access to their models suggests that, despite the hype-cycle, there either isn’t the interest in integrating these products from developers, or these integrations aren’t that useful, attractive or impressive to users. Again, these products are charged on usage — and so, it’s possible for generative AI to be integrated into a service, but not actually drive much revenue for OpenAI. 

While ChatGPT has brand recognition, companies integrating OpenAI’s models in their products are far more indicative of the long-term health of both the company and the industry itself, because if OpenAI can’t convince people to integrate and use this shit, do you really think other companies are succeeding? 

There are two counterarguments:

  1. OpenAI is struggling to sign up developers, and this problem is unique to OpenAI. 
    1. This argument is easily-refuted. Microsoft is on course to make a billion dollars selling OpenAI's models itself this year, which only reinforces my point that the generative AI market is small. Again, if Microsoft can't make more than a billion dollars on this, how much could the rest of the market be making?
  2. OpenAI has tons of adoption, but it's deliberately underpricing its models to scale.
    1. This is perhaps the most plausible, as by its own admission, OpenAI has reduced the cost of its APIs by 99 percent over the past two years.
    2. If that’s the case, then OpenAI has to raise prices. Also, Microsoft has to raise prices. And that will likely spook those who have gotten used to these subsidized prices — which I discussed in The Subprime AI Crisis a few weeks ago — and they’ll start to disentangle themselves from generative AI.
    3. If this is scale, it isn't working very well? Unless, of course, OpenAI has realized that pricing is the only lever it can pull to influence adoption.

While I can’t say for certain, and I’ll happily admit if I’m wrong in the future, the numbers here suggest that OpenAI’s cloud services business — as in integrating its supposedly industry-leading technology into other products — is nowhere near as viable a business as selling subscriptions to ChatGPT, which in turn suggests that there is either a lack of interest in integrating generative AI, a lack of adoption once said integration has happened, or a lack of usage by users of features powered by generative AI itself. 

The only other avenue is that OpenAI isn’t charging what it believes these services are worth.

The “Staying Alive” Problem

As discussed earlier in the piece, OpenAI needs to more-than-triple revenue in the next 15 months to hit $11.6 billion in sales. Furthermore, at its current burn rate, OpenAI is spending $2.35 to make $1 — meaning that $11.6 billion in revenue will cost roughly $27 billion to make.

While costs could foreseeably come down, all signs point to them increasing. GPT-4 reportedly cost $100 million to train, and more complex future models will cost hundreds of millions or even a billion dollars to train. The Information also estimated back in July that OpenAI's training costs would balloon to $3 billion in 2024, and it's fair to assume that models like o1 and GPT-5 (also known as "Orion") will be significantly more expensive to train.

OpenAI also has a popularity problem. While it's usually great news that a product has hundreds of millions of free users, every single time somebody uses the service, it costs OpenAI money, and lots of it. The Information estimated OpenAI will spend around $4 billion on server costs in 2024 to run ChatGPT and host other companies running services using GPT and its other models, effectively meaning that every dollar of revenue is immediately eaten by the costs of acquiring it. And that's before you factor in paying the more than 1,500 people who work at the company (another $1.5 billion in costs alone, and OpenAI expects to hire another 200 by the end of the year), and other costs like real estate and taxes.

As a result, I am now updating my hypothesis. I believe that OpenAI, once it has raised its current ($6.5 billion to $7 billion) round, will have to raise another round of at least the same size by July 2025, if not sooner. To quote the New York Times’ reporting, “fund-raising material also signaled that OpenAI would need to continue raising money over the next year because its expenses grew in tandem with the number of people using its products.”

OpenAI could potentially reduce costs, but it has shown little proof that it's able to, and its one attempt (the more efficient "Arrakis" model) failed to launch.

One would also imagine that this company — now burning $5 billion a year — is doing all it can to try and bring costs down, and even if it isn't, I severely doubt that doing so is possible. After all, Anthropic is facing exactly the same problem, and is on course to lose $2.7 billion in 2024 on $800 million in revenue, which makes me think the likelihood of this being a quick fix is extremely low.

There is, however, another problem, one caused by its current fundraise.

OpenAI is, at present, raising $6.5 billion to $7 billion in capital at a $150 billion valuation. Assuming it completes this raise — which is likely to happen — all future rounds will have to be at $150 billion or higher. A lower valuation (called a down round) would both upset current investors and send a loud signal to the market that the company is having trouble, which overwhelmingly suggests that OpenAI's only way to survive is to raise its next round, likely yet another giant raise of at least $10 billion, at a valuation of $200 billion or more.

For context, the biggest IPO valuation in US corporate history was Alibaba, which debuted onto the New York Stock Exchange with a market cap of nearly $170bn. That figure is more than double the runner-up, Facebook, which had a value of $81bn. Are you telling me that OpenAI — a company that burns vast piles of money like the Joker in The Dark Knight, and has no obvious path to profitability, and a dubious addressable market — is worth more than Alibaba, which, even a decade ago, was a giant of a company?   

Further souring the terms are its prior commitments to Microsoft, which owns 75% of all future profits (ha!) until it recoups its $13 billion investment, after which point it receives 49% of all profits until it is paid $1.56 trillion, or 120 times the original amount, though the move to a for-profit structure may remove the limits on OpenAI's ridiculous "profit participation units," whereby previous investors are given a slice of (theoretical) profits from the least profitable company in the world.
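
To illustrate how lopsided that reported arrangement is, here's a toy model of the profit split as described above. The actual agreement isn't public, the tier mechanics are my simplifying assumptions, and OpenAI has no profits to split in the first place:

```python
# Toy model of the reported Microsoft profit split: 75% of profits until the $13B
# investment is recouped, then 49% of profits until Microsoft has been paid $1.56T
# (120x the original investment). The tier mechanics here are simplifying
# assumptions; the real agreement is not public.

MSFT_INVESTMENT = 13e9
MSFT_CAP = 120 * MSFT_INVESTMENT          # $1.56 trillion total
TIER1_PROFIT = MSFT_INVESTMENT / 0.75     # cumulative profit at which the $13B is recouped

def microsoft_take(cumulative_profit):
    """Microsoft's cumulative payout for a given cumulative OpenAI profit."""
    tier1 = 0.75 * min(cumulative_profit, TIER1_PROFIT)
    tier2 = 0.49 * max(cumulative_profit - TIER1_PROFIT, 0)
    return min(tier1 + tier2, MSFT_CAP)

for profit in (10e9, 50e9, 500e9):
    print(f"${profit / 1e9:,.0f}B of cumulative profit -> Microsoft takes "
          f"${microsoft_take(profit) / 1e9:,.1f}B")
# $10B -> $7.5B | $50B -> $29.0B | $500B -> $249.5B
```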

Nevertheless, Microsoft’s iron grip on the future profits of this company — as well as complete access to its research and technology — means that anyone investing is effectively agreeing to help Microsoft out, and makes any stock (or PPUs) that Microsoft owns far more valuable.

There's also another problem: how OpenAI converts its bizarre non-profit structure into a for-profit, and whether that's actually possible. Investors are, according to the New York Times, hedging their bets. OpenAI has two years from the deal's close to convert to a for-profit company, or its funding will convert into debt at a 9% interest rate.

As an aside: how will OpenAI pay that interest in the event it can't convert to a for-profit business? Will it raise money to pay the interest? Will it get a loan?

As a result, OpenAI now faces multiple challenges:

  1. It must successfully close its current $6.5 billion to $7 billion round within the next month — and the deal has been “almost closing” for a couple of weeks.
  2. It must, on closing said deal, convert itself to a for-profit company, within two years.
  3. In the event that it wishes to grow, OpenAI will have to close another round of funding by the middle of 2025, raising more money (likely $10 billion) at a higher valuation (likely at minimum $175 billion, if not $200 billion to $250 billion).

Also, at some point, OpenAI will have to work out a way to go public, because otherwise there is absolutely no reason to invest in this company, unless you are doing so with the belief that you will be able to offload your shares in a future secondary sale. At that point, OpenAI resembles a kind of investment scam where new investors exist only to help old investors liquidate their assets rather than anyone investing in the actual value of the company itself.

And, after that, OpenAI will still need to raise more rounds of funding. ChatGPT’s free version is a poison in its system, a marketing channel that burns billions of dollars to introduce people to a product that only ten million people will actually pay for, and its future depends largely on its ability to continue convincing people to use it. And in a few months, OpenAI’s integration with Apple devices launches, meaning that millions of people will now start using ChatGPT for free on their iPhones, with OpenAI footing the bill in the hopes they’ll upgrade to ChatGPT Plus, at which point Apple will take $6 of the $20 a month subscription. 

How does this continue? How does OpenAI survive? 

What Are We Doing Here?

I realize I've been a little repetitive in this piece, but it's really important to focus on the fact that the leader in the generative AI space does not appear to make that much money (less than 30% of its revenue) helping people put generative AI in their products, and makes most of its money selling subscriptions to a product that mostly coasts on hype.

While it may seem impressive that OpenAI has 10 million paying subscribers, that’s the result of literally every AI story mentioning its name in almost every media outlet, and its name being on the lips of basically everybody in the entirety of the tech industry — and a chunk of the business world at large. 

Worse still, arguably the most important product that OpenAI sells is access to its models, and it’s the least-viable part of its business, despite “GPT” being synonymous with the concept of integrating generative AI into your company. 

Because access to its APIs isn't a particularly big business, OpenAI's hope of hitting its revenue goals is almost entirely contingent on ChatGPT Plus continuing to grow. While this might happen in the short term (though research suggests that 11% of customers stop paying after a month and 26% after three months), ChatGPT Plus has no real moat, little product differentiation (outside of its advanced voice mode, which Meta is already working on a competitor to), and faces increasing commoditization from other models, open source platforms, and even on-device models (like on Copilot+-powered PCs).

The features that ChatGPT is best known for — generating code, summarizing meetings, brainstorming and generating stuff — can be done on any number of other platforms, in many cases for free, and OpenAI’s biggest competitor to ChatGPT Plus is ChatGPT itself. 

And I cannot express enough how bad a sign it is that its cloud business is so thin. The largest player in the supposedly most important industry ever can only scrounge together $1 billion in annual revenue selling access to the most well-known model in the industry. 

This suggests a fundamental weakness in the revenue model behind GPT, as well as a fundamental weakness in the generative artificial intelligence market writ large. If OpenAI cannot make more than a billion dollars of revenue off of this, then it’s fair to assume that there is either a lack of interest from developers or a lack of interest from the consumers those developers are serving. 

Remember when I said that OpenAI mentioned at its recent DevDay that it had "cut costs for developers to access [their] API by 99% in the last two years" (which Max Zeff of TechCrunch posits may be due to price pressure from Google and Meta)? That suggests that OpenAI really has no room to start charging more for access to its APIs. This technology is commoditized, with few points of differentiation. It can't, for example, raise prices for the sake of a better, more capable product. OpenAI is trapped.

All of this is happening at a time when an astronomical amount of talent is leaving the company — the talent that OpenAI desperately needs to build products that people will actually pay for. 

At this point, it’s hard to see how OpenAI survives.

To continue growing ChatGPT Plus, it will have to create meaningful, mass-market use cases, or hope it can coast on a relatively-specious hype wave, one that will have to be so powerful it effectively triples its users. Even then, OpenAI will have to both find ways to give developers more reasons to integrate its models while making sure that said models provide a service that the end-user will actually appreciate, which is heavily-reliant on both the underlying technology and the ability of developers to create meaningful products with it. 

And OpenAI is getting desperate. According to Fortune, OpenAI’s culture is deeply brittle, with a “relentless pressure to introduce products” rushing its o1 model to market as Sam Altman was “eager to prove to potential investors in the company’s latest funding round that OpenAI remains at the forefront of AI development” despite staff saying it wasn’t ready. 

These aren’t the actions of a company that’s on the forefront of anything — they’re desperate moves made by desperate people burning the candle at both ends.

Yet, once you get past these problems, you run head-first into the largest one: that generative AI is deeply unprofitable to run. When every subscriber or API call loses you money, growth only exists to help flog your company to investors, and at some point investors will begin to question whether this company can stand on its own two feet. 

It can’t. 

OpenAI is a disaster in the making, and behind it sits a potentially bigger, nastier disaster — a lack of any real strength in the generative AI market. If OpenAI can only make a billion dollars as the leader in this market (with $200 million of that coming from Microsoft reselling its models), it heavily suggests that there is neither developer nor user interest in generative AI products. 

Perhaps it’s the hallucination problem, or perhaps it’s just that generative AI isn’t something that produces particularly-interesting interactions with a user. While you could argue that “somebody can work out a really cool product,” it’s time to ask why Amazon, Google, Meta, OpenAI, Apple, and Microsoft have failed to make one in the last two years.

Though ChatGPT Plus is popular, it’s clear that it operates — much like ChatGPT — as a way of seeing what generative AI can do rather than a product that customers love. I see no viable way for OpenAI to grow this product at the rate it needs to be grown, and that’s before considering its total lack of profit.

I hypothesize that OpenAI will successfully close its funding round at a $150 billion valuation in the next few weeks, but that growth is already slowing, and will slow dramatically as we enter the new year. I believe the upcoming earnings from Microsoft and Google will further dampen excitement around generative AI, which in turn will reduce the likelihood that developers will integrate GPT further into their products, while also likely depressing market interest in ChatGPT writ large.

While this bubble can continue coasting for a while, nothing about the OpenAI story looks good. This is a company lost, bleeding money with every interaction with the customer, flogging software that’s at best kind-of useful and at worst actively harmful. Unless something significantly changes — like a breakthrough in energy or compute efficiency — I can’t see how this company makes it another two years.

Worse still, if my hypothesis about the wider generative AI market is true, then there might simply not be a viable business in providing these services. While Meta and OpenAI might be able to claim hundreds of millions of people use these services, I see no evidence that these are products that millions of people will pay for long-term.

For me to be wrong, these companies will have to solve multiple intractable problems, come up with entirely-new use cases, and raise historic amounts of capital. 

If I’m right, we’re watching venture capitalists and companies like Microsoft burn tens of billions of dollars to power the next generation of products that nobody gives a shit about.
