OpenAI raised $122 billion at an $852 billion valuation three weeks ago. Today, the Wall Street Journal reported that the company missed its own internal revenue targets multiple times in 2026 and fell short of a goal to hit one billion weekly active ChatGPT users by the end of 2025. The result: Oracle dropped 6%, Nvidia fell 5%, and the entire AI infrastructure trade took its worst single-day hit in months.
But the number that matters isn’t the stock price. It’s a warning from inside the building. CFO Sarah Friar told OpenAI’s leadership that she’s worried the company may not be able to pay for its future data center contracts if revenue doesn’t grow fast enough. That’s not a bear case from a short seller. That’s the person who signs the checks.
The Billion-User Goal That Never Arrived
OpenAI set an internal target of reaching one billion weekly active users on ChatGPT by the end of 2025. It didn’t get there. The company hasn’t disclosed its actual user count since announcing 300 million weekly actives in late 2024, which tells you everything about where that trajectory went.
The miss isn’t surprising if you’ve been paying attention. ChatGPT’s consumer growth has flattened since the novelty wore off. The product now competes with Google’s Gemini, baked directly into Search; with Anthropic’s Claude, which is gaining serious traction among developers and enterprises; and with a dozen open-source alternatives that are good enough for most casual use cases. The era of ChatGPT as the only AI game in town ended sometime in mid-2025, and the user numbers reflect that.
Revenue Misses Are the Real Problem
User growth is a vanity metric. Revenue is the one that pays for 500,000 GPUs. And OpenAI missed multiple monthly revenue targets in 2026, according to the WSJ report. The company’s annualized revenue run rate was last reported at roughly $25 billion, but that number was supposed to be climbing much faster than it has.
The culprit, per the report: Anthropic gaining ground in coding and enterprise markets. This is the quiet story that’s been building for months. Claude has become the default coding assistant for a significant chunk of the developer community. Enterprise customers who tested both platforms increasingly chose Anthropic’s offering for reliability, context handling, and safety. Every dollar that went to Anthropic was a dollar that didn’t go to OpenAI’s enterprise revenue line.
Google’s aggressive bundling of Gemini into Workspace and Cloud also cut into the addressable market. When your AI assistant comes free with Gmail, paying $20 a month for ChatGPT Plus becomes a harder sell.
The Data Center Math Doesn’t Work Without Growth
Here’s where Sarah Friar’s warning becomes existential. OpenAI has committed to hundreds of billions of dollars in data center contracts over the next several years. The $300 billion deal with Oracle alone represents a five-year commitment that requires revenue to not just grow, but to compound at rates that would make a SaaS company blush.
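To put a rough number on "rates that would make a SaaS company blush," here's a back-of-envelope sketch. The ~$25 billion run rate and the $300 billion five-year Oracle figure come from the reporting above; the assumption that the contract would be covered evenly out of revenue alone, with no other costs, is mine, purely for illustration.

```python
# Rough sanity check: what revenue growth rate would it take for
# five years of revenue to gross the $300B Oracle commitment,
# starting from a ~$25B run rate? (Hypothetical simplification:
# every revenue dollar goes to the contract, nothing else.)

def cumulative_revenue(start, growth, years):
    """Total revenue over `years` if it compounds at `growth` per year."""
    return sum(start * (1 + growth) ** y for y in range(years))

# Step the growth rate up until five-year revenue covers $300B.
g = 0.0
while cumulative_revenue(25.0, g, 5) < 300.0:
    g += 0.01
print(f"~{g:.0%} annual growth needed just to gross the contract")
```

Under those (deliberately generous) assumptions, the answer lands around 45% compounded annually, before paying for anything else the company does.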
OpenAI’s board directors, according to the WSJ, have started questioning CEO Sam Altman’s push to secure even more computing power despite weakening revenue. This is the first public signal that Altman’s “spend now, monetize later” strategy is facing real internal resistance — not from critics or competitors, but from the people with fiduciary responsibility to OpenAI’s investors.
The math is straightforward. If you’re spending $14 billion a year (OpenAI’s last disclosed loss rate) and revenue growth stalls, the burn rate doesn’t just hold steady — it accelerates as data center commitments come due. You can raise more money, but at some point the dilution is so severe that even an IPO at a $1 trillion valuation doesn’t generate meaningful returns for later investors.
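A minimal sketch makes the acceleration visible. The $14 billion loss rate and ~$25 billion revenue figure come from the article; the 30% growth rate, the $10 billion-per-year step-up in commitments, and the assumption that other costs stay flat are hypothetical placeholders, not reported numbers.

```python
# Toy burn-rate projection (all figures in $B). Only the $14B loss
# and $25B revenue starting points come from the article; the growth
# rate and commitment step-up are illustrative assumptions.

def project_burn(years, revenue=25.0, loss=14.0,
                 revenue_growth=0.30, commitment_step=10.0):
    """Annual burn if revenue compounds at `revenue_growth` while
    data center commitments add `commitment_step` of new cost yearly."""
    costs = revenue + loss  # implied starting cost base
    burn = []
    for _ in range(years):
        burn.append(costs - revenue)
        revenue *= 1 + revenue_growth
        costs += commitment_step
    return burn

b = project_burn(4)
print([round(x, 1) for x in b])
```

Even with healthy 30% growth, the burn widens for the first few years before compounding revenue catches the fixed step-ups — which is exactly the window in which the data center bills come due.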
Why Chip Stocks Took the Hit
Oracle’s 6% drop makes sense — it’s directly exposed to OpenAI’s ability to pay for compute. But why did Nvidia fall 5%? Because OpenAI is the single largest buyer of AI chips on the planet, and if its CFO is worried about paying for data centers, the entire demand curve that underpins Nvidia’s $3 trillion market cap just got called into question.
This is the scenario bears have been warning about for two years: the AI infrastructure buildout is being financed by companies that haven’t proven they can generate enough revenue to justify the spend. OpenAI was supposed to be the proof point — the company that demonstrated AI could be a $100 billion business. If even OpenAI can’t hit its own targets, every hyperscaler’s AI capex plan suddenly looks shakier.
Broadcom and AMD also fell between 3% and 5%. The sell-off wasn’t about OpenAI specifically — it was about what OpenAI’s miss implies for the broader AI demand story.
OpenAI’s Response Was Telling
The company pushed back on the WSJ report with a statement to CNBC: “This is ridiculous. We are totally aligned on buying as much compute as we can and working hard on it together every day.”
Notice what that response does and doesn’t do. It doesn’t deny the revenue misses. It doesn’t deny the user growth shortfall. It doesn’t deny Friar’s internal warning. It reaffirms the commitment to buying more compute — which is precisely the thing the board is now questioning. The response reads less like a rebuttal and more like a CEO who’s doubling down because the alternative is admitting the strategy has a problem.
The Verdict
OpenAI has built the most valuable private company in history on a bet that AI revenue will eventually catch up to AI spending. Today’s WSJ report is the first credible signal that the revenue side of that equation is wobbling — and the warning came from inside the house.
The company still has enormous advantages: brand recognition, a massive user base, and access to virtually unlimited capital. But capital is not revenue. And when your own CFO is telling leadership that the bills might not get paid, investors in the entire AI supply chain — from Oracle to Nvidia to AMD — have every reason to recalibrate.
The AI bubble hasn’t popped. But today, for the first time, you can hear the rubber stretching.