What OpenAI’s Alleged ‘Top 30’ Customers Reveal About Enterprise AI Adoption

Learn what OpenAI’s alleged top 30 customers reveal about trends in enterprise AI adoption.

Written By

Joshua
Reading time
» 6 minute read 🤓

OpenAI’s alleged “top 30” enterprise users: what this leak hints at, and why it matters

A table has been doing the rounds on Reddit, claiming to list OpenAI’s 30 biggest customers by token usage – reportedly more than 1 trillion tokens processed. OpenAI hasn’t confirmed the list, but if it’s even directionally right, it offers a useful snapshot of how enterprise AI adoption is taking shape.

Before we get carried away: the source is unverified, and several details are not disclosed. Still, the mix of companies named and the categories they fall into align with what we’re seeing in the market across the UK and beyond.

Source: Reddit discussion.

Who’s reportedly in the “top 30”, and what patterns stand out

The list blends AI-native startups (Perplexity, Cognition, Sider AI), vertical specialists (Abridge in healthcare, Tiger Analytics in services), developer infrastructure (JetBrains, Warp.dev, Datadog), and large SaaS platforms (Salesforce, Shopify, Zendesk, Notion). There are also consumer-scale brands like Duolingo, Canva, WHOOP, and a major telco (T-Mobile).

The Reddit post groups them into four archetypes. Here’s how that maps out:

Archetype | What it means | Examples (from the list)
AI-Native Builders | Products fundamentally designed around reasoning and LLM-first workflows. | Cognition, Perplexity, Sider AI
AI Integrators | Established platforms embedding AI into existing customer workflows. | Shopify, Salesforce, Notion, Zendesk, HubSpot
AI Infrastructure | Developer tools and platforms enabling building, routing, and observability. | OpenRouter, Warp.dev, JetBrains, Datadog
Vertical AI Solutions | Domain-specific apps optimised for a single industry or task. | Abridge (clinical notes), WHOOP (health), Tiger Analytics (services)

That spread matters. It suggests AI is not just a feature; it’s becoming a stack. From routing and DevOps, to product-layer assistants, to specialist vertical tools, value is accruing across the pipeline – not just at the chatbot front-end.

About that “token war”: economics and scale

Tokens are the basic billing unit for most large language models (LLMs). As a rule of thumb, 1,000 tokens is roughly 750 words of English text. If the leak is accurate, these firms are consuming tokens at extreme scale, either in product features (e.g. AI search, meeting notes) or in behind-the-scenes automations.
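To make the rule of thumb concrete, here's a back-of-envelope sketch of how word counts translate into tokens and cost. The per-1,000-token prices below are illustrative placeholders, not OpenAI's actual rates — check the pricing page for current figures.

```python
# Rough token/cost estimator using the ~750 words per 1,000 tokens rule of thumb.
# The price arguments are illustrative placeholders, NOT real OpenAI rates.

WORDS_PER_1K_TOKENS = 750

def estimate_tokens(word_count: int) -> int:
    """Approximate token count from a word count."""
    return round(word_count * 1000 / WORDS_PER_1K_TOKENS)

def estimate_cost(prompt_words: int, completion_words: int,
                  price_in_per_1k: float = 0.005,    # hypothetical input price (USD)
                  price_out_per_1k: float = 0.015) -> float:  # hypothetical output price (USD)
    """Back-of-envelope cost for a single request, in USD."""
    tokens_in = estimate_tokens(prompt_words)
    tokens_out = estimate_tokens(completion_words)
    return (tokens_in / 1000) * price_in_per_1k + (tokens_out / 1000) * price_out_per_1k

# A 1,500-word prompt is roughly 2,000 tokens under this rule of thumb.
print(estimate_tokens(1500))  # → 2000
```

Even at these toy prices, the arithmetic shows why long prompts and verbose completions dominate spend at scale.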

“Whoever compounds reasoning the fastest shapes the next decade of software.”

Token economics drive every design decision: prompt length, context window, retrieval pipelines, and model choice. If you’re rolling out AI at work, monitoring tokens is non-negotiable. Pricing varies by model and tier; see OpenAI pricing for current rates, and plan for variability as models change.

One claim in the Reddit post is that “over 70% of ChatGPT usage is non-work” – advice, planning, personal writing. That’s not confirmed by OpenAI here, so treat it with care. It does align with observed behaviour: consumer-scale habits often precede enterprise adoption, and those habits inform the product surface area that integrators then bring into organisations.

Why UK teams should care: risk, compliance, and opportunity

Data protection and residency

UK organisations need clarity on where data goes, how it’s stored, and who can access it. If you require regional controls, consider Azure OpenAI Service, which offers enterprise-grade compliance and data residency options via Microsoft’s cloud regions (including UK regions). For direct OpenAI use, review data usage and retention settings carefully and ensure a data processing agreement is in place.

Procurement and lock-in

Heavy token usage can lead to vendor concentration risk. Build abstraction layers where possible (e.g. model routers, internal APIs) so you can trial alternatives without rewriting your entire stack. The presence of OpenRouter and Datadog on the list reflects the importance of routing, monitoring, and observability as token volumes grow.
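As a minimal sketch of the abstraction-layer idea: call sites depend on a router, not a vendor SDK, so trialling an alternative model means re-registering a backend rather than rewriting every caller. The class, backend names, and stub functions below are all hypothetical illustrations, not any vendor's API.

```python
# Minimal model-router sketch: application code calls ModelRouter, never a
# vendor SDK directly. Backend names and stubs are hypothetical illustrations.
from typing import Callable, Dict

CompletionFn = Callable[[str], str]

class ModelRouter:
    def __init__(self) -> None:
        self._backends: Dict[str, CompletionFn] = {}

    def register(self, name: str, backend: CompletionFn) -> None:
        self._backends[name] = backend

    def complete(self, prompt: str, model: str = "default") -> str:
        # Swapping vendors means swapping a registered backend,
        # not rewriting your entire stack.
        return self._backends[model](prompt)

router = ModelRouter()
router.register("default", lambda p: f"[stub-a] {p}")   # stand-in for vendor A
router.register("fallback", lambda p: f"[stub-b] {p}")  # stand-in for vendor B

print(router.complete("Summarise this ticket"))  # → [stub-a] Summarise this ticket
```

In production the stubs would wrap real provider clients, and the router is also a natural place to hang the monitoring and observability that tools like OpenRouter and Datadog provide.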

Model reliability and safety

Even the strongest LLMs hallucinate. For regulated workflows (healthcare, legal, finance), combine models with retrieval-augmented generation (RAG) and clear human-in-the-loop checkpoints. Track failure modes, run A/B tests, and establish model performance baselines before scaling.
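The RAG-plus-checkpoint pattern can be sketched in a few lines. This toy version uses keyword overlap as a stand-in for real vector retrieval, and a confidence threshold as the human-in-the-loop gate; the documents, threshold, and field names are all illustrative assumptions.

```python
# Toy RAG-with-checkpoint sketch: retrieve context by keyword overlap
# (a stand-in for vector search), then gate low-confidence answers for
# human review. The 0.8 threshold and all data below are illustrative.

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Rank docs by shared words with the query, return the top k."""
    q = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(q & set(d.lower().split())), reverse=True)
    return ranked[:k]

def answer_with_checkpoint(query: str, docs: list[str], confidence: float) -> dict:
    """Attach retrieved context and flag low-confidence outputs for a human."""
    return {
        "context": retrieve(query, docs),
        "needs_human_review": confidence < 0.8,  # illustrative threshold
    }

docs = ["patient notes template", "billing codes reference", "clinical summary guide"]
result = answer_with_checkpoint("summarise clinical notes", docs, confidence=0.6)
print(result["needs_human_review"])  # → True
```

The point is structural: grounding the model in retrieved context narrows the hallucination surface, and the explicit review flag makes the human checkpoint a first-class part of the pipeline rather than an afterthought.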

Practical steps for UK developers and product teams

  • Pick your archetype: are you integrating AI into an established workflow, or building an AI-native product? The build-versus-integrate choice sets your architecture and cost profile.
  • Instrument token spend early: log prompts, completions, errors, latency, and per-feature costs. Tie token consumption to business metrics (leads, resolved tickets, reduced minutes per task).
  • Start with contained, high-ROI workflows: meeting notes, support triage, sales email drafts, or code review assistance. Keep humans in the loop until you have confidence in outputs.
  • Use governance by design: prompt templates, content filters, role-based access, red-team tests, and clear user messaging on limitations and data handling.
  • Meet users where they are: simple connectors into spreadsheets, docs, CRM, and ticketing systems beat flashy demos. If you’re experimenting, here’s a practical guide to wire AI into Sheets: How to connect ChatGPT and Google Sheets.
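The "instrument token spend early" step above can be sketched as a simple per-feature ledger. In production the token counts would come from your provider's usage metadata on each response; here they are passed in directly, and all names and figures are hypothetical.

```python
# Per-feature token ledger sketch. In production, prompt/completion token
# counts come from the provider's response usage metadata; here they are
# passed in directly. All feature names and numbers are hypothetical.
from collections import defaultdict

class TokenLedger:
    def __init__(self) -> None:
        self._totals = defaultdict(lambda: {"prompt": 0, "completion": 0, "calls": 0})

    def record(self, feature: str, prompt_tokens: int, completion_tokens: int) -> None:
        row = self._totals[feature]
        row["prompt"] += prompt_tokens
        row["completion"] += completion_tokens
        row["calls"] += 1

    def report(self) -> dict:
        """Average tokens per call, per feature — the number to tie to business metrics."""
        return {
            feature: {**row, "tokens_per_call": (row["prompt"] + row["completion"]) / row["calls"]}
            for feature, row in self._totals.items()
        }

ledger = TokenLedger()
ledger.record("support_triage", prompt_tokens=1200, completion_tokens=300)
ledger.record("support_triage", prompt_tokens=900, completion_tokens=250)
print(ledger.report()["support_triage"]["tokens_per_call"])  # → 1325.0
```

With per-feature figures like these, tying token consumption to leads, resolved tickets, or minutes saved becomes a straightforward join rather than guesswork.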

What we don’t know (yet)

  • Verification: OpenAI has not confirmed the table. Treat it as indicative, not definitive.
  • Token totals: It’s unclear whether “over 1 trillion tokens” applies per customer or in aggregate. Not disclosed.
  • Timeframe: No dates are provided, so we don’t know over what period the usage was measured. Not disclosed.
  • Model mix: No breakdown by model family (e.g. GPT-4o variants) or modality (text, image, audio). Not disclosed.

The competitive landscape: where value may accrue

If this leak is even half right, two themes stand out. First, integrators with distribution (Salesforce, Shopify, Zendesk, Notion) can ship AI into existing workflows at pace. Second, AI-native builders focused on reasoning (Perplexity’s search, Cognition’s coding agents) are pushing the frontier and creating new usage patterns that everyone else will copy.

For the UK, the take-away is simple: AI capability is becoming a competitive advantage. Whether you buy it (integrate), build it (AI-native), or run it (infrastructure), the teams that instrument cost, quality, and safety from day one will be better placed to scale when the proof-of-concept glow fades.

Final word

The token war has already started.

Even if the list is wrong in places, the direction of travel is right. Track your tokens, pick the right architecture for your goals, and build the governance to match your ambition. The rest is execution.

Last Updated

October 12, 2025

