OpenAI: the Next Tech Giant?
Reading the tea leaves and reacting to the hype
It’s pretty rare for a company to transition from a research lab to a developer infrastructure provider to a behemoth consumer app in just a few years. But given the launch of ChatGPT plugins last week, there’s a fair chance this could end up being the story of OpenAI.
Plugins allow ChatGPT to browse the web and interact with services like Kayak and Instacart to perform tasks for users beyond just generating text. The news marks OpenAI’s definitive step out of the land of research and into a vastly ambitious and uncertain new world, competing to earn its place as perhaps the newest tech giant, alongside Google, Microsoft, Apple, and Facebook.
Packy McCormick, never one to shy away from a bold optimistic claim, expressed his prediction for how this will turn out with the following meme:
Even if you’re not as bullish as Packy, it’s clear this is an important moment. My hunch is that soon we will all be using ChatGPT with plugins—or something like it—almost every day, for the foreseeable future. I don’t know who will be the dominant market player, or how the value will get captured, but the product/market fit of this emerging category of AI chat product is very real. It is on the same level of importance as the PC, the web browser, the search engine, and the smartphone. Seriously.
Last week, on the day that ChatGPT plugins were announced, I was at an AI conference hosted by Sequoia in San Francisco where Sam Altman was speaking. The announcement tweet dropped at 10 a.m., and you could practically see the news ripple through the attendees. Did you hear? Have you tried it? Wow, you had the beta? Damn…is it good?
Also, keep in mind that these were not impressionable or easily excited rubes! Many of the smartest investors and CEOs in AI were at this event, and they instantly understood the significance of this announcement.
It’s now possible, perhaps even plausible, that OpenAI considers itself primarily a consumer business. It may have started as a research lab and in recent years evolved into an AI infrastructure provider, but this may not be its final form. Over at Stratechery, Ben Thompson even went so far as to predict that OpenAI should and eventually may shut off all API access to developers, calling it a waste of resources and a distraction.
So—what’s going on? There are so many questions:
- What are ChatGPT plugins?
- Why are they so important?
- Is OpenAI going to become the next tech giant? Spoiler: I’m excited about the product, but slightly less bullish on the business than most.
- Should developers using OpenAI’s APIs be concerned?
It’s obviously a rapidly evolving situation, but I am going to do the best I can to answer all these questions and more. I’ve structured this week’s post as a sort of skimmable FAQ, because I want to cover all the basics even though I know a lot of you already know them. Feel free to skip to the parts that are most interesting to you!
(Maybe next time I’ll just write this essay as an input to a chatbot that you can talk to. 😅)
What are ChatGPT plugins?
In a nutshell, they let ChatGPT perform actions besides just generating text, and they allow ChatGPT to access external information that is not included in its training data.
In the past, developers were able to use OpenAI’s APIs in their own products. Now OpenAI is using developers’ APIs in its product.
So far, OpenAI has built three first-party plugins:
- Web browser: Capable of searching the web, clicking links, and finding current information.
- Code interpreter: Can run Python code and read the output in a sandboxed environment.
- Retrieval: This one is a little different from the first two, in that it’s more of a template that others can use to build their own plugins than a finished plugin of its own. Basically, it lets you upload a body of text and allows ChatGPT to use that text to answer questions. If you’ve ever seen a project like “chat with a book” or “chat with a newsletter,” this is a way to do that inside ChatGPT.
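The retrieval pattern behind this plugin is easy to sketch: rank stored text chunks by similarity to the question, then hand the best match to the model. Below is a toy, self-contained Python illustration of that core idea. (The real retrieval plugin uses OpenAI embeddings and a vector database; the word-count “embedding” here is just a stand-in to keep the sketch runnable.)

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy "embedding": a word-count vector. The real plugin uses
    # OpenAI embedding vectors stored in a vector database.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=1):
    # Return the k chunks most similar to the query; these would be
    # injected into the prompt for ChatGPT to answer from.
    q = embed(query)
    ranked = sorted(chunks, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

chunks = [
    "Plugins let ChatGPT call external APIs.",
    "The code interpreter runs Python in a sandbox.",
]
print(retrieve("can ChatGPT run Python code?", chunks))
# → ['The code interpreter runs Python in a sandbox.']
```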
In addition to these three first-party plugins, OpenAI has partnered with 11 companies to build third-party plugins for ChatGPT. It seems like it wanted to demonstrate a broad variety of use cases. Here’s a screenshot from the announcement page with descriptions for each:
Why are ChatGPT plugins so important?
Large language models (LLMs) like GPT-4 excel at understanding text, reasoning, and following instructions. But they are not capable of storing all the world’s facts or retrieving them with accuracy. Also, on their own, LLMs are only capable of generating text.
Plugins solve many of the biggest issues with ChatGPT in one fell swoop:
- Hallucination — Otherwise known as “making stuff up,” this is one of the main reasons it’s hard to rely on ChatGPT. When you ask it a question, it will sometimes give a false answer rather than telling you it doesn’t know. But grounding the model in accurate information from an external source solves the problem almost completely, and that is exactly what plugins do.
- Stale information — It’s expensive to train an LLM, and once you do, it’s hard to update with new information (like the news of the day or the status of your upcoming grocery delivery). Plugins solve this by injecting fresh information into the prompt.
- Private information — You can’t ask ChatGPT for your bank account balance. That data is changing all the time, and it’s private. But with plugins, JPMorgan Chase (or any other service you use) could allow you to connect your account to ChatGPT.
- Taking actions — While I love reading text, I have to admit that it’s not always what I want. Actions are even more important. Plugins allow LLMs to not just generate text but take actions in external services. Anything you do on your computer or phone, ChatGPT may be able to do for you in the future.
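The fixes for stale and private information boil down to one prompt-building pattern: fetch current data from a service, then inject it into the prompt so the model answers from facts rather than from memory. A minimal sketch of that pattern (the balance string stands in for a real API response; all names here are hypothetical):

```python
def build_prompt(question, fresh_context):
    # Inject externally fetched data into the prompt, and instruct
    # the model to answer only from that data. This is the pattern
    # behind the "stale information" and "private information" fixes.
    return (
        "Answer using only the context below. If the answer is not "
        "in the context, say you don't know.\n\n"
        f"Context:\n{fresh_context}\n\n"
        f"Question: {question}\nAnswer:"
    )

# In a real plugin, this string would come from a live API call.
context = "Checking balance as of 2023-03-29: $1,204.56"
prompt = build_prompt("What is my balance?", context)
print(prompt)
```

The model never needs the fact in its training data; it only needs to read it out of the prompt.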
By solving these four problems, plugins make ChatGPT easily 10x, if not 100x, more useful. That is a big deal, since ChatGPT is already useful enough to have attracted the fastest-growing user base in history.
But besides making the product more useful, plugins also have the potential to create something of a flywheel for OpenAI.
Flywheels are basically virtuous cycles. Every tech giant has a powerful one. For example, Jeff Bezos famously sketched Amazon’s flywheel on a napkin, demonstrating how lower prices and wider selection would allow Amazon to attract more customers, which would help them offer lower prices and attract more sellers, which attracts more customers—you get the idea.
Here’s a potential flywheel that OpenAI may develop around plugins:
Doesn’t [insert startup name] already do this?
Yeah, it probably does. Many developers have built applications that bring external information into LLMs and enable LLMs to take actions. But none of them have as much usage or influence as ChatGPT. So far, only ChatGPT has the scale to get most major companies to adopt it.
It’s similar to when ChatGPT itself launched back in November. Others had built comparable products using GPT-3 already, but none broke out like ChatGPT did. OpenAI has more credibility and can reach more users than most developers.
That being said, it’s not all bad news for developers.
Will OpenAI control the LLM plugin ecosystem? (Plot twist: It won’t!)
Here is where things get very interesting. Up until now, the story has been a relatively simple one: ChatGPT is kind of like the iPhone, and plugins are kind of like apps. Except that’s not the whole story. Not even close.
The way plugins work for ChatGPT is very different from the way iPhone apps work. First of all, iPhone apps are much more complex to build. And when you build them, they only work on the iPhone. But ChatGPT plugins are much simpler, and they are built in a way that any other LLM will be able to make use of them.
So OpenAI will definitely have a lot of influence over how plugins work, but I think there’s a good chance that it will play a role that’s more similar to Apple’s in the podcasting world.
In podcasting, Apple defined a standard format that all podcasters must conform to in order to get their show into the Apple Podcasts app. But the reason there are so many podcast apps is that anyone can make an app that reads from that same format, and anyone can make a directory that lists all the podcasts. It’s not centrally controlled, the way YouTube is. OpenAI plugins work basically the same way.
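To make this concrete: an OpenAI plugin is described by a small JSON manifest that points to a standard OpenAPI spec for the service’s API. Because both files are plainly fetchable, any other LLM product can read and use the same plugins, much as any podcast app can read any show’s RSS feed. Here’s a simplified manifest for a hypothetical todo service (the exact field set may differ from OpenAI’s published spec):

```json
{
  "schema_version": "v1",
  "name_for_human": "Example Todo Plugin",
  "name_for_model": "todo",
  "description_for_human": "Manage your todo list.",
  "description_for_model": "Plugin for adding, listing, and deleting todos.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com"
}
```

Nothing in that file is ChatGPT-specific: any model that can read an OpenAPI spec and generate API calls can use the service.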
That’s why I tweeted yesterday that someone should build an open directory of AI plugins, so that anyone can build them into their product or service.
It’s important to note that this sort of open industry architecture has advantages, but it also has disadvantages. On the plus side, an open, standards-based ecosystem is wonderful because it has such low barriers to entry. Anyone can build something and put it out there. But on the other hand, sometimes standards can get stuck, and they’re easier to change when you have a central player. Mike Mignano, who used to run podcasting at Spotify, wrote about this in Every a few months back. I also wrote on the topic a few years ago.
So will OpenAI become the next tech giant?
That’s the trillion-dollar question, isn’t it?
My current perspective is that OpenAI definitely seems more likely than any other company I know of to reach the same stratospheric heights (in terms of impact, users, and profits) as Google, Apple, Microsoft, Amazon, and Facebook. I agree with most of the bullish analysis that Packy laid out earlier this week. But I still think it’s a long shot.
This is for a few key reasons:
- In general, if we want to be good Bayesians and take the prior likelihood of an event into account when estimating its future likelihood, then we should be very skeptical that any company will grow to be that large. Nature is chaos, and chaos causes most companies to hit a ceiling much sooner than the giants did.
- Most people analyzing the usefulness of plugins don’t take strongly enough into account how hard it will be for OpenAI to control and contain this ecosystem in the way Apple has controlled the iPhone.
- I don’t even think OpenAI wants to control its plugin ecosystem in the same way Apple has. OpenAI easily could have made a more locked-down system, but it chose not to. This might limit its growth, but be good for the world, in the same way that the internet itself is much bigger and more useful than it ever would have been if one company controlled it.
- As I explained a few weeks ago, LLMs have a very simple interface: text in, text out. If you have one LLM that is just as good as another, it will be fairly easy to swap them out without users noticing too much. Now that it’s clear how important this market will be, it’s hard to imagine we won’t see competition starting to catch up. And when they do, it won’t be that hard for developers to switch.
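Here’s what that interchangeability looks like in practice: wrap each provider behind the same text-in, text-out function signature, and swapping models becomes a one-line change. The provider functions below are stubs standing in for real API calls:

```python
from typing import Callable

# "Text in, text out": every LLM provider fits one function type.
LLM = Callable[[str], str]

def openai_llm(prompt: str) -> str:
    # Stub standing in for a real call to OpenAI's API.
    return f"[openai] {prompt}"

def anthropic_llm(prompt: str) -> str:
    # Stub standing in for a real call to Anthropic's API.
    return f"[anthropic] {prompt}"

def answer(question: str, llm: LLM) -> str:
    # Application code depends only on the shared interface,
    # never on a specific provider.
    return llm(f"Q: {question}\nA:")

# Switching providers touches only this argument, not the app code:
print(answer("What are plugins?", openai_llm))
```

This is why a quality lead, on its own, is a weaker moat here than it would be for a product with a deeply proprietary interface.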
Given this, I think OpenAI is going to be an incredibly large and important company, but it’s hard to see it becoming as big or as important as the giants. That being said, I don’t have a huge degree of certainty about this. It could prove me wrong!
Should developers using OpenAI be concerned that it will shut down its APIs?
I think this is incredibly unlikely. Even Ben Thompson, who argued that OpenAI should do it, said he doesn’t think it will for a long time.
ChatGPT may be well on its way to being a lasting consumer hit, but there is still a lot of uncertainty. Google has barely woken up. If it decides to have an LLM generate text for every search a user performs, there will be far less reason to navigate directly to ChatGPT.
Also, OpenAI has an unusually deep partnership with Microsoft and is in many ways dependent on it. Without Microsoft’s infrastructure, OpenAI would not be able to scale as quickly as it has. So is it really going to go all-in on being a consumer company and competing with Bing Chat? I think it’s unlikely.
Microsoft makes a lot more money on Azure and developer services than it does from Bing. Even if Microsoft ends up acquiring OpenAI, I think it’s far more likely that Microsoft would want to roll it into its Azure group than to reserve OpenAI’s APIs exclusively for Bing Chat.
And even in the extremely unlikely event that OpenAI gets rid of its APIs, developers using them would be able to switch to an alternative LLM provider quickly. Claude, from Anthropic, is quite good. Bigger and better open-source models are being released every week. I’m not worried about the rate of progress here.
The bottom line, for me, is that it’s an exciting time to be working in this space. I’ve been in tech for about a dozen years, and I’ve never seen anything move so fast. I remember when the iPhone App Store launched—it was nothing compared to this.
The main takeaway for builders is to just focus on your users and keep making steady progress. If you wait for competitive conditions to settle down and become more predictable, it will be too late. The best thing to do is the simple thing: Make something people want.