Advice for Building in AI

Separating the signal from the noise

The past three months since we launched Lex have been wild. I’ve been heads down doing all the mundane (and fun!) things you need to do to transform an app from an exciting toy into an essential tool—talking to users, fixing bugs, tweaking the funnel, and building features. Meanwhile, the hype around AI has kept growing. There are tens of thousands of smart builders who want to launch startups in AI, and an even greater number of engineers and product leaders at incumbent technology companies that are integrating AI into their existing products.

Because this is all so new, it’s hard to understand clearly, and I see a lot of misconceptions and misunderstandings. This week on Divinations is my attempt to clear them up and offer some practical advice.

Resist broad generalizations

AI is like steel or electricity—a broadly useful underlying technology that is becoming a part of almost everything we use. It’s being integrated into free consumer apps, SMB SaaS, enterprise software, and everything in between.

Be wary of any one-size-fits-all advice about AI companies. One line I’ve heard is that “distribution is everything, because everyone has access to the same models.” That might be true, but the work of engineering viral distribution for a consumer app looks very different from the work of selling $100-per-month software to small businesses. And scaling up distribution for a bad product that fails to retain users is just lighting money on fire.

Ultimately, AI is a new raw ingredient that helps you solve customer problems that weren’t solvable before. The same old fundamental dynamics of whatever type of business you’re in still apply.

AI technology isn’t anyone’s moat—but that doesn’t mean moats won’t be built

The conventional wisdom seems to be that AI companies building at the application layer will struggle because they’re reliant on foundation models built by companies like OpenAI. Building a foundation model is considered more attractive and defensible. I’m not so sure about that.

Just because something is expensive to make or requires a lot of technical expertise does not mean it is defensible. You know what else is hard to make, benefits from economies of scale, and requires rare technical expertise? OLED TVs, SSDs, and batteries. But these are all almost completely commoditized technologies.

Technical sophistication is not a moat in any industry. It can be a temporary hurdle, but it never lasts.

So where does real power and durability come from? From the same sources it always has: economies of scale, network effects, counter-positioning, switching costs, brand, etc. If OpenAI does end up becoming a dominant player for the long run, it will be because of some combination of the above forces, not because it’s too complex to copy.

Technological complexity is enough of a barrier to keep away smaller startups, but as long as at least one other player has the talent and resources to create a good-enough alternative, OpenAI’s bargaining power over customers is decimated.

Products using AI aren’t necessarily ‘wrappers’

The term that is most likely to die in 2023 is “wrapper.” The underlying idea behind the term is that the real value comes from the base model—such as GPT-3 or Stable Diffusion—and the application is a thin layer only useful to the extent that it allows users to access the underlying model.

But the vast majority of products built on these models are not wrappers. They have purpose-built interfaces that perform specific tasks, like sorting sales leads or summarizing legal documents, in ways the base models can’t do on their own. Calling AI-powered apps wrappers would be like calling toasters and Teslas “wrappers for electricity.” Sure, they depend on electricity, but they clearly provide value beyond what a live wire offers.

Ignore the hype

We live in a moment in history where you can launch a simple thing—and tens of thousands of people will tell you it’s amazing. It feels good and the boost it gives you is real, but it’s ephemeral. Take it for what it is.

When we launched Lex, my Twitter DMs were overflowing with interest. Now, it’s more of a steady trickle, but I feel better than ever because I’m spending my time talking to users, writing code, and monitoring the metrics—which are thankfully looking good.

There’s an important difference between growth and hype, and you should care 100% about the former and 0% about the latter. They’re totally independent variables. It’s possible to quietly grow an amazing business, or create a “15 minutes of fame”-type moment around a cool demo or big vision. It’s also possible to have both at the same time.

Just don’t get confused about which is which.

AI-powered applications are mostly not about AI

If you’re building at the so-called “application layer,” most of your time will be spent doing all the things companies in your space usually do: building a great product, refining your go-to-market motion, understanding your usage data, and building your team.

It’s been a funny feeling to spend all my time on this stuff and see how relatively unimportant all the AI theorizing on Twitter and Substack is to my day-to-day challenges. That’s why, until this week, I’ve mostly been writing about regular old principles of product building. It’s just more relevant and useful to my current problems.

This feeling I’ve had reminds me of the famous (possibly misattributed) Picasso quote:

When art critics get together they talk about form and structure and meaning. When artists get together they talk about where you can buy cheap turpentine.

For present purposes, I’d amend it to:

When AI commentators get together, they talk about alignment and defensibility and AGI. When builders get together they talk about where you can buy cheap GPUs.

This sentiment reflects a general truth: excellence in anything is mostly about working hard and getting a lot of small details right. Execution is exponential. The “big idea” is just a seed. How the seed unfolds is what really matters.
