When Creation Goes To Zero
Skynet Who?

This sounds simple, but it will change the world, so pay attention: The most significant consequence of the internet was that it pushed distribution costs to zero. The most significant consequence of AI is that it will push creation costs to zero.
You can essentially boil down the atomic activity of any company into:
- Create stuff
- Acquire customers to buy that stuff
- Distribute that stuff
If you put that into the context of the income statement, you would see:
- Cost of goods sold
- Sales and marketing expense
- Other operating expenses
The internet broke the third category, distribution, and now AI is going to break the first, creation. Innovations like GPT-3, DALL-E, and other AI tools will dramatically decrease the cost of producing all goods with a digital component (aka everything).
I don’t want to veer into hyperbole, but I’ve never been so excited and so scared simultaneously by a technology category. Theoretically, I knew these changes were coming, but I have been stunned by how quickly they have occurred. My current feeling is that in 5-10 years, the power dynamics of knowledge work will look radically different. Our relationship with information and creation will be fundamentally altered.
For some categories of companies, AI tooling will be a disruptive innovation—one that renders their entire business obsolete. For others, it will be a sustaining innovation that lets them serve a similar set of customers at lower cost.
The biggest question to me is this: when all that is left to compete on is acquisition, what does our economy look like?
This article is what I’m using to try to answer that question.
What Are These AI Tools—and Why Now?
Whenever a technological revolution occurs sooner than expected, it is worth examining what inputs into the progress bar suddenly sped up.
In the case of AI, I would point to two step changes: 1) the introduction of transformer models, which allowed computers to understand text, and 2) the astronomical shit-ton of money and computing power that has been unlocked over the last few years. Moore’s law, the napkin-math rule from Intel co-founder Gordon Moore that transistor counts (and, roughly, computing power) double every two years, has started to run out of steam in CPUs, but GPUs have taken up the banner. It turns out GPUs are great for running deep learning and other machine learning algorithms, so whenever the solution to a problem was unclear, you could theoretically answer it with “What if we just dumped more compute power into the system?” Money coming in from Google, Amazon, and OpenAI removed those resource constraints.
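To make “understand text” a little more concrete, here is a minimal sketch of my own (not something from the article) using the open-source Hugging Face transformers library: a small pretrained transformer fills in a blanked-out word purely from the surrounding context. The model name is just an illustrative choice.

```python
# Minimal sketch: a pretrained transformer using surrounding context to
# predict a masked word. Requires `pip install transformers torch`.
from transformers import pipeline

# Loads a small, general-purpose masked-language model (illustrative choice).
fill_mask = pipeline("fill-mask", model="distilbert-base-uncased")

# The model ranks candidate words for [MASK] based on the rest of the sentence.
for guess in fill_mask("The internet pushed distribution costs to [MASK]."):
    print(guess["token_str"], round(guess["score"], 3))
```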
With the combination of these two step changes, we are now at the point where computers can understand context and then apply that context to a defined set of parameters. It isn’t perfect! There can be weaknesses in the language models or in the application of the text inputs, but it is dramatically ahead of where anyone would’ve predicted five years ago.
The basic flow for these tools: open a text box, type out in plain English what you want the AI to do, and the AI does it. That’s it! No more messy 10-step workflows or languages you need to learn. It is as simple as typing out what you want.
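As an illustration of how little ceremony is involved, here is a hedged sketch written against the GPT-3-era OpenAI Python SDK (the legacy 0.x Completion endpoint; newer versions of the SDK expose a different interface). The model name, prompt, and API key are placeholders.

```python
# Hedged sketch of the "type what you want" flow, using the GPT-3-era
# OpenAI Python SDK (0.x); newer SDKs use a different interface.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# The entire "workflow" is one plain-English instruction.
response = openai.Completion.create(
    model="text-davinci-003",  # illustrative GPT-3 model name
    prompt="Write a two-sentence product description for a reusable water bottle.",
    max_tokens=100,
)

print(response.choices[0].text.strip())
```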
Two weeks ago my colleague Nathan compared this new generation of AI software to what came before as follows,
“Instead of software that mimics a paintbrush, we now have software that mimics the painter.”