LLMs are the new CPUs

…but is OpenAI the new Intel?

Comments


Hi Nathan,
I do agree that LLMs are becoming the new focal point for computing, but I don't think the memory analogy is apt. As you note, memory is a fully fungible component: when buying from Intel, TI, Samsung, etc., you are buying a product with equivalent performance for a given specification. There is no qualitative difference. But with LLMs, each model outputs very different things, and swapping one for another immediately produces tangible effects. It's more like a Coke vs. Pepsi situation - you can taste the difference! And given that the product spec is entirely open-ended and infinite in scope, I don't think it's even possible to produce a spec-equivalent LLM. Talent, data, data processing, training/inference infra, and distribution (via MSFT) are immense tailwinds that will produce product differentiation that is very tangible and lasting.

Ryan Smith about 2 years ago

It's not a given that the 'best models' will continue to be accessed via API, especially once you factor in cost. Since LLMs can train other LLMs, machine IQ can effectively be photocopied.
https://www.artisana.ai/articles/leaked-google-memo-claiming-we-have-no-moat-and-neither-does-openai-shakes