
Does OpenAI’s Deep Research Put Me Out of a Job?

An investigation into a Sam Altman tweet




The pending unemployment of millions is a casual thing to announce with a tweet, but such is the age we live in. Last week, Sam Altman posted that, according to his “very approximate vibe,” OpenAI’s new product deep research could “do a single-digit percentage of all economically valuable tasks in the world.” If you assume that each task represents a job, and that “single-digit” means 5 percent, then Altman is saying that last week’s announcement could replace some 8.2 million workers in the United States.
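Where does 8.2 million come from? It's just Altman's percentage applied to the size of the American workforce. Here is a back-of-the-envelope sketch of that arithmetic, assuming a U.S. civilian labor force of roughly 164 million (the figure the math implies; the original tweet names no workforce number):

```python
# Back-of-the-envelope check of the 8.2 million figure.
# Assumption: a U.S. civilian labor force of roughly 164 million,
# which is the number the 5 percent reading implies.
labor_force = 164_000_000
task_share = 0.05  # "single-digit percentage" read as 5 percent

workers_affected = int(labor_force * task_share)
print(f"{workers_affected:,}")  # prints 8,200,000
```

Of course, "tasks" and "jobs" are not the same thing, which is exactly the assumption the rest of this piece stress-tests.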

If he’s right, we could soon start seeing mass layoffs and reduced hiring for anyone whose job involves doing research. As one of the people whose livelihood is directly threatened by this product, I thought this claim was worth stress-testing.

Deep research takes a question you have and scours the internet to answer it via a research report. It is one of the first AI agent products that 1) mostly works and 2) is available for ChatGPT’s hundreds of millions of users to purchase. As such, the discussion around the impact of AI on employment just got less theoretical and more urgent.

The grand promise of AI is that it will automate the majority of existing knowledge work. This sounds cool! It also sounds like breadlines! My personal definition of knowledge work is any task that requires a keyboard—which is quite a lot of what we all do. To figure out if we will see a disruption of “single-digit percentage of all economically valuable tasks,” we need to start by attempting to answer the following questions:

  1. Has any other technology reached similar levels of disruption? 
  2. Does deep research have the potential to do that for knowledge work?
  3. Should the extremely handsome writer of this article prepare to move back to the farm?

Let’s get into it. 

What is technology anyway?

The short answer to whether other technologies have replaced labor like Altman claims is no. Unfortunately, the longer answer is probably yes. Allow me to explain. 

When we talk about “technology,” what we typically mean is some application of scientific knowledge that allows humans to perform new tasks or makes old ones easier. The more fundamental technologies, like electricity, underpin multiple technological applications. For example, in the hands of the right inventor, electricity can be applied to doing laundry, automating a large part of that task for busy parents. 

The key point here is that applications disrupt tasks, while fundamental technologies remake economies. 

Perhaps no example is more instructive than the U.K. cotton industry from the 1780s through the 1890s. In the beginning, the industry relied on a highly skilled labor force of handloom weavers, who were crucial to the process of turning cotton into textiles. As mechanized looms became popular during this period, these workers slowly disappeared. 

On one hand, this is a story of technology's might. The cotton industry was initially small—about 1 percent of British GDP in the 1780s—and rose to 7–8 percent by 1813. Imports of raw cotton matched that increase, jumping from 26 million pounds to 300 million pounds by 1831–1835. Despite this growth, real wages for cotton weavers peaked in the early 1800s and plummeted to a quarter of that level by 1830.

Charts from Acemoglu and Johnson, 2024.

Interestingly, the size of the workforce didn’t decrease much between 1820 and 1830 even as wages cratered. 

That’s because weaving was quickly becoming low-skill work—mechanized looms required far less training, and almost anyone could get a job. The workforce finally began to decline once wages hit rock bottom in the 1830s. But it wasn’t until an economic recession in the 1860s, driven by the American Civil War, that handloom weaving jobs fully disappeared. 

During this entire 100-year period, tremendous profits were made, but almost all of them went to factory owners. Mobility into other fields was limited because typical entry-level jobs were hard to find—in agriculture, for example, decreased land availability cut off a common path to employment for low-skill workers. Weavers were stuck. 

It is tempting to say automation is either good or bad for labor. The reality is more nuanced. Technology may decrease costs and increase economic output. But whether labor captures any of that benefit—or even escapes the economic destruction of its profession—depends on a range of political and socioeconomic factors that go beyond the technology itself. 

So no, technology rarely fully replaces labor overnight, but in the long run it forces labor to either go somewhere else or face reduced wages. 

The reality of a technology that can perform “economically valuable tasks” occurs when either:

  1. There is a new task to perform (like how computers helped us code).
  2. It automates an existing type of labor (dishwashers made it so we didn’t have to wash dishes). 
  3. It makes an existing form of labor more productive (hammers helped us construct homes faster).

With that framework, the natural next question is: What can deep research do, and in what context is it being applied? 

AI is coming

Deep research uses an AI agent to formulate a plan, search the web, and answer users’ queries. In Every’s internal experimentation, we’ve found the tool useful but not perfect. If you are a subject matter expert, you’ll notice frequent, subtle omissions or distortions of the facts. Some of that can be fixed with careful prompting, but my rough heuristic is that the tool is currently at the skill level of a talented research intern, or an entry-level analyst with less than six months’ experience. You have to know what kinds of errors to watch for, but it is a fast and useful way to get 80 percent of the way to deeply researched answers to your questions. 

For example, I worked with it to find the right example for the section above. Look at this screenshot and tell me if you can spot the error. 

Author’s screenshot.

It got the dates wrong. Mechanization started earlier and handlooms lasted beyond this date range. Additionally, “hand-weaving” isn’t a very accurate term—it would be better to go with “handloom weaving.” 

But it’s heroin for nerds. I have spent hours upon hours reading research reports and asking questions that I’ve always wondered but didn’t want to bother to dig into the academic research for. It is an addictive product. And it works across a wide array of knowledge areas. 

At its current price of $200 a month (which is more than a free intern costs, but less than a decently paid one), deep research is almost certainly worth it to those who regularly read research reports as part of their job. In five years, I imagine the majority of the system’s subtle errors will disappear. Anyone looking for a job as an entry-level analyst will have a significantly harder time finding employment, or at least a living wage. Many aspects of how the internet works today—where publishers are incentivized to post information for free because users will find it and view ads at the same time—could be in jeopardy as well. 

Ultimately, to agree with Altman you have to believe that intern-level research makes up a “single-digit percentage of all economically valuable tasks in the world.” I am not one of those people. Just for fun, I asked deep research, “What percentage of U.S. GDP and labor force could be replaced by deep research?” (for no particular reason, I kept it to the U.S.). Its response: “Fewer than 5 percent of full occupations (i.e., jobs) can be entirely automated today.” Make of that what you will.

Still, this product was just released, and if the rate of AI progress continues, it feels reasonable to assume that deep research will be at the skill level of this very writer in less than three years (even assuming some professional growth on my part). 

Here’s one more thing that makes me nervous about deep research (and AI generally): The vast majority of examples of technologies that have automated labor, from the automobile to silicon transistors, have required on-site hardware. A machine had to be transported to a factory in order to be utilized. With AI, each user adds relatively small marginal costs for providers and zero distribution costs. ChatGPT is just another tab on your internet browser. When deep research was announced in early February, 125,000 ChatGPT Pro subscribers instantly had access to it. We simply don’t have any economic models or historical precedents for this.

Career disruption

For most of my career, I have prided myself on my ability to rigorously think through problems, find independent sources of data, and create materials that convince people to do what I think is right. Now OpenAI has released a suite of products that augments (replaces?) each of those: ChatGPT to think things through, deep research to find sources of data, and Canvas to create reports based on said data. The things that made me unique now threaten to make me replaceable for only $200 a month. 

This is not a comfortable realization. 

It does not mean that the tasks of a writer go away! But it does mean that the bundle of tasks writers traditionally handle is under downward pricing pressure from these technologies. The things that used to make me unique in the marketplace are now a purchasable good. Every has always been built by multifaceted individuals who can code, write, create, bake, compose, design, etc. Our company structure is designed to support this bundling of capabilities. Still, as these models improve month by month, my contract as “lead writer” will look more and more overpriced if my responsibilities remain static.

You could make the argument that my innate sense of taste, my sense of what makes writing good, becomes more valuable in a world of deep research (note to self for my next compensation discussion). When deep research returned the table above, my spidey sense was immediately tingling that something was off. It just took me a few minutes of looking to find it. Since writers who are starting today won’t have the years of practice in financial models and Google Docs that I do, they won’t be able to easily build that intuition. I don’t know if I believe that argument, but it is the only form of self-comfort I can offer myself—and, by extension, you. 

I don’t know whether Altman is right. His estimate is a little high for my taste. But the rate of improvement in AI reasoning models is frighteningly fast. We are very close to needing relatively few people to do jobs that require deep research. 


Evan Armstrong is the lead writer for Every, where he writes the Napkin Math column. You can follow him on X at @itsurboyevan and on LinkedIn, and Every on X at @every and on LinkedIn.



Comments

@arushikhosla 7 days ago

Excellent piece, she wrote about an analysis of her own demise.
The closing sentence hit the mark -- most of my smartest friends all got started in what I now think of as a research factory. Sitting in the data and wading through it all manually WAS laborious but critical. Yeah, that's a comfort for the longevity of the careers of those of us in it, but kind of depressing for any new grads. Don't have particularly valuable advice for them atm.

Brad Z. 3 days ago

maybe I'm a naive optimist but AI’s impact on knowledge work always seems stuck in an old paradigm—one where AI is just a force for wage compression/worker displacement...that’s a scarcity mindset. to me, what’s actually happening is the rise of human-automatronic jobs (*in silico vibes), roles that weren’t even conceivable until now. i wish we were less into who gets replaced and more into what gets created. Idk. the human-AI interface isn’t a zero-sum fight over tasks, rather than(maybe) a new economic terrain where human capital is digitized, expanded, and integrated into some crazy-ass cybernetic workflows. LFG! Also, let me say it: work isn’t disappearing; it’s evolving into something post-scarcity, where intelligence is abundant. Perhaps it isn't even about protecting wages more than it’s about unlocking whole new strata of economic agency through synthetic labor markets.. human-aligned automation, etc. For transparency, how come we're not minting jobs instead of always 'losing' them. Thanks for letting me riff. Please write more about this topic. Accelerate.
