One thing you can count on seeing on LinkedIn these days: freelance writers sharing stories about losing clients because of supposed AI usage. The stories follow a familiar pattern. A writer reports they have been dropped by a client—not because their work is bad or inaccurate, but because an AI detection tool flagged their human-written content as machine-generated.
These aren’t isolated incidents. From the U.S. to the UK to South Africa to Pakistan, writers have been caught in a bizarre paradox: accused of using AI when they haven’t, while agencies, freelancer platforms, and individual clients alike double down on “zero-AI” policies enforced by detection tools that are, at best, inconsistent. One writer went so far as to offer to screen-record himself typing an article to prove it was “real.” A copywriter in Ohio sent his client time-stamped drafts to prove he’d written them himself—but the client walked anyway.
Why do these businesses care so much? I suspect they aren’t really worried about AI—they're clinging to an old belief that if work isn't visibly difficult to produce, it must be less valuable. When we dig beneath the surface of "no-AI" policies and detection tools, we find an age-old assumption that worth must be measured in struggle. The same assumption shows up again and again, from the “hustle culture” and “rise and grind” ethos that defined the 2010s to recent return-to-office mandates that prioritize presence over performance. In a culture that values butts in seats and availability on Slack, it becomes easy to mistake friction for effort and effort for worth.
Oddly enough, the very thing we’re resisting—the ease of AI—might be what sets us free. AI isn’t the first tool to challenge how we think about work, but it may be the most direct. By shifting the locus of effort, AI forces us to confront our dysfunctional relationship with work. It holds up a mirror to our culture’s deeply rooted belief that struggle equals value—and in that reflection lies a rare opportunity: to reimagine work in terms of outcomes, not optics; human flourishing, not performance theater.
Make email your superpower
Not all emails are created equal—so why does our inbox treat them all the same? Cora is the most human way to email, turning your inbox into a story so you can focus on what matters and get stuff done instead of managing your inbox. Cora drafts responses to emails you need to respond to and briefs the rest.
Performing ‘productivity’
Modern resistance to AI is just the latest chapter in a long history of moral anxiety about ease. From colonial Massachusetts—where idleness was literally illegal—to the rise of “scientific management” in the early 1900s, visible labor has been equated with virtue.
The Puritans made work a spiritual mandate. The Luddites died defending the dignity of skilled labor. And when Frederick Taylor brought stopwatch logic to factory floors in 1911, union leaders warned it would turn “proud artisans into mindless machines.” From era to era, new tools have triggered the same fear: not just of obsolescence, but of losing the moral weight we attach to effort.
A century after Taylor's stopwatch-wielding efficiency experts, we're still obsessed with measuring the performance of work rather than its results. Only now, the surveillance is digital, constant, and often invisible.
Anne Helen Petersen once memorably described this as "LARPing your job"—performing a theatrical version of productivity. Workers engage in elaborate displays of "being at work": staying visible on Slack, responding to emails at all hours, and maintaining a digital presence that signals industriousness. The tools have evolved—from software that monitors keyboard activity to AI that analyzes facial expressions in video calls—but the underlying philosophy remains pure Taylorism.
The irony is that these measurements often have little correlation with value creation. Knowledge work rarely follows linear patterns. Our most valuable contributions often come from reflection, seemingly "unproductive" conversations, exploration of dead ends, and invisible mental processing.
Consider a writer working on a complex piece. Their most productive day might involve two hours staring out a window thinking, a long walk where they mentally structure their argument, and 30 minutes of rapid typing. But depending on who’s doing the measuring—a client, a manager, a budget-conscious private equity buyer—only those final 30 minutes might count as “real” work.
This narrow view of productivity—favoring what can be seen, tracked, or timed—helps explain our conflicted relationship with AI. Yes, impact matters, and it would be overly simplistic to claim organizations don’t reward outcomes. But in many contexts, especially where results are hard to measure, visible effort still carries disproportionate weight. When so much of our working life is spent proving we’re working, it’s no wonder we’re uneasy with tools that make effort disappear.
AI: The ultimate easy button
Become a paid subscriber to Every to unlock the rest of this piece and learn about:
- How AI redistributes labor rather than eliminating it
- 3 ways to break our dysfunctional relationship with work
Comments
This makes me think of the old saying: do not mistake activity for achievement! In this case, do not mistake a lack of activity for a lack of achievement.
Awesome article!
I am very much in agreement with most of the perspective here. But I think the title, and to some extent the conceit of the article, fail to recognize another key reason for people's hesitance around accepting the use of AI in creative work-for-hire: the murky waters of copyright regarding the training data. That said, the core point about the opportunity to redefine "productivity" and our role in work is compelling, though admittedly I have little faith that our society and current economic model will allow it.
I have had so many conversations about this recently as AI makes its way into the lives of more and more people. Mostly I spoke to 'deciders' who received something they thought was AI. They were pondering: is that good or bad? I love that this article dives deep into how we can work through this better, and the last line is such a banger: "Are you concerned about the quality, or just uncomfortable with the ease?"
PS: My view was...
800+: that's the number of days I've obsessed over AI, using it daily, building my capabilities, and deepening my understanding of AI's capabilities.
157+: that's the IQ of the smartest models I am using to augment my intelligence.
My prediction: within a year, any truly critical work where AI wasn't used in the process will be seen as a liability.