If you follow the latest social science research, the rationalist blogosphere, or Y Combinator’s Hacker News, you’ve probably encountered the calls to “intellectual humility.” Its proponents rightly caution us against the false and easy certainty characteristic of so many on all sides of the culture wars. But their implied solution is to renounce certainty as a goal—which is at best a cop-out, and at worst a capitulation to the deadliest dogmas around today.
To the extent that we want to live well and build things worth building, we need to be right, a lot, and we need to know that we are—so that we have the courage and discernment to act on the knowledge we do possess. This is at least as important as “knowing what we don’t know,” and arguably more so, if our goal is not merely to avoid missteps but to boldly and actively live well.
The best thinkers—and thus the best builders—are not intellectually humble but intellectually ambitious. They respect the work that achieving real knowledge and earned certainty requires, and they embrace that work as a prerequisite for a life fully and consciously lived.
Yes, this means that they question and rethink their own assumptions, readily admit and learn from mistakes, moderate their confidence in a given hypothesis based on the strength of the evidence, and actively keep their own biases in check. But it also means they push for deep understanding (e.g., via identification of first principles) on issues that really matter, audaciously assert their reasoned conclusions in the face of doubt and disagreement, and accrue wisdom and self-trust so they can be more right more often, rather than settling for “less wrong.”
So yes, if you truly know little (whether in general or in a given domain), hurry up and admit it; not so you can brandish your humility, but so you can get on with the ambitious quest to learn more.
And if you know more than you’re letting on, whether to yourself or others: Dig deep for the courage to speak and act on your convictions, without hedging or apology. Know that there are those of us who recognize and respect hard-won confidence—not hand-waving skepticism or hollow posturing—as the real mark of virtue; who see the difference between you and the blustering blowhards you’re afraid of being equated with; who are grateful for the chance to learn and be inspired by you.
The rest of this post is a much more fleshed-out and as-yet-incomplete exploration of these points, some of which I’ll likely be returning to in future posts (so leave your pushback and questions in the comments, please!).
What people get right about “intellectual humility”
As an advocate of cognitive integrity and the courageous, formidable work involved in pursuing real knowledge, I’ve often found myself allied with proponents of “intellectual humility.” Philosophers have touted this practice of “understanding and acceptance of one’s intellectual limitations” as an intellectual virtue that safeguards against the most common epistemic ills—like confirmation bias, dogmatic overconfidence, and insensitivity to new knowledge and evidence—that derail good thinking and decision-making.
Instead of holding whatever we happen to believe as the objective and unassailable truth and dismissing anyone who disagrees with us, the advocates of intellectual humility counsel us to hold even our strongest convictions “loosely,” always allowing for the possibility that we might be in error.
Psychologists and other social scientists have taken up this mantle in recent decades, conducting numerous studies showing that those who score higher on measures of intellectual humility (like the Comprehensive Intellectual Humility Scale, which you can test yourself on here) also demonstrate better judgment and decision-making in various contexts.
The message has also been filtering down to many of the better business writers and public commentators from across political persuasions, who’ve rallied around the idea of intellectual humility as a much-needed corrective for the biased, divisive, self-confirmatory thinking that’s run amok in our culture.
Beyond citing the empirical findings noted above, they also point to the many inspiring thinkers and leaders—like Socrates, Ben Franklin, Charles Darwin, Marie Curie, Ruth Bader Ginsburg, Jeff Bezos, and Tim Cook—who famously recognized their own ignorance, owned their mistakes, and were willing to rethink even their most strongly held convictions in light of new evidence.
One of my favorite modern champions of intellectual humility, Megan Phelps-Roper, credits it with guiding her extraordinary journey from very public spokesperson for one of America’s most notoriously bigoted religious groups to one of its most outspoken critics after getting publicly reasoned out of her views on Twitter. If you haven’t listened to her (utterly captivating) podcast The Witch Trials of J.K. Rowling, I highly recommend it.
Whatever your particular feelings about the show’s bitterly divisive subject matter, you’d be hard-pressed not to admire the thoroughness, measured sincerity, infectious curiosity, and passionate humanism with which Phelps-Roper invites us to contend with each side of the story. What a refreshing contrast to the pat conclusions and righteously regurgitated dogmas that have come to dominate every corner of our cultural discourse, “woke” and “anti-woke” alike.
That said: I don’t think what’s admirable about Phelps-Roper or the great thinkers and leaders listed above has anything to do with humility, intellectual or otherwise. Merely “recognizing your limitations” is an incredibly low bar to set on your intellectual aspirations, and if taken seriously, it can become a shackle rather than an aid to the kind of thinking you need to do in order to build anything great.
What is impressive about the pantheon of individuals listed above is not their intellectual humility, but their intellectual ambitiousness. True, they are not content merely to rest on the laurels of their received wisdom and unchecked assumptions; nor are they content merely to rest in their self-confessed ignorance. What distinguishes them is not a unique insight into “how little they know,” but rather an appreciation of how hard it is to achieve real knowledge—and, crucially, a deep and abiding commitment to doing that work.
The ambitious work of building and applying knowledge
In my experience as a therapist and coach, most people are not plagued by “overconfidence” but by neurotic self-doubt or neurotic posturing at confidence (which, when looked at closely, is just another form of neurotic self-doubt). The real problem is not that people overestimate how much they know, but that they underestimate and undervalue the work involved in coming to know. Cultivating genuine confidence and certitude in one’s judgment is far more difficult, and requires a great deal more virtue, than merely recognizing one’s limitations.
Taking up the example of Jeff Bezos: One of his core leadership principles for Amazon is actually that leaders “are right, a lot.” And it is in service of this principle that leaders “seek diverse perspectives and work to disconfirm their beliefs.” This is one of the practices by which we can improve our judgment so as to be more right more of the time. The goal is not merely to accept how poorly we think and how little we know, but to become better thinkers and to vastly expand what we know. Again, it is not to be intellectually humble but to be intellectually ambitious.
Part of what that means is that, yes, we need to put a high premium on actively checking and revising our views in light of new evidence, acknowledging and learning from mistakes, and tracking what we don’t know as much as what we do. But it also means:
We form our own considered judgments about issues that matter to our lives.
As human beings we have no choice about needing to form beliefs and act on them. What sort of schooling will be best for my children? Will my company be more competitive if we integrate AI into our tech product? Do I have a better chance of survival if I get my cancer treated with surgery or with alternative medicine? The only choice we have is whether we take responsibility for how we form these judgments, or whether we default on that responsibility.
One way to default is by pretending greater certainty and confidence in our wishful thinking, emotion-driven conclusions, or uncritically accepted dogmas than we have any real right to; another is to defer to conventional wisdom or expert opinion on the grounds of “how little we know.” The intellectual humility framework imbues the latter with an air of righteousness (so long as we choose experts who also demonstrate appropriate amounts of humility, of course), making it an even more enticing substitute for the work of making up our minds.
By contrast, intellectual ambitiousness would push us to learn and use every epistemic tool at our disposal to form the best judgments we can about the issues that matter. For the most important issues, this may mean doing our own research and reconstructing our own knowledge and solutions from first principles—which are powerful largely for the certainty and genuine confidence they confer. We can’t plausibly do this kind of deep, fresh thinking about every single judgment and decision in our lives, but we can strive for that level of understanding and discernment in at least the central domain(s) of our lives.
SafeGraph CEO Auren Hoffman’s advice on this point really stuck with me: “The first rule of thinking: At any given point in time, you should be working on at least one thing via first-principles thinking. It might take you the rest of your life. But having at least one thing to go deep on is a life well lived.”
For everything else, we will inevitably rely on experts and other proxies to some extent—but this, too, we can be more or less intellectually ambitious about. For instance, we can pursue broad-strokes understanding of the methodological frameworks commonly used for establishing and asserting claims in different fields (e.g., qualitative and quantitative approaches in the behavioral and medical sciences, frequentist and Bayesian statistical approaches to probability estimation). We can learn enough to form our own considered judgment about their merits and appropriateness with respect to different sorts of claims—versus merely taking the most widely cited methods at face value.
This kind of familiarity with the major schools of thought and methodological approaches within a given field also positions us to “know what we don’t know,” which is itself a valuable form of knowledge that takes work to achieve. For instance, when I teach cognitive behavioral therapy (CBT) to the graduate students in my clinical psychology courses, I assign them several readings and written reflections on the therapeutic approaches we don’t otherwise cover in the course. Only then do they gain some sense of the kinds of problems and questions that their emerging expertise in CBT will and will not equip them to address, and what other kinds of experts might be appropriate to consult on the latter.
Even without such adjacent expertise, we can evaluate the advice of, say, competing medical experts based on our honest judgment of their credibility. (E.g., did they seem genuinely interested in explaining their reasoning, or more interested in impressing us with obscure medical jargon?) That way, we’re not simply hunting for excuses to follow whichever advice happens to fit our preexisting prejudices.
We can actively update and evolve our own approach to making these judgments as we accumulate relevant wisdom and experience (e.g., by noticing when certain expert characteristics turn out to matter more or less than we thought they would). That way, we don’t passively persist in whatever default approach we started with.
We shun the comfortable facade of indecision and uncertainty.
Even once we’ve done the work to form and validate our judgments in reality, this doesn’t mean we’ll automatically bring them to bear on our further thinking and decision-making. “Remembering what we know” takes ongoing, conscious effort and will, often in the face of both internal and external resistance. Sometimes this can mean acting precisely in ways that get branded as “arrogant” or “overconfident” by proponents of intellectual humility.
Think of every innovator who has ever been mocked, rejected, or outright persecuted for challenging the status quo before their innovation eventually became the new status quo. Frederick Douglass was considered “stubborn, arrogant, and overly sensitive to slights” by his critics. Anne Sullivan was almost fired for her controversial method of establishing boundaries with her deaf and blind pupil Helen Keller, yet she stubbornly persisted.
For that matter, Socrates literally dared the city to put him to death for his convictions, asserting that he was God’s gift to Athens and that killing him would injure them more than it would injure him—for “I believe it is not God’s will that a better man be injured by a worse.” (In the words of my philosopher friend Greg Salmieri, Socrates’ Apology is “among the most self-righteous things ever written.”) And Phelps-Roper willingly cut ties with most everyone she’d ever loved when she became convinced that her church’s doctrines were wrong.
We pursue worthwhile possibilities, even while tracking the risks and unknowns.
Many of the issues we need to take a stand on in our daily lives take the form of probability judgments like: “There’s a small but non-trivial chance we can dramatically improve this product if we incorporate AI,” or “My odds of recovery are 10 times higher if I opt for X treatment.”
The need for courage and conviction equally applies to such judgments, as does the need to stay honest about the risks and uncertainties involved. In fact, a key way we earn confidence in our own judgment over time is by establishing a track record of modulating our confidence about any particular judgment according to the merits of the evidence.
Even when dealing with prolonged and intractable uncertainty—as is the case when trying to grow a tech startup in a volatile economy, for example—we can aim for meta-level certainty about the range of realistic best-to-worst-case scenarios and where they all lie along the probability distribution.
The literature on “superforecasters” already offers plenty of good epistemic advice for making more accurate predictions, but its emphasis on holding predictions cautiously and tentatively misses a crucial psychological dimension of truly great thought leaders: the will required to stick with bets worth making.
Think of Darwin’s dogged accumulation of evidence toward his evolutionary theory over many years, even as he took special care to “make a memorandum… without fail and at once” anytime “a published fact, a new observation or thought came across me, which was opposed to my general results.” Or, for that matter, think of the daring bets we need to be able to make in order to throw ourselves wholeheartedly into a new relationship with someone we’re starting to love, thus giving the relationship its best chance—even while knowing full well it might not work out.
The reward of all this work is the kind of earned confidence and certitude that allows us to take the big bets and build the big projects worth building—arguably the opposite of humility.
But what about the research?
Researchers of intellectual humility report a common pattern in their studies. Participants (most of whom are 18- or 19-year-old college students) are less likely to think and behave like pompous blowhards if they rate themselves lower on statements like: “My ideas are usually better than other people’s,” or “When I am really confident in a belief, there is very little chance that belief is wrong.”
Some of this owes to the selective use of outcome measures that fit the researchers’ epistemic worldview. (For instance, researchers will rarely measure participants’ tenacity in asserting an unpopular but deeply considered conviction.) The even bigger reason, I suspect, is that most participants haven’t even reached the minimal bar of realizing their ideas are mostly fluff, much less proceeded to the work of forging a credible knowledge base.
If, on the other hand, a Charles Darwin or Jeff Bezos in his 50s were to complete this measure, what do you suppose would be the objectively right response for them to give? I would think it false modesty for either of them to respond with anything less than “strongly agree” to either of the items above. And I do not reserve this judgment for world-famous titans of science and industry; I’d say the same of numerous people in my own life whose judgment I trust, and whose epistemic hygiene I strive to emulate.
OK, you might respond, but aren’t we just arguing semantics here? If those with “intellectual humility” are in fact a bit better off in their judgment, on average, than those without it, then isn’t it a fine practice to cultivate it alongside other useful practices?
No, because the virtues we adopt and define ourselves by are never merely semantic; they have motivational force, for better or for worse. And what a virtue of “intellectual humility” ultimately motivates is not the ambitious quest for knowledge, with all the active curiosity and openness to challenging and changing one’s views that this implies; rather, by putting doubt and uncertainty on a pedestal, it disincentivizes us from pursuing that quest too ambitiously.
Some researchers of intellectual humility have recognized this worry, and have attempted to address it by reframing intellectual humility as a “middle ground between intellectual arrogance and intellectual servility.” On this revised conception, a person is only intellectually humble if they not only “own their intellectual limitations” but are also “motivated by a love of such goods as truth, knowledge, and understanding.” They also can’t be too neurotically preoccupied with their intellectual weaknesses.
The measure resulting from this updated conceptualization includes a limitations-owning subscale with items like: “I am quick to acknowledge my intellectual limitations.” But it also includes a love of learning subscale with items like: “I care about truth,” and “When I don't understand something, I try hard to figure it out.” It has an appropriate discomfort subscale with items like: “When I know that I have an intellectual weakness in one area, I tend to doubt my intellectual abilities in other areas as well.” (That one is reverse-coded, with higher agreement resulting in a lower score.)
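To make the reverse-coding concrete, here is a minimal sketch of how such an item is typically flipped before scores are summed. This is purely illustrative: the 1–5 response range, the item labels, and the simple sum are my own assumptions for the example, not the scale’s published scoring key.

```python
# Illustrative sketch of reverse-coding a Likert-type item (assumed 1-5 range).
# Item names and the simple summing are hypothetical, not the scale's actual key.

def reverse_code(response: int, scale_min: int = 1, scale_max: int = 5) -> int:
    """Flip a response so that higher agreement yields a lower score."""
    return scale_min + scale_max - response

raw_responses = {
    "quick_to_acknowledge_limitations": 4,  # limitations-owning item
    "care_about_truth": 5,                  # love-of-learning item
    "doubt_spreads_across_areas": 5,        # appropriate-discomfort item (reverse-coded)
}

scored = dict(raw_responses)
scored["doubt_spreads_across_areas"] = reverse_code(raw_responses["doubt_spreads_across_areas"])

total = sum(scored.values())  # higher sum = more of the measured trait
print(scored, total)          # the "strongly agree" (5) on the doubt item becomes a 1
```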
Higher sum scores on this measure showed positive correlations with measures of “authentic pride” and “assertiveness,” unlike the traditional “intellectual humility” measures in the study. The researchers took this as a positive sign that they have successfully reconceptualized “intellectual humility” to correspond with the active pursuit of truth, rather than with the “subservien[ce] to their intellectual limitations” that might lead people to “simply bemoan that they will never be as smart as Einstein and call it quits.”
I agree that this is a much better measure of something deserving the status of an “intellectual virtue.” The only issue: It is no longer a measure of what anyone means or has ever meant by the term “humility” (a term derived from the Latin “humilis,” meaning “low”). In fact it is much closer to being a measure of “intellectual ambitiousness,” i.e., of ability and willingness to do the work required for genuine knowledge gain.
Quiet confidence is not “humility”
Another confound of the “intellectual humility” framework is that the most epistemically secure (i.e., self-trusting) individuals tend to be the least concerned about proving how much they know or how right they are. Their fundamental goal is not to reassure themselves of their basic epistemic efficacy (which is already a settled matter), but rather to solve the next problem, answer the next question, master the next challenge on the road to building what they want to build.
As a result, they are comfortable asking “stupid questions” and constantly pushing themselves to new frontiers on which they are once again beginners; in other words, they are comfortable being ambitious. Thus they often appear “humbler,” on the surface, than the insecure blowhards who lack such genuine confidence. They are “low-ego,” in the conventional sense of the term (where “ego” is almost synonymous with “deeply insecure ego”).
No wonder Jim Collins, in characterizing the most effective business leaders in his Good to Great research study, described a “paradoxical blend” of personal humility and professional will. These were the “Level 5” leaders, as he called them: the CEOs who quietly and relentlessly steered their companies to greatness while their flashier, more boastful counterparts steered theirs headlong into mediocrity.
The Level 5 leaders tended to shun celebrity, act with “quiet determination,” rely on “inspired standards, not inspiring charisma, to motivate,” and look “in the mirror, not out the window, to apportion responsibility for poor results, never blaming other people, external factors, or bad luck.” They were, in short, the Abraham Lincolns of the business world. One need not see any “paradox” in this finding at all, if one recognizes that this is what deep, abiding self-trust looks like—and that it’s self-trust, not self-effacement, that grounds and motivates unstoppable ambition.
Don’t sell yourself short
If you want to build anything great, you’ll need to know and understand a lot about it first. This means that, if you want to be ambitious about what you build, you’ll need to be just as ambitious about what, and how, you think.
First and foremost, you’ll want to know what is actually true, not cling to whatever convictions you wish were true. And on any issue important to your life, you’ll want to actually know it, rather than settling for mere guesses or tentative hypotheses.
In your career, this means having deep, informed convictions about the kind of work you can do and the value you want to bring into the world, rather than relying on untested assumptions and status quo biases to guide you. In love and friendship, this means wanting your relationships to be grounded in accumulated experience and hard-won knowledge of the other person’s character, rather than unchecked hopes or hearsay.
Of course, such convictions may take years to build and validate, and even then they may not stand the test of time. The landscape is ever-changing, and the work of knowledge never stops. So long as there’s more life to be lived, there’s more knowledge to be gained.
Then again, the more we know, the better we get at knowing—and the richer and sweeter a life we can build, really build, henceforth.
Dr. Gena Gorlin is a Clinical Associate Professor of Psychology at the University of Texas at Austin. She is a licensed psychologist specializing in the needs of ambitious people looking to build.