Michael P. Lynch, Know-It-All Society
The resolutely Catholic Montaigne hesitated to go that far. It wasn’t faith per se that was the problem, he thought, but our penchant for intellectual arrogance—arrogance about our beliefs or our worldview. This is what he took from the skeptics, and he inscribed their sayings into the beams of his library, including this remark from Pliny: “There is nothing certain except that nothing is certain, and nothing more wretched than Man nor more arrogant.”4 “Arrogant” was Montaigne’s watchword, and his warning: our desire for certainty, for thinking that we’ve figured everything out, that our reasons are the best reasons, is what gets us into trouble, both in politics and in life.
This book is written in light of Montaigne’s warnings about the dangers of intellectual arrogance. Like Montaigne, we are living in an age of severe disruption. For some Americans, much of that disruption concerns a changing economic, cultural, and demographic landscape—one in which white Americans will soon no longer be in the majority. But part of the disruption is technological. Information technology has changed how we live, how we learn about the world, and how we interact with one another. It has made it easier to access information than ever before, yet we increasingly seem to disagree over that information, over what we know.
We are increasingly polarized in our attitudes toward our political opponents. And we disagree now not only over values, and not only over facts, but over which sources for those facts are even reliable. Our public discourse has been trampled as a result. Traditional boundaries and norms of civility have been set aside, and almost nothing seems to surprise any of us anymore. But how do we react to this uncertainty? Like the religious zealots of the sixteenth century, we generally don’t doubt; we dig in. We tell ourselves that we know what is right and what is wrong, reassuring ourselves with every tweet about our own superiority.
By the time Montaigne retreated to his tower, the dogmatism of his age had already done its bloody work. We are not there yet, but dogmatic politics based on a sense of superiority, on a perceived grasp of the real truth, is on the rise again. We see it in the marches of neo-Nazis on college campuses, in the demonstrations in Europe, in the desire to build a wall along our nation’s borders, in the dismissal of compromise by both the Right and the Left. It is hard to shake the feeling that our path is leading us into a dark wood—a path lit by the shining intensity of dogma, self-certainty, and intellectual arrogance.
Montaigne’s own reaction to the widespread intellectual arrogance of his day vividly illustrates the philosophical problem we’ll grapple with throughout this book. It is a problem that preceded him and will remain after us. It is a single problem with two faces—one personal and one political. The personal face concerns what sort of attitude we should adopt toward our own convictions—about what we think is fundamentally right and what is wrong. Put in the form of a question to oneself, it amounts to this: How can I be open to the possibility of being wrong while still maintaining strong conviction? The one attitude seems to psychologically rule out the other. This is a problem for any reflective person, and as Montaigne would have been the first to attest, it is no mere abstract puzzle but an existential issue, for it concerns how to live and how to think. It concerns how to retain conviction without intellectual arrogance.
Not believing anything is not really an option, as Montaigne knew, but retreating to a tower is not really an option for most of us either, and neither is it all that helpful from the standpoint of social change. And that brings us to the political face of the problem. It was doubtless felt in Montaigne’s time, but for democracies it takes a particularly acute form. Democracies need their citizens to have convictions, for an apathetic electorate is no electorate at all. Yet democracies also need their citizens to listen to one another’s convictions, to engage in political give-and-take. The problem is that those with conviction regard with suspicion those listening to the other side; they aren’t true believers. And those who listen often do fall into inaction; they flee to the tower, ivory or otherwise.
The problem that intellectual arrogance causes for politics is not unique to us, but as in Montaigne’s time, it is particularly pressing. In order to grapple with it, we must first understand what it is.
Like Ears of Corn
The seeds of arrogance are planted early in life. The best we can hope for, Montaigne thought, is to be like ears of corn—which stand proud and erect when their heads are young and empty, but droop with humility when their heads are old and full.5 But in reality, he argued, that never really happens. For Montaigne and the ancient skeptics, our tendency to overestimate our knowledge isn’t just a phase; it is part of human nature.
In recent decades, cognitive science has been catching up to Montaigne. The psychologist David Dunning and his colleagues, for example, asked subjects to first perform tasks like checking sentences for bad grammar, and then rate themselves on how well they performed that task.6 Those who did poorly tended to think they performed better than they did, while those who did better tended to think they performed worse. These results have been replicated numerous times, with students, in various job settings, even with doctors. One lesson is that the competent not only have skill but they know what it means to have that skill. Those who know tend to be harder on themselves—because they know how easy it is to make a mistake. But the incompetent don’t know what they don’t know, so they are more prone to overconfidence.
Moreover, most of us think we know more about how the world around us works than we really do. One of the most famous results in cognitive science is the “illusion of explanatory depth”: When asked first to rate how well they understand how a zipper works and then to explain it, almost everyone thinks they know, but they really don’t.7 And that finding, too, is highly replicated: we think we understand all sorts of things better than we do.8 Given that none of us are experts about everything, it is hard to avoid the conclusion that we all remain a bit of a teenager at heart: we stand tall like young corn only because our heads are empty.
Our tendency to overrate our knowledge is related to our more general tendency to think that our judgments are more rational and deliberate than they really are. As Montaigne suggested half a millennium ago, our “rational souls accept notions and opinions” produced during “waking sleep”—that is, produced without the aid of conscious reflection.9
Seventeenth- and eighteenth-century philosophers following Montaigne developed this insight. They divided the mind into different systems, or “faculties”: those that operated automatically, or passively; and those that were more conscious, active, and deliberate.10 “Intuition” is our capacity to make unreflective, automatic judgments; “reflection” is our capacity to engage in complex problem solving, plan for the long term, and consciously weigh reasons. While reflection is what enables us to engage in more sophisticated cognitive activities, intuition is indispensable right from the get-go. We could not get on with our lives if our minds did not quickly process the vast majority of information that regularly confronts us—if the mind did not take intuitive shortcuts.11
One example is our tendency to automatically and unconsciously compare and contrast new things and experiences with those already familiar to us.12 This is why we often find ourselves liking situations, clothing, food, and so on that are similar to what we already know. But it is also why we quickly try to make things cohere with what we’ve come to expect. Ask someone, for example, how many animals Moses took with him on his ark, and they may not spot the error—simply because they are unconsciously associating Noah with arks. This kind of process is one element of what David Hume—an eighteenth-century Scottish philosopher deeply influenced by Montaigne—described as the “association of ideas,” and what he thought was one of the mechanisms most used by the mind. As Hume suggested, such associative thinking spares us the trouble of consciously and individually evaluating each new object we encounter, and it means that we can make judgments faster and with “more method and regularity.”13 But it also means that our opinions, and even how we perceive the world, are shaped by what we already believe and feel to be true—whether or not it really is true.14 We stain the world with our sentiments, as Hume famously put it.
This is the heart of the phenomena now called “motivated reasoning” and “implicit bias.” To some degree we see what we expect to see and believe what fits with what we already think we know. And the more familiar we are with certain categories—including what to associate with, and expect from, members of those categories—the more automatic and involuntary our judgments concerning those categories become.15
All this means we can also get things very wrong, and in morally troubling ways, for this tendency to intuitively associate new with old is not limited to inanimate objects but includes people as well.16 And it is shot through with social norms and prejudice. Which, in turn, means that racist, sexist, or otherwise discriminatory associations that have permeated the larger social context can distort our perceptions of one another. The pervasive stereotype that Muslims are dangerous can distort the ways in which we come to think of, and treat, Muslim people; the pervasive stereotype that women are less self-assured or intelligent than men can distort the ways in which we come to think of, and treat, women; the pervasive stereotype that black people are dangerous distorts the ways in which we think of, and treat, black people; and so on.17 And, the more pervasive the stereotypes are, the more likely they are to influence how people conceive of certain groups, and the more inclined those people will be to assume they “know” what individuals in those groups really are like.18
To make matters even worse, we are also really bad at telling the difference between beliefs formed on the basis of reflection and those formed on the basis of implicit bias. In staining the world with our sentiments, we are prone to bias blind spots: we fail to recognize when our own judgments are informed by biases. Other folks are biased, sure. But our beliefs (we tell ourselves) are pure as the driven snow, based on facts and sound reasoning.19 Combine that faulty perception with the finding, stated earlier, that we overestimate how good we are at tasks that we are really bad at—and you get the frightening possibility that the worse we are at detecting our biases, the better we think we are.
Montaigne and Hume would not be surprised.
A Very Social Attitude
It is difficult not to feel for Icarus. He tried to fly like the gods, but he fell to earth, burning out rather than fading away. In so doing, he paid the price of what the Greeks called hubris. Judging by their myths, they were a bit obsessed with it. In their stories, hubris never escapes punishment. Niobe, who boasted of the superiority of her children, was turned to stone; Prometheus, who stole fire from the gods, had his liver eaten for eternity; Icarus, who flew too close to the sun, dropped into the sea.
But while you can read these myths as cautionary tales, you can also see in them a recognition of a very human truth: that overconfidence is a central feature of life. It is risky, as the Greeks emphasized, but, as their fables also suggest, we tend to think that nothing great is achieved without it. “Fortune commonly favors the bold and enterprising,” David Hume noted, “and nothing inspires us with more boldness than a good opinion of ourselves.”20 Self-esteem, said Hume, is an essential part of any successful character and habit of mind.
In our own time, we’ve come to absorb this truism as liberating. Starting in the 1970s, the self-esteem movement in education was a reaction to an earlier culture that hammered at students’ self-conception and derided those who were in any way different, creative, or outside the white, heterosexual mainstream. The new approach emphasized praise and achievement rather than criticism. More recently, there has been a similar fascination with the related concept of “grit.” Grit is a drive to succeed. It is a kind of determination and, as such, is often marked by self-control. The person with grit is the person who sticks it out; it’s the kid who doesn’t eat the marshmallow. Grit is what pushes us through, keeps us at it when the odds are against us. And having grit, we tend to assume, means having self-esteem.
Importantly, we also apply Hume’s truism to our beliefs. We admire the woman whose self-esteem enables her to stand and testify about her experience of sexual assault even as those arrayed against her try to undermine her credibility. And we admire Galileo, who confidently believed that evidence showed that the Earth traveled around the sun, even though this view was dismissed by both the church and other scientists of his day. We admire, in short, those who have the strength of mind to go with the evidence even when few others find it plausible, much less believe it.
And we should. Self-confidence, either about our abilities or our beliefs, is a very good thing; we hope to instill it in our children, and we wish we had more of it ourselves. We demand it from those who fly our planes, operate on our bodies, and lead our armies in battle, and we reward it with esteem, with medals, with power. Confidence is sexy; no one follows the fainthearted or the meek. And while we realize that confidence can become overconfidence, and boldness hubris, we are often willing to bet on those confident enough to take risks, as long as they are more often right than wrong. Confidence, in short, is socially rewarding.
The undeniable social and psychological benefits of confidence help to explain why human beings are prone to overconfidence as well as intellectual arrogance. The explanation lies not with confidence per se but with our socially reinforced desire for confidence. We long for the esteem of others, and having self-esteem is an excellent way to get it. That’s true about our beliefs too. We want other people to agree with us, to flatter our opinions with praise, and being confident in your opinions can often help achieve that end.
A similar point is true of our tendency to inflate our knowledge. We saw earlier that this tendency is partly the result of our hardwired propensity for quick, intuitive associative thinking and our willingness to overlook our own biases. But it is not our ignorance itself that matters so much as our fear of ignorance. While we don’t know the extent of what we don’t know, and we often overestimate how much we do know, we are all too familiar with what it is like to get things wrong or not know the answer. Nobody wants that. And for good reason: mistakes can get you hurt. They aren’t socially rewarded either. You don’t win awards for making mistakes, and you don’t get promoted for admitting how much you don’t know. So we try to signal to others—and importantly, to ourselves—that we are reliable and knowledgeable. That’s what makes the human condition so deliciously ironic. We so strongly hate not knowing that we try to convince ourselves—and everyone else—that we know more than we do.
What all this suggests is that seeds of intellectual arrogance are planted in the soil of our social interactions. One seed is not ignorance itself but our socially reinforced fear of it; the other is not confidence itself but our socially reinforced desire to have it. Combine this fear and desire and you get a social recipe for encouraging people to resist admitting error and act as if they are always right, even when they are wrong.
Intellectual arrogance is therefore a very social attitude—both in its origins, as we’ve just seen, and also simply in itself. The know-it-all’s defining characteristic is, in fact, explicitly social: he thinks he has nothing to learn from anyone else—that his worldview can’t improve from hearing what people with a different perspective have to say. Naturally, experts often don’t need anyone’s help to do what they know best. (A pilot isn’t arrogant just because she doesn’t take tips on how to land the airplane from someone who doesn’t know the first thing about flying.) But as we’ve seen, those who know the most often tend to recognize that they don’t know it all. And experts sometimes seek out additional training and coaching. They know you have to work hard to up your game—whether the game is sports or science.
A telling example of this point crops up in Thomas Ricks’s 2006 book, Fiasco, about the beginning of the Iraq War. Ricks details how many senior military officers were alarmed by the administration’s absurdly optimistic projections of how much the war would cost, how difficult it would be to maintain control over conquered Iraqi territory, and how many American troops it would take to do so effectively. Ricks reports one four-star general as telling him that these concerns were “blown off” and “discounted” by senior White House officials, even before they got to the president:
“The people around the president were so, frankly, intellectually arrogant,” this general continued. “They knew that postwar Iraq would be easy and would be a catalyst for change in the Middle East. They were making simplistic assumptions and refused to put them to the test . . . they did it because they already had the answers and they wouldn’t subject their hypothesis to examination. These were educated men, they are smart men. But they are not wise men.”21
This passage illustrates one of the defining features of intellectual arrogance: an unwillingness to regard your own worldview as capable of improvement from the evidence and the experience of others. But it also suggests a second important characteristic of the intellectually arrogant: they put ego before truth—but tell themselves they are doing the opposite. The intellectually arrogant are convinced their views are superior because of their better command of the facts. But in reality, their sense of superiority reflects their own hyperconcern for their self-esteem.22 Their posture is defensive; fear of error and desire for esteem push them to emphasize their authority, and thus to insist on their being right, whether they are or not. That defensive posture not only can keep them from seeing the evidence; it makes them believe their own hype.