Rhetoric, as a field, has long admitted that pathos persuades more people than reason and logic. The Greeks and Romans recognized the danger of this human flaw. Cognitive science has, since at least the 1970s, demonstrated that “logical arguments” don’t win political debates — or most other debates, either.
What rhetoricians should want to know is why humans embrace emotion over reason, and whether there are rhetorical techniques supported by science rather than by wishful thinking and faith in reason.
Why are we predisposed to embracing emotion over reason? It could be because we don’t want to disagree with the majority within our communities. The desire to “fit in” with the group leads us to trust the group — and distrust ourselves when we disagree with the majority.
Sometimes the majority is correct. It might even be correct more often than not, but I doubt that. I do not have faith in the wisdom of crowds; I don't like crowds. Still, that desire to cooperate and fit in with others has led to some curious evolutionary traits among humans. We evolved from impulsive animals into (somewhat) reasoning animals, yet the impulses, especially those associated with community and cooperation, remain stronger than reason.
Consider the following from The New Yorker magazine:
Why Facts Don’t Change Our Minds
Elizabeth Kolbert
February 27, 2017 Issue
The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.
Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?
In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.
Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.
“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.
It’s bad enough that we value and trust the opinions of majorities over our own reason. Research shows that the smartest among us are the least confident in their knowledge, while the least informed are the most (over)confident. Average people assume they know a lot about the world. Trusting that majority to shape our views should worry intellectual humans.
Thankfully, when confronted with our ignorance, most of us start to waver in our certainty and admit we don’t know much. If you ask people to explain, in detail, something they claim to know well, that overconfidence self-corrects. So if you want to change an opinion built on faulty facts, first ask the person to explain what they don’t really understand.
Virtually everyone in the United States, and indeed throughout the developed world, is familiar with toilets. A typical flush toilet has a ceramic bowl filled with water. When the handle is depressed, or the button pushed, the water—and everything that’s been deposited in it—gets sucked into a pipe and from there into the sewage system. But how does this actually happen?
In a study conducted at Yale, graduate students were asked to rate their understanding of everyday devices, including toilets, zippers, and cylinder locks. They were then asked to write detailed, step-by-step explanations of how the devices work, and to rate their understanding again. Apparently, the effort revealed to the students their own ignorance, because their self-assessments dropped. (Toilets, it turns out, are more complicated than they appear.)
Sloman and Fernbach see this effect, which they call the “illusion of explanatory depth,” just about everywhere. People believe that they know way more than they actually do. What allows us to persist in this belief is other people. In the case of my toilet, someone else designed it so that I can operate it easily. This is something humans are very good at. We’ve been relying on one another’s expertise ever since we figured out how to hunt together, which was probably a key development in our evolutionary history. So well do we collaborate, Sloman and Fernbach argue, that we can hardly tell where our own understanding ends and others’ begins.
“One implication of the naturalness with which we divide cognitive labor,” they write, is that there’s “no sharp boundary between one person’s ideas and knowledge” and “those of other members” of the group.
Few people can describe how a computer chip works, but we rely on those chips. Few people can even explain how a car works in detail, yet we would generally say we understand the basics of cars. We don’t know a lot, and there’s way too much to know in our world. We overstate our expertise, because to admit we know so little would be paralyzing.
Those of us who admit ignorance are the exception to the norm. We are the freaks.
I am painfully aware of what I don’t know, which is why I read and study so much. I always feel ignorant, and that leads me to feel insecure. Oddly enough, I also don’t tend to trust experts unless I can understand what they are saying or what they have written. I’m a natural skeptic in a social structure that doesn’t work well if everyone is a skeptic.
We need most people to just accept that technology, complex systems, and the universe function as intended.
As people invented new tools for new ways of living, they simultaneously created new realms of ignorance; if everyone had insisted on, say, mastering the principles of metalworking before picking up a knife, the Bronze Age wouldn’t have amounted to much. When it comes to new technologies, incomplete understanding is empowering.
Where it gets us into trouble, according to Sloman and Fernbach, is in the political domain. It’s one thing for me to flush a toilet without knowing how it operates, and another for me to favor (or oppose) an immigration ban without knowing what I’m talking about. Sloman and Fernbach cite a survey conducted in 2014, not long after Russia annexed the Ukrainian territory of Crimea. Respondents were asked how they thought the U.S. should react, and also whether they could identify Ukraine on a map. The farther off base they were about the geography, the more likely they were to favor military intervention. (Respondents were so unsure of Ukraine’s location that the median guess was wrong by eighteen hundred miles, roughly the distance from Kiev to Madrid.)
Surveys on many other issues have yielded similarly dismaying results. “As a rule, strong feelings about issues do not emerge from deep understanding,” Sloman and Fernbach write. And here our dependence on other minds reinforces the problem. If your position on, say, the Affordable Care Act is baseless and I rely on it, then my opinion is also baseless. When I talk to Tom and he decides he agrees with me, his opinion is also baseless, but now that the three of us concur we feel that much more smug about our views. If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.
How does ignorance of complex systems shape political debates? We put our trust in the “community” to which we belong, assuming the experts in our community know how things work. We trust our political experts, and we distrust the opposition’s experts.
“This is how a community of knowledge can become dangerous,” Sloman and Fernbach observe. The two have performed their own version of the toilet experiment, substituting public policy for household gadgets. In a study conducted in 2012, they asked people for their stance on questions like: Should there be a single-payer health-care system? Or merit-based pay for teachers? Participants were asked to rate their positions depending on how strongly they agreed or disagreed with the proposals. Next, they were instructed to explain, in as much detail as they could, the impacts of implementing each one. Most people at this point ran into trouble. Asked once again to rate their views, they ratcheted down the intensity, so that they either agreed or disagreed less vehemently.
I’m not sure rhetoricians are equipped to help reveal to people their own ignorance, or the ignorance of groups. Too many rhetoricians rushed to embrace crowdsourcing and other “wisdom in numbers” modes of problem solving. We’re the same scholars who champion group work and collaboration simply because those must be better than individual thought and ideas.
If anything, scholars should be the most skeptical group when considering groups. Maybe small groups of experts collaborating are okay, but random groups are certainly not wiser than one expert in a field.
Our political systems are proof that pluralities and majorities are flawed.
“The Enigma of Reason,” “The Knowledge Illusion,” and “Denying to the Grave” were all written before the November election. And yet they anticipate Kellyanne Conway and the rise of “alternative facts.” These days, it can feel as if the entire country has been given over to a vast psychological experiment being run either by no one or by Steve Bannon. Rational agents would be able to think their way to a solution. But, on this matter, the literature is not reassuring.
I’m a pessimist. Thankfully, on that I am aligned with the community of experts.