By the time the polls close on Tuesday, more than 150 million Americans will have voted. Billions of dollars have been spent trying to persuade voters that Donald Trump or Kamala Harris is the superior candidate, and despite all that frantic effort, the latest polls are not markedly different from those taken when Harris first entered the race. All of this is a testament, among other things, to how difficult it is to change people's minds.
In 2017, Elizabeth Kolbert investigated why we, as a species, have such a hard time changing our minds. Education often does not help; in some cases, presenting facts that contradict people's opinions only makes them cling more tightly to their views. Through a series of studies, scientists have linked this behavior to the peculiarities of our evolution, a process that unfolded long before a world of complex policy debates and sophisticated vaccines existed. As Americans near the end of another polarizing campaign, our plight can be traced in part to a paradox of our nature.
We are quite capable of spotting the weaknesses in other people's arguments, but we are blind to the flaws in our own.
In 1975, researchers at Stanford invited a group of students to participate in a study about suicide. The students were handed pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently killed himself. The students were then asked to distinguish the genuine notes from the fake ones.
Some students discovered that they had a genius for the task: out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others found they were hopeless, identifying the real note in only ten instances.
As is often the case with psychological studies, the whole setup was rigged. Though half of the notes were indeed real, obtained from the Los Angeles County coroner's office, the scores were fictitious. Students who had been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.
In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly and how many they thought the average student would get right.
Even after the evidence "for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs," the researchers noted.
Thousands of subsequent experiments have confirmed (and elaborated on) this finding.
The greatest advantage humans have over other species is our ability to cooperate, and cooperation is hard to establish and almost as hard to maintain. Cognitive scientists argue that reason evolved to manage this social life, to win arguments and weigh the claims of others, rather than to solve abstract problems, which helps explain why it so often fails us in other contexts.
Of the many forms of faulty thinking that have been identified, "confirmation bias" is among the most common, and it is the subject of entire textbooks' worth of experiments. One of the most famous of these also took place at Stanford. For this experiment, the researchers assembled a group of students who held opposing views on the death penalty. Half of the students were in favor of it and thought it deterred crime; the other half were against it and thought it had no effect on crime.
The students were asked to respond to two studies. One provided data supporting the deterrence argument, and the other provided data calling it into question. Students who had initially supported the death penalty rated the pro-deterrence data as highly credible and the anti-deterrence data as unconvincing; students who had initially opposed the death penalty did the reverse. At the end of the experiment, the students were asked once again about their views. Those who had started out in favor of capital punishment were now even more in favor of it; those who had opposed it were even more hostile to it.
The Gormans, a father-daughter team of a psychiatrist and a public-health specialist, likewise argue that ways of thinking that now seem self-defeating must at some point have been adaptive. They, too, devote many pages to confirmation bias, which, they claim, has a physiological component. They cite research suggesting that people experience genuine pleasure, a rush of dopamine, when processing information that supports their beliefs.