Belief Perseverance
In his autobiography, the physicist Max Planck wrote, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."
This somewhat brutal bit of wisdom has become known as Planck's Principle, because people so often seem to cling to beliefs even after those beliefs have been disproven, an observation sometimes shorthanded as "Science progresses one funeral at a time."
"Belief Perseverance," also sometimes called "Conceptual Conservatism," refers to this tendency to maintain our existing beliefs even when exposed to convincing evidence that said belief is incorrect, incomplete, or flawed.
Back in the mid-19th century, a Hungarian doctor named Ignaz Semmelweis discovered that the mortality rate from childbed fever among new mothers in his maternity clinic fell roughly ten-fold when the doctors treating those patients disinfected their hands with a chlorinated lime solution between patients.
Germ theory, the understanding that disease is caused by tiny organisms, was still a few decades in the future, so many doctors considered Semmelweis a bit nutty. Many physicians at the time were socially and economically high-class, and they didn't believe that high-bred gentlemen's hands could transmit disease (though they were also operating within a scientific paradigm in which disease was predominantly thought to be caused by pockets of “bad air” called miasmas).
Despite ample evidence that this hand-disinfecting routine saved a great many patients' lives, then, these other doctors refused to accept that they might need to change how they behaved, or to revise the beliefs associated with those behaviors.
This rejection of evidence that seems to contradict what we currently believe is sometimes called the “Semmelweis Reflex.”
It's not just scientists who are prone to digging in when presented with information that would seemingly devastate the logical underpinnings of an existing set of beliefs. Most of us reflexively defend and maintain our perceptual status quo whenever feasible.
It makes sense that we do this: developing mental frameworks that help us navigate the world ("disease is caused by bad air") lets us spend less energy on heavy cognitive effort, which in turn allows us to build other assumptions upon that heuristic foundation ("disease is caused by bad air, therefore this child is getting sicker because of miasmas leaking into the hospital").
Upsetting even one block in a grand structure, if that block is near the foundation, can require the reassessment not just of a single element of our belief system, but of the whole system. That reassessment can be cognitively burdensome, but also embarrassing: it can dent our egos and cause us to question not just factual beliefs, but the ethical beliefs predicated on that knowledge, and the heuristics we constructed using that knowledge.
In some cases, at least, this belief perseverance reflex is wound so tight that challenging someone's belief, even with demonstrable fact and convincing evidence, can cause them to dig in deeper, retreat from external sources of information that seem to challenge their existing worldview, and seek out new sources that agree with their now more-entrenched belief.
This "backfire effect" has been shown to be especially potent when tribal affiliations become entangled with certain beliefs. "This political party doesn't believe in vaccines," or "This class of person doesn't believe that washing our hands will help save our young patients' lives."
This isn't a tendency that's limited to "other people"; it's seemingly baked into the human condition. Belief Perseverance has been called a fundamental law of nature because of how pervasive it seems to be across cultures, trades, and fields of inquiry.
That said, there is some evidence that elements of Belief Perseverance might be less impactful than older studies suggest.
Planck's Principle, for instance, is a truism that seems to have been defied by Semmelweis's contemporaries, many of whom eventually embraced germ theory a few decades later, once the germs in question could be seen through new, more powerful microscopes. It was also defied by Planck himself, who, after years of opposing it, accepted physicist Ludwig Boltzmann's statistical approach to thermodynamics, which in turn allowed Planck to develop quantum theory (arguably his most important contribution to science, among many important contributions).
Likewise, there's evidence that some people who embrace disproven ideas understand that what they're backing doesn't hold up to demonstrable fact; voicing support for that less-proven or disproven thing serves as a sort of virtue signaling, a way of showing affiliation with a group rather than a firm statement of what they actually believe to be true.
Whatever our beliefs, politics, or personal understandings of the world, then, we may say that we trust what can be proven, that we have open minds, and that we're capable of maintaining conceptual flexibility and epistemic humility. But our brains can convince us that new, proven things aren't so proven, and our social priorities might encourage us to celebrate frameworks and behaviors that oppose demonstrable fact because doing so aligns us with a group we want to be part of; our desire to be perceived as a “good person” by the standards of those we see as our social kin is a powerful force.
—
Enjoying Brain Lenses? You might also enjoy my news analysis podcast, Let’s Know Things, and my daily news summary, One Sentence News.
The Brain Lenses podcast is available at brainlenses.com or wherever you get your podcasts.