Ignorance
We are all ignorant about pretty much everything.
This isn’t a hyperbolic statement: no single person has the time or cognitive bandwidth to absorb even half the things it’s possible for them to know over the course of their lives.
Consequently, while many of us know a decent amount about a few things, and a little bit about a great many things, none of us knows everything, or even most things, about the majority of topics it’s possible to learn about.
Thus, there’s nothing wrong with being ignorant; it’s our natural state. And our ignorance about essentially everything shapes a great deal of who we are, from our sense of ethics and ideology to our behaviors, habits, and opinions.
What we’re ignorant about can also inform how we fill in our knowledge gaps and which gaps we choose to fill.
The environment in which I grow up and the information I have as a result of that upbringing inform my next-step learning experiences; this is true both in terms of which opportunities are available or unavailable to me, and in terms of what I decide is important to learn, based on what I already know.
There’s always a chance that we’ll randomly encounter some new bit of knowledge that leads us down an unanticipated intellectual path, but in most cases what we know now largely determines what we’ll learn next. Our ignorance can limit our perception of next-step educational paths, just as learning some random bit of information can expand it.
Sometimes we learn something new, or learn about the existence of some untraversed knowledge-path, but then preemptively shut down that line of inquiry before we have the chance to fully understand it.
This is often called willful ignorance, and it emerges for a variety of reasons, but most of them tie back to cognitive dissonance, confirmation bias, or tribalism.
Cognitive dissonance refers to the psychological stress or strain we sometimes feel when we encounter something—often an experience or fact—that seems to contradict something we believe.
Someone who believes the world is flat, but who then sees images of the Earth taken from orbit, might experience cognitive dissonance because their worldview—literal and figurative—is partially predicated on this idea that the world is flat.
Exposure to information that seems to contradict this belief, then, implies, first, that they were wrong; second, that other people might judge them for being wrong; and third, that they likely hold other beliefs that are also wrong.
It’s common, when we experience this kind of mental discordance, to push back against the offending information, and perhaps to lash out at the person or other source of said fact; to discredit or destroy these intellectual aggressors, if possible.
It’s also common, with time, to come to terms with such information, but it can be years before enough information is imbibed, enough humility is mustered, and enough of that initial psychological shock has worn off so that we’re able to approach this new information intellectually, rather than emotionally.
Confirmation bias is related to cognitive dissonance in that it refers to our tendency to seek out information that confirms our existing beliefs and to discard evidence that seems to contradict what we currently believe.
It’s thought that we probably do this reflexively as a defense mechanism against the stress we experience when we face psychologically difficult information head-on; though there’s also a chance that it simply feels good to expose oneself to confirmatory evidence, which is part of why we tend to stick to informational filter bubbles and spend more time with people who believe what we believe.
Finally, tribalism refers to our tendency to team up with others whom we perceive to be our literal or ideological kin.
Tribalism can incentivize us to opt for willful ignorance in the face of new information because some information might cause us to disagree with the beliefs of the group, or perhaps even doubt the group’s intentions or priorities.
It can sometimes make more sense, then, psychologically, to deny that facts are facts, or that certain types of knowledge or knowledge-seeking have any validity, because acknowledging those potential or proven realities might lead to uncomfortable questions about your family and friends, what they believe, and perhaps what you believe.
If your own beliefs change too much and fall out of alignment with those of the group, it’s possible that you could be formally or informally outcast, which doesn’t feel good socially and is also stressful biologically: fitting in with a tribe is a human survival reflex.
Ignorance is a normal aspect of human life, but it’s also something we tend to lean into for a variety of understandable reasons.
External structures, like the scientific method, can help us avoid this collection of biases, but all such structures are imperfect and prone to their own flavors of selective blindness and denial.
It’s generally prudent, then, to utilize such structures where appropriate, but to also be aware of our own, personal responses to uncomfortable information.