Binary Bias
Splitting, sometimes referred to as “dichotomous” or “black-and-white” thinking, refers to a perceptual stance in which we reduce a complex collection of qualities into two competing camps, then make judgements based on that simplification rather than the more complicated reality.
Often, one of the poles of this bipolar arrangement is “ours” and the other is “theirs.”
When thinking about political policies, for instance, instead of considering the policy on its own merits, we’ll often rely on mental shorthand by asking which party supports it and which is against it, which then tells us, roughly, in which camp this new proposal belongs: ours or theirs.
We also often do this to other people: not necessarily rendering outright value-judgements about a stranger or new acquaintance, but making decisions about who they are, what they care about, and what sorts of ideologies they might adhere to, based on a knee-jerk response to the sensory stimuli (mostly visual, but also auditory and otherwise) we receive about them in the first few seconds of exposure.
When meeting a stranger, then, we might decide a great deal about their economic circumstances, their religious beliefs, their political allegiances, and our likelihood of getting along with them within mere moments of seeing or meeting them—and this collection of snap-judgements can, in turn, lead to a bias about which bipolar category they fit into across a vast range of attributes.
The “binary bias” is a close relative of psychological splitting, in that it’s a mental process that helps us quickly categorize things in such a way that the world seems to make more sense, more of the time, and it often achieves this by comparing new things to existing mental models: gauging them in relation to a known quantity rather than trying to build a new model from scratch, in the moment.
This is the bias that causes us, for instance, to learn about a new scientific study that generated mixed, inconclusive findings, but then to behave as if the study definitively proved something true or false.
Our brains don’t like middle ground and uncertainty, and shades of gray are far less convenient for processing purposes than easy-to-parse black-and-white polarities—so when certainty isn’t present, our subconscious mind will sometimes make things seem more certain than they are as part of a psychological defense mechanism.
The automatic reduction of spectrums down to a pair of opposing points can be a useful heuristic because it allows us to approximate the shape of various circumstances incredibly quickly—which at times can be a life-saving superpower, like when it helps us identify a dangerous situation before we even have a chance to consciously register what’s going on.
It’s a lot less useful in other situations, though, because it causes us to flatten our internal and external experiences.
Lacking an awareness of this flattening, we may find ourselves trapped in a world that seems very concrete and comprehensible, yet behaves in what we perceive to be irrational and confusing ways, because other people are discerning gradients and spectrums that we technically also observe but, because of this bias, fail to acknowledge or act upon.
Enjoying Brain Lenses? You might also enjoy my news analysis podcast, Let’s Know Things, and my daily news summary, One Sentence News.
The Brain Lenses podcast is available at brainlenses.com or wherever you get your podcasts.