Shoshin
Some of the most common cognitive biases are the consequence of, or related to, our tendency to lean on heuristics when we make judgements or decisions.
A heuristic is a mental shortcut that allows us to reach a conclusion quickly, bypassing slower and more cumbersome methods of deliberation.
“Don’t talk to strangers” is a heuristic taught to children in many cultures, despite being a flawed, exception-laden way of dealing with the world.
“Use smaller image files on the web than in print,” likewise, is a heuristic that many designers find useful, but which is not universally applicable or desirable.
Heuristics are meant to get us to the right place much of the time, while requiring as little cognitive overhead as possible: no formal logical processing, no worrying about edge cases, no thinking through potential downsides. Just an “if this, then that” mechanism for choosing a path.
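To make that “if this, then that” framing concrete, here is a minimal sketch in Python (my own illustration, not from the essay; the function name and quality values are invented for the example) of the designers’ image-size heuristic expressed as a single conditional: fast, usually right, and blind to edge cases.

    # A heuristic as a bare "if this, then that" rule: no edge-case analysis,
    # just a fast default that lands close to the right answer most of the time.
    def choose_image_quality(destination: str) -> int:
        """Pick a JPEG quality setting based on where the image will appear."""
        if destination == "web":
            return 70   # smaller files load faster online
        return 95       # print tolerates, and rewards, heavier files

    print(choose_image_quality("web"))    # 70
    print(choose_image_quality("print"))  # 95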
As a practical method for addressing common problems with uniform or uniform-enough solutions, heuristics are wonderful time- and energy-saving cognitive tools. They get us roughly where we want to be more often than not.
They’re also something we humans seem to be primed to use: maybe biologically, maybe culturally—for whatever reason, we make near-constant use of them, similar to how we might wear shoes without even thinking of them as ever-present, useful tools. Our mental shortcuts are so ubiquitous that we strap them on reflexively, benefitting from their presence and only infrequently questioning the wisdom of their application, whether that application takes the shape of reflex, intuition, or bias.
As I mentioned, though, these tools are also at the root of many of our more negative preconceived notions—in some cases because our heuristics about a given situation, person, or idea are flawed, and in some cases because we learn the wrong lessons from our own or someone else’s experiences. In either case, our future decisions are shaped by these unrepresentative experiences or miscalibrated understandings, and we are thus nudged along paths that make little sense for our desired outcomes and for the reality of the environments in which we exist.
The concept of shoshin comes from Japanese Zen Buddhism, and is generally translated as “beginner’s mind.”
This idea is often presented as a kind of paradox, as it can sometimes seem that the more we learn about something, the less capable we are of learning more about it.
Or as the Zen Buddhist monk Shunryu Suzuki put it in his 1970 book Zen Mind, Beginner’s Mind, “In the beginner’s mind there are many possibilities, but in the expert’s there are few.”
Paradoxical as it may sometimes seem, though, this concept makes perfect sense in the context of heuristics.
The more we learn about a particular topic, and the deeper we dive into the theoretical and practical realities of it, the more heuristics we consciously and unconsciously generate about it.
An electrician who’s just starting out will be able to imagine a huge number of solutions to many of the problems they encounter—many that wouldn’t work, a few that would—while an electrician who’s been working at their craft for 40 years will be more likely to have tried-and-true approaches to many of the issues they might face; fewer ideas, but more of them tending to work.
Such experience will serve a veteran electrician well in most cases, according to most metrics. But in the face of the unfamiliar, or a paradigm shift in their technological, economic, or social environment, the more novice electrician may have some advantages: fewer expectations about what will be true and, as a result, fewer biases about where a solution might be found.
History is filled with examples of incredibly intelligent, capable, seasoned experts failing to perceive or believe in vital discoveries or ideas: germ theory, plate tectonics, and democratic governmental structures were all dismissed as inane hogwash by some of the most renowned minds of their eras, before eventually being proved true or possible by newcomers and outsider thinkers.
Interestingly, it’s been shown that even believing we know more about something than most people can push us out of a beginner’s mindset and into a more rigid line of thinking, blinding us to the full range of possibilities.
This may be the result of wanting to defend concepts we’ve come to perceive as our territory, or perhaps just wanting to avoid the potential embarrassment of having been seen to support an idea that is eventually replaced by something else.
It may also be the consequence of what’s sometimes called the “earned dogmatism effect,” which posits that an internalization of perceived social authority makes us feel as if we have to behave differently around people who seem to know less about something than we do.
The external application of the term “expert,” then, can become internal, and that internalization may make us feel as if we have a responsibility to share and teach the knowledge we have, which shifts our cognitive processes away from absorption and learning mode, into something more like formalization and heuristic-making mode.
A potential solution to this well-meaning cognitive rigidity, according to Zen Buddhist thinking, is to attempt to maintain the mindset of the beginner: to view ourselves as someone who knows little or nothing, and who is therefore capable of learning anything and everything, without preconceived notions about what we’ll learn.
There’s some research backing this assertion, along with some ideas as to what we might do to maintain this mindset in the face of social mores that encourage us to learn, crystallize, and defend our conceptions of how things work.
One method that seems to have potential is attempting to explain a concept or field to someone else, thoroughly, and then consciously reflecting on the gaps in our knowledge: the things we cannot explain in detail, and the questions from our “student” we cannot answer.
It may even be valuable to imagine explaining something we think we understand to a professional or other expert: a neurosurgeon if we think we’re knowledgeable about the structure of the brain, or a political analyst if we think we know more about politics than most people.
Such imaginings can have a humbling effect on our self-perception, which can shift us back into learning mode from a place of potentially stagnant defense and explanation.
Again, there are benefits to generating a collection of heuristic tools that allow us to more quickly and capably solve common problems within a given field. But the construction and overuse of such tools can have rigidity-related consequences that many of us would prefer to avoid.