Ostrich Effect
Free essays and podcast episodes are published every Tuesday, and paid subscribers receive an additional essay and episode on Thursdays.
The term “uncertainty,” when used casually, generally implies that a decision has not yet been made, but will be, in time.
If I’m uncertain about which school to attend or which city to visit on my upcoming vacation, it probably means I’m still sorting through pros and cons, but will eventually settle on one university over another, and one destination out of all the places I might go.
When used in more formal contexts, however, especially in behavioral science and economics, uncertainty describes a state of imperfect information: more precisely, the degree to which we lack the information we need to make accurate predictions or subjectively correct decisions in a given circumstance.
To measure uncertainty, we typically use probabilities, assigning numerical values to the various paths we might take, with those probabilities based on the data we do have, weighed against the data we don’t.
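To make that a little more concrete, here’s a minimal sketch in Python of what assigning probabilities to possible paths might look like. Everything in it—the outcomes, the probabilities, the payoffs—is invented for illustration, and the entropy calculation is just one common way to put a number on “how uncertain,” not anything specific to the research discussed here:

```python
import math

# Invented, subjective probabilities for three possible outcomes of a
# hypothetical investment, based on whatever data we have available.
outcomes = {
    "stock_rises": 0.50,
    "stock_flat":  0.30,
    "stock_falls": 0.20,
}

# Invented payoffs (in dollars) associated with each outcome.
payoffs = {
    "stock_rises": 1000,
    "stock_flat":  0,
    "stock_falls": -800,
}

# Expected value: each payoff weighted by how likely we judge it to be.
expected_value = sum(p * payoffs[name] for name, p in outcomes.items())

# Shannon entropy, one common measure of uncertainty: it is highest when
# every outcome seems equally likely, and zero when one outcome is certain.
entropy = -sum(p * math.log2(p) for p in outcomes.values() if p > 0)

print(f"Expected value: ${expected_value:,.2f}")  # $340.00
print(f"Uncertainty: {entropy:.2f} bits (max here: {math.log2(3):.2f})")
```

The point isn’t the arithmetic so much as the shape of it: the numbers we plug in are only as good as the data we have, which is exactly where the discomfort comes in.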
Probability can be complicated, and in situations where bad decisions carry significant consequences, any sense of unknowability can be terrifying to the people who have to make those decisions.
In some cases, then, when the unknowns are many and the knowns not ideal, it makes psychological sense to simply ignore information that causes us stress, or that might tell us something we don’t want to hear.
The term “Ostrich Effect” was coined to describe, specifically, investors who do their best to ignore inconvenient information about investments they’ve made or want to justify making.
One facet of this concept is what we might call denial: the tendency to ignore or discount provable reality in order to avoid feeling psychological discomfort.
Many of us experience denial of some kind at some point in our lives: it’s not an unusual disposition, and would in fact seem to be a common reflex that helps our brains protect us from emotional shocks and other discomforts.
There’s also an element of what’s called Selective Exposure Theory, which is a fancy way of pointing at our tendency to protect and maintain the informational filter bubbles in which we exist. It’s more neurologically rewarding to consume information that reinforces our existing biases and presuppositions, and comparably distressing to be exposed to information that challenges them.
Applied to investments and decisions, then, Selective Exposure Theory would mean avoiding information that conflicts with our assumption that a particular industry will flourish in the coming year, or that a specific company will outperform its competitor. We’ve decided ahead of time, based on other sources of information, what we think about that industry or company, and it’s not fun to be faced with data points or opinions that tell us the opposite: that we might be wrong, or acting on false or incomplete information.
Interestingly, the term “Ostrich Effect” was originally coined in the early 2000s, and the concept has since been challenged by a 2014 study, which found that people working in the financial world, in particular, seem more prone to what’s come to be known as the Meerkat Effect than to the Ostrich Effect.
The Meerkat Effect manifests as a kind of hyper-vigilance, rather than an enforced ignorance.
Instead of ignoring information that conflicts with their preexisting biases, this study found, investors will often flood their informational inputs with data, including data points that seemingly contradict those biases. It’s a disposition that comes with its own, very different downsides.
The Ostrich Effect terminology has found a second life outside the world of finance, however, and is now more commonly applied to politics, especially to politicians who ignore polls, research, and experts because they’re trying to promote a particular message.
Ideological tribes—political or otherwise—often do the same: members filter their inputs so as not to be bothered by potentially informative, but psychologically distressing, information from sources outside of their ideological bubble.
This tendency is reinforced by external factors as well, including media entities, which can benefit from being seen as the sole reliable source of legitimate information, and which can in turn provide their audiences with psychologically comforting permission to dismiss anything they hear elsewhere as fake news, propaganda, or misinformation.
Although such positions are often antithetical to making informed decisions, they’re arguably understandable, depending on context, and in some ways less irrational than they might at first seem.
Enjoying Brain Lenses? You might also enjoy my news analysis podcast, Let’s Know Things.
There’s also a podcast version of Brain Lenses, available at brainlenses.com or wherever you get your podcasts.