The Johari Window
In 1955, psychologists Joseph Luft and Harrington Ingham developed a technique for self-help groups which they called the Johari Window, the moniker derived from a combination of their first names.
This technique aimed to help those who utilized it develop better heuristics related to their own sense of self, the world, and their understanding of both.
The most typical illustration of this concept is a two-by-two grid, with “known to others” and “not known to others” on the vertical axis, and “known to self” and “not known to self” on the horizontal axis.
Thus, the first cell in the grid contains things that are known to others and to self, and is labeled “Arena” or “Public.” The second contains things that are known to others but not to self, labeled “Blind Spot.” The third contains things that are known to self but not to others, labeled “Private” or “Façade.” And the fourth contains things that are known to neither others nor self, labeled “Unknown.”
The initial application of this technique asked participants to apply these different labels to a list of adjectives that they believed applied to themselves and their personality.
So they might choose “modest” from the list, and then decide whether that was a trait both they and others perceived in them, whether it was something others saw but they didn’t, and so on. This would typically be done with a collection of such adjectives, ranging from “powerful” to “spontaneous” to “clever” and “able.”
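For those who find it clarifying, the logic of the exercise can be sketched in a few lines of code: two yes/no questions (known to self? known to others?) determine which of the four cells a given trait lands in. This is only an illustrative sketch in Python; the example traits and their assignments are hypothetical, not part of Luft and Ingham’s instrument.

```python
# Illustrative sketch of the Johari Window's two-question logic.
# The labels come from the grid described above; the example traits
# and their assignments are hypothetical.

LABELS = {
    (True, True): "Arena (Public)",
    (False, True): "Blind Spot",
    (True, False): "Private (Facade)",
    (False, False): "Unknown",
}

def johari_cell(known_to_self: bool, known_to_others: bool) -> str:
    """Map the two yes/no questions to one of the four cells."""
    return LABELS[(known_to_self, known_to_others)]

# Hypothetical self-assessment for a handful of adjectives.
traits = {
    "modest": (True, True),        # I see it, others see it
    "spontaneous": (False, True),  # others see it, I don't
    "clever": (True, False),       # I see it, others don't
}

for trait, (to_self, to_others) in traits.items():
    print(f"{trait}: {johari_cell(to_self, to_others)}")
```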
This grid, though seemingly effective in its self-help application, was also later used in other settings, including by intelligence services and some scientific and engineering organizations like NASA.
The use of this model by the former (that is, by military-linked spy services) led to one of its more public utilizations by the then-US Secretary of Defense under the George W. Bush administration, Donald Rumsfeld, who answered a question at a 2002 briefing (related to the lack of evidence that Iraq was supplying terrorist organizations with weapons of mass destruction) by famously or infamously saying, “Reports that say that something hasn’t happened are always interesting to me, because as we know, there are known knowns; there are things we know we know. We also know there are known unknowns; that is to say we know there are some things we do not know. But there are also unknown unknowns—the ones we don’t know we don’t know. And if one looks throughout the history of our country and other free countries, it is the latter category that tends to be the difficult ones.”
At the time, Rumsfeld was criticized for this statement because, within the existing geopolitical context, it seemed to be a doublespeak-laden attempt to confuse the issue and conceal the fact that the administration was scrambling to come up with a justification for going to war in Iraq following the terrorist attacks in the US on September 11, 2001.
In more recent years, however, he has at times been commended for fairly clearly (if imperfectly) elucidating a complex topic to an audience that had no reason to be familiar with it.
When intelligence professionals are making decisions and recommendations to governments and military officials, they have to consider that there are things that are public knowledge, things that some people know but which they do not (and which they know they don’t know), things that they know but which others do not, and things they do not know (and don't realize they don't know): unknown unknowns.
Understanding which of these categories applies in any given situation can be tricky, and requires a fair bit of epistemic humility: a mindset that allows one to consider that one might be wrong, or might not have all the information one needs to make an informed decision, along with a sense of which cell of that grid one might occupy at any given moment.
Unknown unknowns are especially tricky because by definition we cannot know ahead of time all the things we don’t know yet; and there’s actually a decent chance that with every decision we make about anything, there are more things we don’t know than things we do know.
Confounding the issue further, unknown unknowns are difficult to prepare for because of their very nature: you can keep your plans relatively flexible to give yourself room to adjust later, if more information emerges. But beyond aiming for that kind of late-stage malleability, how do you plan for variables you can’t possibly predict? You can’t. Which is why such things can be so problematic for our ability to imagine and prepare.
Other cells on that grid can also confound our ability to accurately perceive ourselves and the world, though, if we’re not careful.
A clear sense of who knows what informs our capacity to understand other people’s behaviors, and even a basic sense of our own limitations (including when we might lack the requisite knowledge to clearly perceive something) can help clarify a lot of otherwise quite confounding, seemingly inexplicable things.