Gregory Benford, an astrophysicist and science fiction author, is perhaps best known for a quote pulled from his 1980 novel Timescape, which has since been dubbed “Benford’s Law of Controversy.”
The adage goes like this:
“Passion is inversely proportional to the amount of real information available.”
This is more of a truism than a law; it’s perhaps better described as a heuristic to use when analyzing a discussion or topic of discussion. The “law” part is generally considered a humorous exaggeration, rather than an actual attempt at a formalized statement of fact.
Hyperbole or not, this concept is potentially useful in that it memorably gestures at a common human tendency, which can help us notice some of our own counterproductive behaviors.
Often, the topics about which we know the least are the topics about which we’re most emotionally engaged.
This is not universally the case, and there is almost certainly variance between people and cultures. But a quick assessment of what’s being discussed and debated in the public square at any given moment makes clear why this is a popular understanding of discourse in the macro, even if it’s not necessarily true in any individual circumstance.
Politics, for instance, is a domain in which many people hold many firm opinions. Research has shown, though, that when most of us are asked specific questions about the details of the systems and policies involved, we’re able to do little more than shout slogans we’ve been taught by our preexisting political platform of choice.
There’s actually a whole field of research, tucked into the larger field of public choice theory, called “rational ignorance,” which posits that it’s rational for most voters in most modern, democratic nations not to put in the effort required to become educated about politics beyond a superficial level, because the reward gained compared to the effort expended results in an unfavorable ratio.
In other words, it costs too much time and energy to become fully educated about politics, relative to what a normal person is able to do with that knowledge. It requires too much time and effort to figure out how best to cast a single vote within a larger system, and thus, according to this thinking, it makes sense to invest a little time in learning about the available political parties, and then allow them to tell you how to vote in each election.
This theory has been contested, and it’s anything but a concrete law of nature that people behave in this way. It’s possible that political systems could be recalibrated to make more detailed information available via more easily accessible, comprehensible, and unbiased means, for instance, and it’s possible that the average person could be imbued with more consequential power within existing systems.
At the moment, though, there’s reason to believe that people are perhaps acting in their own best self-interest, according to some metrics, at least, when they remain relatively ignorant about political matters, which could partially explain why so many of us are.
There’s a Dunning-Kruger effect at play here, too, which means, in essence, that we tend to think we know more about the things we know the least about, because we don’t know enough to understand how little we know.
The title of the paper in which this effect was first posited does a capable job of expressing what the effect actually is, I think. That title being: “Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.”
While at first this concept might seem like a sweet burn, it’s actually a recognition of something quite common in most of us: our inability to see past knowledge horizons that we’ve had no reason to ever crest.
Before I started to learn how to cook, I had no reason to know about the Maillard Reaction—which is a part of what gives cooked food its flavor—or how to make a roux.
There’s a chance I might have come across either of these concepts incidentally while learning about something else, but I had little reason to even know these concepts existed—and I didn’t. They were completely new to me when I discovered them along the path of my learning-to-cook journey.
The same is true of concepts within every possible field of inquiry: our lack of awareness about these concepts, much less any comprehension of them, is completely understandable.
This ignorance about our own ignorance, then, is not an insult, but a recognition of reality. It’s unlikely that we will be cognizant of the specifics of fields about which we know little or nothing.
The consequence of this lack of awareness, though, is that we sometimes look at another field or practice or body of knowledge through the lens of ignorance and assume that what we see from the outside is all there is: a concept referred to as the “What You See Is All There Is” bias by psychologist and economist Daniel Kahneman.
It’s possible, then, to look at something about which we know very little, to assume we actually know a lot—because we assume that what we see is all there is, and thus, we know all we need to know after a modicum of “research”—and to be lured by that assumption of self-expertise into making judgments and forming opinions that are not backed by anything except our own ignorance.
This is where Benford’s Law of Controversy comes into play, saying, basically, that when we find ourselves with little actual data and understanding, we fill in the gaps with passion and emotion.
We may even feel quite strongly about these sorts of things, even if we can’t point to any rationales for those feelings, beyond our instincts—which are often informed by past experiences, biases, and prejudices.
What we have, then, is a jumble of people—the majority of whom know very little about the topic at hand—all believing that they are experts, when in reality they’re arguing about gut responses to concepts they barely understand, beyond bumper-sticker slogans and editorialized framings.
Folks who know a lot about any subject tend to be humbled by the weight of what they come to realize they don’t know, and consequently, are less likely—in general—to speak in absolutes about those topics.
People who are fueled by passion, on the other hand, are understandably more likely to see those who come to different conclusions as being stupid or evil, not just ignorant, which can lead to a type of Othering that disincentivizes productive discussion.
Which is also somewhat understandable: why would you try to have a serious debate with someone who is too dumb to think at your level, or who is actually evil in some fundamental way?
Benford’s Law of Controversy, then—despite not being a true law in any sense of the word—can be useful as a reminder that, often, if not always, the most passionate, inflammatory, emotion-driven topics are the least-productive, and fueled by the least-informed. And we, ourselves, almost certainly contribute to these same sorts of conflicts, even when we deeply feel that we are both well-informed and correct in our assertions.
Enjoying Brain Lenses? You might also enjoy my news analysis podcast, Let’s Know Things.
There’s also a podcast version of Brain Lenses, available at brainlenses.com or wherever you get your podcasts.
This is a very well timed read. Within the last couple of days I had a conversation with someone that aligns very well with this “law,” and I felt very uncomfortable with the conclusion of the conversation. The topic wasn’t one of high importance, yet I still felt attacked. Now, after spending time thinking further about the words that were said, I realize the conversation was driven by a passionate fuel that blinded me and made me arrogant to what the person on the other side was trying to get across.
Loved the read, thank you Colin,
Turner