We live in a world of historically unrivaled quantitative potential.
We can track our sleep, our daily step count, our heart rate, our blood-oxygen levels, and our blood pressure using casually available sensors, many of them built into devices we wear all day without even having to think about it.
If we make use of more pro-grade technologies, we can track complex abstractions of our brain waves and consciousness, and fluctuations in the chemicals that moderate our moods and shape the outward projection of our personalities.
We've got satellites looking down at Earth tracking weather and traffic patterns, and we've got others looking outward, identifying black holes and Trojan asteroids and exoplanets around distant stars.
We build ever-more-sophisticated tracking mechanisms, we learn to crunch data in novel ways, and we improve our models, which we hope will allow us to wield these numbers and figures more skillfully and purposefully in the future.
The McNamara Fallacy is named after Robert McNamara, US Secretary of Defense during part of the Vietnam War. In his earlier life as a business executive, McNamara had been very keen on the idea that if you measure enough, and measure properly, you can optimize any situation and produce whatever outcome you prefer.
This theory was popular in the business world following the success of methodologies like Taylorism and Fordism, both of which revolved around breaking a task into component pieces and little by little optimizing each of those pieces.
The most well-known manifestation of this approach was Henry Ford's car factories, where tiny details were obsessed over, like which screws to use and how far workers would turn each one. Efficiency of expense and labor was key, and this obsession with standardization, economies of scale, and iterative improvement made Ford a massive success; other companies that utilized similar models likewise came to dominate their industries.
The McNamara Fallacy names the temptation to assume that we can measure everything we need to measure, that those measurements will always be the most accurate models for understanding what's happening, and that anything we cannot measure (either because it defies measurement or because we don't yet know how to quantify it) isn't worth understanding or working into our math. None of these assumptions holds.
There are times, within controlled environments and finite slices of life, when we can quantify a great deal, and that allows us to (for instance) make more and better cars, faster and cheaper.
But when this approach is applied beyond the confines of such spaces (and arguably within such spaces as well, even if it's not always obvious to management), other variables come into play. It may someday be possible to quantify these too, but the idea that our numerical models are always the most accurate and useful models available, that we'll always measure things accurately and correctly, and that anything not measured in this way is worthless for making plans, is fallacious.
Just because we can measure something doesn't mean we understand it, and just because we have measurements doesn't mean we've measured every influential variable.
In the mid-1980s, the US Navy adopted a slew of recommendations made by a statistician named W. Edwards Deming. These recommendations were meant to help the Navy improve its overall effectiveness by implementing models of standardization and iterative improvement.
Deming was a big fan of quantification as well, but he benefitted in some ways from the lessons learned in the Vietnam War. He often said, in essence, that although it would be nice to be all-seeing and to have perfect information, we never do. He would thus often tell his followers, the folks who implemented his standards-related tenets: "Nothing becomes more important just because you can measure it. It becomes more measurable, that's all."
There's value in quantification and in leveraging our numerical understandings of things, then, but numbers aren't the only variables that matter.
In some cases, as in Vietnam, the feelings and understandings and morale of local, rural people will matter, and will matter in ways that are difficult to accurately quantify. This general sense, this vibe (qualitative data), can influence the outcome of a war that otherwise seems perfectly measured and balanced, and alongside countless other such variables it can throw everything off if you fail to take it into account.
The McNamara Fallacy is considered to be widespread today as well, especially in fields in which raw-number measurements are revered to the exclusion (or near-exclusion) of all others, and in a macro-context in which numerical understandings of ourselves and the world have become selling points of countless products, organizational systems, and perceptual approaches.