In his autobiography, Mark Twain wrote, "The glory which is built upon a lie soon becomes a most unpleasant incumbrance." And then a bit later, "How easy it is to make people believe a lie, and how hard it is to undo that work again!"
This quote is often, somewhat ironically, misstated as "It's easier to fool people than to convince them that they have been fooled."
A similar sentiment was expressed by a programmer named Alberto Brandolini, who in 2013 observed that the amount of energy needed to refute bullshit is an order of magnitude bigger than the amount needed to produce it. This observation has since been termed Brandolini's Law, or sometimes the Bullshit Asymmetry Principle.
Said another way: it's much easier to go on Twitter and claim you've discovered a cure for malaria made from a plant that only grows in Madagascar than it is to prove that claim false; the effort required to debunk such a tweet is substantially higher than the effort required to post it.
This is thought to be part of why the internet, and social media in particular, is so awash with misunderstandings, misinformation, and outright lies: the cost of posting false information is low (in terms of time, and in many cases reputation as well), while the cost of disproving it is high.
Thus, even the most ardent truth-seekers tend to drown in the deluge of new falsehoods published while they invest their time and energy debunking just one. On such platforms, the advantage is on the side of mistruth.
Over the years, it's been contended that this internet adage—a bit of digital folk-wisdom, not a true law in any sense of the word—is a lot less applicable in some realms of inquiry than others.
In the world of mathematics, for instance, it's unlikely that someone will be able to make a false claim that cannot be more or less immediately shown to be untrue (to anyone who understands the language of mathematics, at least).
This is also the case in other fields with mechanisms through which you can quickly and inexpensively test claims and make the results of those tests visible to all relevant parties.
This is less the case when it comes to some aspects of scientific inquiry, however, in part because it can be time-consuming and expensive to test claims.
You might be able to show with a high degree of certainty that the plant from Madagascar doesn't cure malaria, but it might cost a small fortune to do so, it might take months or years to conduct the requisite tests, and it may be that everyone who was paying attention at the beginning has either lost interest or decided what they believe based on gut-feeling (or the claims of self-declared experts on YouTube) in the meantime.
The reality is grimmer still for fields in which truth, its arbiters, and the mechanisms for establishing it are even less concrete, less trusted, or less understood, such as journalism, the social sciences, or even just interpersonal relationships.
It's possible to falsely claim a person you don't like did something horrible, and for that mistruth to spread far and wide before any counter-statement can be made; even then, there's a decent chance the initial smear will be shared more broadly and live longer than the correction or the evidence against it, if such things are ever published or disseminated at all.
We humans are also born with cognitive defaults that make us more likely to believe—and then cling to—seeming facts that appear to reinforce our existing biases, even if those "facts" are clearly nonsense, or are at some point widely shown to be nonsense; we’ll figure out ways to keep believing them, regardless.
All of which makes establishing truth in the first place, and stomping out mistruths—or outdated previous, presumed truths—the work of lifetimes, not hours or days.
Brain Lenses is part of the Understandary portfolio of projects.
You can find the Brain Lenses podcast at brainlenses.com or wherever you get your podcasts.