Citogenesis
There's an xkcd cartoon that shows a stick figure making up a fact and adding it to Wikipedia. Another stick figure, a rushed writer, uses that “fact” in a piece they're publishing. Someone then flags the made-up claim on Wikipedia, and a third stick figure, this one a Wikipedia editor, finds the published piece that repeats the false fact and cites it on Wikipedia to validate said false fact.
As a consequence of this cycle, which the webcomic's author calls "citogenesis," a made-up fact becomes its own reference, which in turn reinforces its seeming legitimacy.
This cycle, then, is catalyzed by misinformed or malicious people on the internet who share bad or incomplete information; it's reinforced by writers who are meant to check their sources but who, because of the realities of today's publishing world, don't always do so to a high standard; and it's then locked in by the very people who usually help weed out bad information by culling weak claims from knowledge repositories like Wikipedia.
Longer term, this cycle can create a super-cycle: the original not-fact, now supported by a seemingly legitimate bundle of Wikipedia entry and citation, is repeated by many other publications, which makes it even harder to extract from the information canon if someone ever realizes what has happened and makes the effort to remove it.
This is similar in some ways to the ongoing replication crisis that has enveloped the social science world for the past decade or so.
In the early 2010s it became apparent that a great many seemingly legitimate social science studies were failing to replicate. Replication is an important aspect of scientific inquiry in which other groups of researchers repeat earlier studies to see whether they get the same results as the original group.
Many of these replication attempts produced different results, but by the time the repeat studies were completed, often years after the originals, many new studies had already been conducted on the basis of those initial findings.
This issue, which started small, spiraled out of control because of the incentives within the social science world. Namely, there was more prestige and financial support for new findings—and especially new, surprising findings—than for the yeoman's labor of replicating other people's work to confirm what we thought we knew.
Surprising new findings could make a career, while reworking old studies would lead to no recognition at all. Consequently, there was a growing backlog of studies that hadn't yet been replicated, which meant those studies, not yet checked by this replication system, were becoming the bases of new studies in the meantime.
Years later, this led to a flood of panic and "what do we do?"-style editorials, because, in essence, we didn't know what was real and what wasn't. There was so much potential nonsense in the system that it was impossible, in the short-term at least, to disentangle demonstrable fact from popularized fiction.
There's an element in both of these spaces of what's called the "garbage in, garbage out" principle, which says that if the data you feed into a process is nonsense, you're likely to get nonsense out the other end, even if the process itself is otherwise sound.
But the compounding element of these types of cycles is what makes them especially pernicious and dangerous.
An error on Wikipedia reinforced by a citation doesn't always remain a single error on Wikipedia: it spreads and spreads and spreads, becoming, for all intents and purposes, a new fact, at least as far as many publications and Wikipedia-perusers are concerned.
Likewise, a study that goes un-retested for too long can become woven into the infrastructure of social science supposition, becoming—for all intents and purposes—a brick in the foundation of all future research within that facet of scientific inquiry.
Removing such bricks is difficult, because you won't need to extract just the one: you'll need to remove all the bricks touching it, as well, and all the bricks touching those bricks. The same is true of false facts wearing the veneer of actual facts on Wikipedia or anyplace else we tend to look for such information.
Interestingly, some research indicates Wikipedia can be as factually accurate as, or more accurate than, centralized encyclopedias (like Britannica), but much of that research is at least a decade old, and more recent studies suggest that article quality varies wildly between topics and languages. So it's a good idea to check sources assiduously and stay alert for self-validating feedback loops.
Brain Lenses is part of the Understandary project portfolio.
You can find the Brain Lenses podcast at brainlenses.com or wherever you get your podcasts.