Repetition-Induced Truth Effect
The word "truth" is tricky to define as it means somewhat different things in different realms of inquiry.
In some contexts, something that is true aligns with reality or demonstrable fact, while something that is false does not.
Thus, if I say the world is round, that's a truthful statement because the world is roughly spherical. If I say the world is flat like a pancake, that would be considered false, because it doesn't line up with reality.
According to some definitions of "truth," if we live in a world in which all the data we've gathered seems to indicate the world is flat, despite the world actually being round, it is true that the world is round, even if we don't yet realize it. According to other definitions, it would be true that the world is flat in that moment, because everything we seem to know says that's the case; by this reasoning, truth can change over time as our understanding of reality changes.
You might also argue that truth is subjective and based on our personal experiences of reality: you and I could witness the same car accident and, because we were standing in different places, because we've had different life experiences, and because I have bad eyesight and you don't, walk away with different understandings of what happened.
There's a universal "thing that actually happened" in that context: a sequence of events actually occurred and ultimately led to the car accident. But according to this perspective, your truth will be different from my truth, and that's just to be expected.
Interestingly, it seems to be possible to influence a person's impression of what's true by simply exposing them to whatever it is you want them to believe over and over again.
This is often called the "repetition-induced truth effect" (or sometimes just "the truth effect" or "repetition-induced belief"), and it's part of why companies with products to sell spend gobs of money to repeat the same slogans and catchphrases over and over again when we listen to the radio or watch TV, and it's part of why politicians are so keen to hit potential voters with the same message as many times as possible before voting day.
There's some evidence that there's a point of diminishing returns with this approach, so a handful of exposures to a message will increase a person's belief in it, but more exposures beyond that point won't have any real effect.
Alarmingly, there's also evidence that people who know something that is true can be convinced, through the use of this effect, to instead believe a false alternative to that true thing.
There's relevance here for most mediums of communication, but this is perhaps especially notable for social networks and other always-on, free-to-use platforms, where distributing false information in large quantities is easy and friction-free, and where putting a whole lot of nonsense into the world (that some people will eventually come to believe) is consequently quite attainable.
One theory as to why our brains seem to behave this way is called the "fluency hypothesis."
Basically, our brains have an easier time recalling something they've been exposed to over and over again, and as a consequence are more likely to summon up that oft-repeated thing when it’s needed, rather than a less-repeated thing, even if the latter is demonstrably true and the former is not.
This theory has some support, but is still being investigated (as is the repetition-induced truth effect), so we unfortunately do not yet know why we seem to be so susceptible to this kind of manipulation—though even lacking a complete explanation, it doesn't hurt to note this susceptibility and counter it where feasible.