Y2K Fallacy
This seems like ancient history now, but in the lead-up to the transition from the year 1999 to 2000, there was a firestorm of panic and media coverage related to what became known as the Y2K or Millennium Bug.
This bug was related to how most computer code was written at the time.
More specifically, date designations in code back then (for historical and practical reasons) used only two digits for the year, so 1999 would be stored as "99" and 2000 as "00," which is a problem because "00" could also be read by software as the year 1900.
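To make the mechanics concrete, here is a minimal sketch (in Python, purely illustrative rather than any real system's code) of the two-digit-year problem: if software stores only the last two digits of the year and assumes the century is always the 1900s, the stored value "00" reads back as 1900 instead of 2000.

    def to_two_digit_year(year: int) -> str:
        # Store only the last two digits of the year, as many legacy systems did.
        return f"{year % 100:02d}"

    def naive_full_year(stored: str) -> int:
        # Read the value back assuming the century is the 1900s: the heart of the bug.
        return 1900 + int(stored)

    for year in (1999, 2000):
        stored = to_two_digit_year(year)
        print(year, "stored as", stored, "read back as", naive_full_year(stored))
    # 1999 stored as 99 read back as 1999
    # 2000 stored as 00 read back as 1900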
The panic was focused on concerns that computer systems, globally, would glitch and/or collapse because the machines would think it was 1900, or simply go haywire because their files and databases would be disoriented by the seemingly conflicting dates.
When the year 2000 arrived, though, relatively few issues were reported, which led to a sort of public denial about the entire episode.
Conspiracy theories abounded about what the Forces That Be were up to, whether the scare was a lie, and who stood to gain from all the panic that was sown over the preceding years.
The reality, though, is that nothing big happened largely because vast sums of money were spent and a whole lot of people expended heroic amounts of effort in the lead-up to the transition to make sure code was changed—often at very fundamental levels—so few major bugs manifested.
The Y2K Fallacy—which is my term for this perceptual issue, and which is not a true fallacy—refers to a common narrative about the Y2K Bug: that we got all worked up about nothing, and we know this because few bug-related issues were reported when the year 2000 arrived.
This narrative primarily seems true to people who are ignorant of the estimated $300-500 billion (though it may have been higher) that was spent by government and private entities to fix this issue before it became an issue.
Most experts have said this was money well-spent, and although we can't be certain about what would have happened had they not made those investments, it probably wouldn't have been good.
This narrative is similar to a business firing its security guard because no crimes have been reported at the business lately.
The presence of the guard may be what prevented any crimes from occurring in the first place: we may accurately perceive the outcome (no crime) while inaccurately identifying its cause (the guard's presence).
Thinking in these terms, it's possible to imagine other scenarios in which we make bad choices because of this same flavor of fallacious thinking.
If NASA deflects an asteroid that would have otherwise struck Earth, some folks on the ground might rally to defund NASA and its asteroid-deflection program because they've never experienced an asteroid impact, which implies—looking at things through this semi-fallacious lens, at least—that we don't need to fund such efforts.
The deflection program may seem like a waste of money to many people because all they see is a lack of asteroid impacts.
This suggests that we're prone to making judgments about causes and effects based on incomplete or incorrect data, which in turn suggests it might be prudent to invest in efforts to educate the public about these successes if we want to be able to fund similar efforts in the future.