There's a concept in economic and climate-related circles called "risk blindness."
The idea is that we tend to build our perception of risk, as individuals and communities, on prior norms.
This generally makes sense: we can't build accurate mental models of things that haven't happened, both because we have little reason to expect them and because we can't accurately imagine them, and history suggests that what has recurred in the past tends to reflect what we'll encounter in the future.
Consequently, our mental and structural risk models tend to be built around “the way things have always been,” which leaves us prone to the unexpected.
Our organizations, governments, businesses, and so on will sometimes develop complex risk models to dodge "black swan" bullets, hoping to account for dangers that aren't yet on our collective radars, but which could theoretically upend things at some point in the future.
Such events, being unprecedented, have unknown parameters, though, so we're forced to shape those preparations around previously unknown variables: we end up making choices about unpredictable things based on the shape and scale of past unknowns that have already surprised and harmed us.
All of which means we're relatively well-prepared for economic tumult and pandemics and warfare and climate irregularities that are roughly of a kind with what we've experienced in the past, but seldom more than that.
In an age in which technology is evolving fast, the climate is entering a new paradigm with frightening rapidity, and the scope, scale, and nature of our societies, weaponry, and information-distribution channels are becoming unrecognizable over a relatively short period, our preparations, emergency tools, and response plans will tend to be wrongly proportioned for the issues we face.
Consequently, our economic shocks are flabbergasting, our pandemics (and their toll) are scaled up to the point that our protocols no longer do what they're meant to do, and our guardrails for floods, storms, wildfires, and extreme heat consistently fail us: not because we haven't considered possible risks and prepared accordingly, but because we've prepared for yesterday's risks.
It's been posited that we're blind to newly proportioned risks because the cost of preparing for things that seem, according to our current understanding, outlandishly large or widespread or harmful is high.
Investing at such a scale means not investing in other (more contemporarily relevant) things, so we tend to under-prepare for disasters that are more damaging than anything we've seen before, because economically that's almost always the (seemingly) wisest and most justifiable course.
It's also been posited that difficult-to-imagine or uncomfortable outcomes are more likely to take a backseat to more easily envisioned versions of the same.
Many of the entities and individuals who deny the scientific consensus on climate change, for instance, base their denial on a sense that the grim consequences anticipated from not changing course on our emissions as quickly as possible are simply unlikely to happen.
There's a hint of the famous Upton Sinclair quote here, about the difficulty of getting a person to understand something when their salary depends on their not understanding it. But it's likely that many people who deny the abundant research staring them in the face are legitimately skeptical: they believe the presumed consequences of not changing course simply aren't possible, because they can't imagine living in a world in which such things happen.
Which is a reasonable heuristic, in some cases, but a potentially dangerous one at a moment of fast-arriving change and regularly defied expectations.