Cognitive Reflection and Misinformation
The concepts of System 1 and System 2 thinking—popularized by Daniel Kahneman in his book Thinking, Fast and Slow—refer to our dual capacity to think quickly and intuitively, and to slowly reflect and cogitate when circumstances require thorough assessment.
Said another way, we have mental processes that allow us to make rapid decisions with little information, which can help us survive dangerous situations (and make relatively unimportant decisions quickly and with minimal effort), but we can also pause, take our time, and use a different set of processes to reach more complex, complete conclusions when warranted.
Among other things, the book posits that there’s often a tension between these two types of thinking: in some cases we’d benefit from a quick, knee-jerk response to external stimuli but instead succumb to “paralysis by analysis,” while in other (perhaps more frequent) cases we’d benefit from slower, more time- and energy-intensive pondering but instead go with our gut, much to our detriment.
Recent research into false information (and how we respond to and process it) has found that the system we use to parse misinformation partly determines whether we fall for it or not, and that the type (and valence) of said misinformation may bias us toward one system or the other.
In this context, what Kahneman referred to as System 2 thinking is called “Cognitive Reflection,” which basically means our capacity to notice and weigh our gut response to things, and to bypass that intuition when our more logical, rational selves deem it appropriate.
And those researchers found that our social groups, and specifically how identity-based those groups are (how much our friendships are predicated on our political leanings, for instance), can be highly predictive of how likely we are to believe misinformation that aligns with our existing beliefs.
Subjects who engaged in more cognitive reflection (pausing to assess whether the news stories they were given were legitimate) were better at sorting real headlines from fake ones, but this didn’t seem to protect them from believing false political rumors that aligned with their political biases.
This is just one relatively limited study, so these findings aren’t the last word on the subject. But the implication is that while our more logical thinking processes can help us sort misinformation from real-deal fact, the degree to which our self-identification and friend groups are shaped by ideological beliefs (like politics) seems to be a more potent force in some cases. That force can overwhelm our falsehood filters, making us more likely to believe incorrect things when they reinforce our sense of the world and how it works.