Combating the onslaught of low-quality or outright false analysis circulating online means wielding simple, effective tools, not complex appeals to the rational mind. It is unreasonable for any person to think they can reason through their own cognitive errors, and unrealistic to expect readers of news and political analysis to carry with them the toolkit needed to sort fact from fiction, because such a rationalist's toolkit exacts a huge cognitive toll. What we should aim for instead is a reasonable approximation of reality, a constant attitude of 'true enough to believe provisionally'. Only then might we begin to make progress against the constant slush of opinion pieces dressed up as analysis and the infamous 'fake news'.
Cognitive biases are systematic mistakes made during unconscious thought, and they are near constant. Usually they are just mental shortcuts that spare us taxing mental contortions every time we face uncertainty, but they often lead us into fundamental errors of reasoning. Errors like confirmation bias, hindsight bias and the fundamental attribution error plague our attempts to understand the world. This is especially so when we try to interpret it through the lens of someone else's writing, where our own errors are compounded by the mistakes already made by the writer. So powerful are these effects that they persist even while a reader is being reminded of them. If we as individuals are to better understand the world through reading, writing and discussing, we need an effective countermeasure against these cognitive short-circuits. Unfortunately, the intuitive solution is a self-defeating one: appeals to teach people to become good 'bullshit detectors' and to be aware of their own biases are futile.
For the sake of illustration, take your reading of this passage. In reading the opening paragraphs of this article you likely thought something like 'I agree that this is an issue, and people should learn about these kinds of bias so they understand their impact, like I do'. The catch-22 is this: awareness of cognitive bias is known not to reduce its prevalence. It is like looking at an optical illusion and being told that the lines are the same length, or the colours the same shade, yet still seeing the false image painted by your million-year-old brain.
Worse still, being smart is no defence. Decades of research indicate that people with superior cognitive resources are more susceptible to these kinds of mistakes. The phenomenon is reasonably well understood: smarter people are better able to find patterns and justifications for what their intuitive animal brain already 'knew' all along.
Where we stand now is an awkward in-between realm: raising awareness of cognitive bias won't stem the tide of poor-quality analysis and news, because knowing you fall foul of cognitive biases doesn't mean you stop doing so. Worse, the people who should be best placed to spot these mechanisms are actually more prone to falling foul of them. Clearly, we can't rely on deliberately reasoning through a suspicious-looking news article to get to the bottom of the issue it raises, so we must look elsewhere.
Heuristics, or approximate 'rules of thumb', are in one sense a kind of cognitive bias, but we can weaponise them in the war against being wrong. By carefully selecting such rules as a group, and agreeing to apply them generally to limit the impact of cognitive bias, we can gain real ground against prolific fake news and shoddy analysis. The classic heuristic that 'if something seems too good to be true, it probably isn't' has countless counter-examples, but applied to new pieces of information entering our awareness it takes on a new utility. As a tool for perusing the daily output of columnists and pundits, it means bringing a critical eye to anything that strikes you as a very convenient fit with what you already believe. Broad attention to such rules would mean fewer articles making bold assertions supposedly backed by clear evidence that then melts away at the first sign of real scrutiny.
While not exactly amounting to a defence of folk wisdom, what we are encouraging is an active discussion of the relative merits of different heuristics. Group discussion of a particular line of reasoning is an excellent way to identify gaps, and has a long and celebrated history going back to Socrates. The alternative is a denial of the power of cognitive bias, or worse, an insistence on the unreasonable position that you are not susceptible to it. Since one of the most infuriating cognitive biases is the tendency to overestimate our own capabilities and intelligence, accepting that smarter people have more use for heuristics, precisely because they are more susceptible to cognitive bias, would at least play to the average punter's ego.
Centrethought is an online publication that provides a forum for young people to unpack the issues that affect our world in a rigorous and engaging way. We are politically unaligned, and publish writing from young people who come from a diverse range of academic and cultural backgrounds. Get in touch here: about@centrethought.com