In the aftermath of the terrorist attacks in Paris and the security threats in Brussels, I have struggled with mixed feelings: empathy with journalists trying to make sense of fragmented news to keep people up to date, and with politicians (whose shoes I would not like to be in) facing decisions to protect their residents while managing limited resources and capacities. While I trust the government to do its best (even if not ‘the best’), I have tried to remain critical of, and cautious about, the information flow and my own reactions.
Reading Kahneman has struck me as timely and insightful for further comprehending the way fear works; both in relation to the attacks and threats themselves, and to the disturbing spill-over effects they have had on immigration and racism.
‘..widespread fears, even if they are unreasonable (…) should not be ignored by policy makers, [they] must protect the public from fear, not only from real dangers.’
Kahneman explains that fear is not always based on real threat. It is based on our personal experiences and our biases, which we simply cannot resist. When we can easily recall the memory of an event, we will likely exaggerate it and believe it can happen again. For example, a victim of a car or bike accident will be particularly afraid it will happen again, even if knowing the statistics say the opposite. Luckily, our fear diminishes as our memories fade over time. Studies of estimates of causes of death demonstrate our misconceptions about likelihood: while strokes cause almost twice as many deaths as all accidents combined, 80% of all respondents judged accidental death to be more likely. Death by accident was judged to be more than 300 times more likely than death by diabetes, but the true ratio is 1:4.
Unusual events attract disproportionate attention and are consequently perceived as less unusual than they really are. Our exposure to such events, and the emotions they cause, distorts our judgment and our opinions without us even knowing. Media coverage is both shaped by and shapes what the public is interested in. Sometimes journalists report a minor event, leading to public panic and ending in large-scale government action. While a country such as Israel has faced terrorist attacks, the weekly number of casualties almost never came close to the number of traffic deaths. Yet people do not fear the potential drunk driver who may threaten their life.
We can infer the general from the particular, but not the other way around. In other words, being presented with statistics first will not change a person’s judgement and opinions, even if they know for a fact that those judgements are wrong. A person is, on the other hand, more likely to learn from first being presented with individual, compelling cases, preferably simplified and coherent ones. This explains why populist politicians effortlessly charm voters with their simple storylines and outperform researchers and rights defenders who present countering facts and figures. The lesson is that we need to communicate differently, to better match the psychology of the human mind.
People prefer lying to themselves over facing the facts, even when those facts are staring them in the face. We would rather create a coherent story, even if it is inaccurate. In other words, our intuition beats our logic. We believe in the simple storyline, and we have an ‘almost unlimited ability to ignore our ignorance’. We tend to stick with our beliefs no matter how absurd they are, especially if they are shared by a community of like-minded people. When our intuition turns out to be wrong, we simply reconstruct our former beliefs, and often we cannot even imagine that we ever felt differently. When people say ‘I knew this would happen’, it is merely a belief, and often a reconstructed one. We tend to give politicians little credit for decisions that turned out successful, because we tell ourselves afterwards that such a move was obvious. On the other hand, we blame them for decisions that worked out badly.
We trust decisions made by humans more than those made by algorithms; we prefer the natural over the artificial. Decision makers believe they can overrule the formula based on additional information and their own intuition. Unfortunately, they are wrong more often than not. The more experts believe they know, the more deluded and overconfident they become. On top of that, they are resistant to admitting when they are wrong. At the end of the day, though, errors are inevitable because the world is both difficult and unpredictable.
People prefer actions based on pretended knowledge to no solutions at all. Somehow we cannot bear the thought that decision makers may be merely guessing, and that their ‘educated guesses are no more accurate than blind guesses’. Even more disturbing is the thought of politicians guessing under extreme uncertainty when the stakes are high. The fact that large historical events have been determined by luck is still incomprehensible to many.
Read also my previous post ‘Too Slow for Fast Populism?’ based on the first part of Kahneman’s ‘Thinking, Fast and Slow’.