I (finally) finished reading Daniel Kahneman’s popular 2011 book, Thinking, Fast and Slow. It is a compelling read (and miraculously jargon-free) with loads of insights on the sources and consequences of our psychological biases. Above all, Kahneman demonstrates the general inability of the human mind to deal comfortably with statistics; rather, we seem prone to making decisions based on rapid and often superficial intuitions, usually drawn from a small number of oversimplified analogies. We have particular trouble when simple analogies come to mind quickly and vividly.
Many of his insights are applicable to the study of foreign policy decision-making and international relations, and to public policy-making more broadly. One of the concepts Kahneman describes, for example, is the planning fallacy. Individuals, business executives, and politicians alike commonly fall victim to this bias, making plans and forecasts that are unrealistic and overly optimistic, and that fail to take into account the statistical base-rate of similar cases. In the grip of the planning fallacy, people overestimate benefits and underestimate costs, spinning “scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or to deliver the expected returns – or even to be completed.” As somebody about to start a PhD in Political Science, I am no exception to this pattern.
This tendency towards over-optimism can be advantageous in certain situations. People feel happier when they are looking on the bright side of things, which may actually make them more successful; entrepreneurs are more likely to attract investors to their company when they project confidence in their business model; politicians need to sell a positive message of hope and transformation of the status quo in order to win votes.
However, this kind of best-case-scenario thinking has also played a large role in some of the major foreign policy errors of the twenty-first century. Western interventions in Iraq and Afghanistan, for example, were justified in part by optimistic visions of peaceful, liberal, democratic nation-states in the heart of the Muslim world. Policy-makers (and plenty within the media and academia) ignored the base-rate of success for other armed occupations that attempted to construct democratic institutions in war-torn states – a base-rate that was not very high. Not surprisingly, neither intervention did much to change it, despite an enormous investment of time, money, and human lives. Pro-intervention hawks pushing today for an expanded Western military role in places like Mali and Syria commit the same error.
Another psychological trait Kahneman explores, particularly relevant to the study of terrorism, is the way humans react to extremely rare events. In the early 2000s, for example, many travelers in Israeli cities avoided buses because they feared suicide bombings, even though the statistical probability of being harmed in a terrorist attack on a bus remained much smaller than that of being killed in an ordinary car accident. The reason lies in vividness: the thought of a suicide bombing conjures a much more vivid image of death and insecurity, triggering an instant and uncontrolled emotional arousal (something I also experienced briefly when I visited Israel in 2011). Even though people may “know” that the probability of a terrorist attack is low, they cannot escape the psychological discomfort caused by such a vividly imagined threat. This kind of mental bias goes a long way toward explaining popular support for the massive re-allocation of public resources to “anti-terror” programs and institutions in Western countries following the 9/11 attacks, even though a “rational” policy-making process would surely favour devoting those resources to more (statistically) deadly problems, such as heart disease, domestic violence, or traffic safety.
Another pitfall familiar to public-policy analysts is the sunk-cost fallacy, whose financial cousin is known as the “disposition effect.” In the financial world, this costly bias is easily observed in the overwhelming tendency of (novice) investors to sell stocks which have performed well, and to hold onto those performing poorly. Each individual investment is imagined as an “account” in the mind of the investor, who wants to enjoy the pleasure of closing each account as a “winner” and avoid the regret of selling a “loser.” Of course, the rational strategy is always to get rid of stocks which are least likely to do well in the future, regardless of whether they currently trade above or below their purchase price. But humans have a difficult time thinking this way, and the inexperienced investor ends up prioritizing immediate emotional reactions to individual stock purchases over his aggregate wealth.
Much the same effect occurs when political and military leaders invest in poorly performing foreign policy strategies. Rather than reverse course and admit that a policy or objective is unfeasible, they are likely to continue throwing good money after bad in an effort to avoid the humiliation of a costly failure. Faced with a failing effort to build legitimate state institutions in Afghanistan, U.S. political and military leaders have repeatedly chosen to double down and escalate the war effort, rather than make a broader assessment of whether those resources might be better spent elsewhere. The effect is especially strong because top decision-makers are mostly surrounded by subordinates whose own careers would be adversely affected by admitting their earlier advice was wrong, producing plenty of confirmation bias and groupthink.
These are just a few of the biased decision-making processes described in Kahneman’s book. There’s much more to chew on, so put it on your to-read list if you haven’t already.