What I learned from Daniel Kahneman’s Thinking, Fast and Slow

I (finally) finished reading Daniel Kahneman’s popular 2011 book, Thinking, Fast and Slow. It is a compelling read (and miraculously jargon-free) with loads of insights into the sources and consequences of our psychological biases. Above all, Kahneman demonstrates the human mind’s general inability to deal comfortably with statistics; instead, we seem prone to making decisions on the basis of rapid, often superficial intuitions drawn from a small number of oversimplified analogies, and we are in particular trouble when those analogies come to mind quickly and vividly.

Many of his insights are applicable to the study of foreign policy decision-making and international relations, and to public policy-making more broadly. One of the concepts Kahneman describes, for example, is the planning fallacy. Individuals, business executives, and politicians alike commonly fall victim to this bias, making plans and forecasts that are unrealistically optimistic and that fail to take into account the statistical base rate of similar cases. In the grip of the planning fallacy, people overestimate benefits and underestimate costs, spinning “scenarios of success while overlooking the potential for mistakes and miscalculations. As a result, they pursue initiatives that are unlikely to come in on budget or to deliver the expected returns – or even to be completed.” As somebody about to start a PhD in Political Science, I am no exception to this pattern.
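
The corrective Kahneman suggests, taking the “outside view,” amounts to a bit of arithmetic: anchor on the base rate for a reference class of similar projects, then move toward your inside-view estimate only in proportion to how predictive your specific evidence really is. Here is a minimal sketch of that idea; the function name, the numbers, and the simple linear-shrinkage form are my own illustration rather than anything taken from the book.

```python
def corrected_estimate(base_rate, intuitive_estimate, evidence_correlation):
    """Shrink an inside-view forecast toward the reference-class base rate.

    The adjustment toward the intuitive estimate is proportional to how
    predictive (correlated with the outcome) the specific evidence is.
    """
    return base_rate + evidence_correlation * (intuitive_estimate - base_rate)

# Hypothetical example: projects in this reference class overrun their
# budgets by 60% on average; my inside-view guess is a 10% overrun; I judge
# my specific evidence to be only weakly predictive (correlation ~ 0.3).
print(corrected_estimate(60, 10, 0.3))  # -> 45.0 (% expected overrun)
```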

This tendency towards over-optimism can be advantageous in certain situations. People feel happier when they look on the bright side of things, which may actually make them more successful; entrepreneurs are more likely to attract investors when they project confidence in their business model; and politicians need to sell a positive message of hope and transformation of the status quo in order to win votes.
