Lottie Buttle

CMI Brief: Thinking, Fast and Slow

Updated: Mar 9


Traditional economic theory has long posited that humans are rational decision makers. But our behaviour is often inconsistent with what utility theory predicts. This deviation of human decisions from the rationality assumed by theory provides the premise of Daniel Kahneman’s 2011 bestselling book Thinking, Fast and Slow.


Kahneman beautifully summarises the decades of research that he and the late Amos Tversky (to whom the book is dedicated) conducted on human decision-making during their professional and personal relationship.


The book is divided into five parts. Kahneman initially introduces the two crucial decision-making components as the book’s key characters – the intuitive system 1 and the effortful system 2. He then outlines some of the heuristics – rules of thumb that simplify decisions – on which system 1 relies, and which can lead to cognitive biases – systematic errors in judgement.


The book highlights not only the research done by Kahneman and Tversky but also Kahneman’s subsequent work on the concept of the two selves. In addition, Kahneman raises issues fundamental to our understanding of human behaviour and how it can be manipulated – issues that matter for dissecting our response to the coronavirus pandemic. Although it may feel like it takes an eternity to read all 499 pages of the book, and even longer to comprehend it fully, the insight it provides into how we make decisions is indispensable.


Systems 1 and 2


The decisions we make in daily life require the use of two separate but mutually influencing systems of thinking. Kahneman describes the two systems as characters in a story and, as such, they are often in conflict with each other:


System 1 describes fast thinking, which uses intuitive, quick processes to arrive at a response seemingly automatically. System 1 uses a variety of heuristics as shortcuts to reduce cognitive load during decision making, usually producing satisfactory results. At its core, system 1 is driven by associative memory mechanisms in which prior experience shapes current decisions.


System 2, in contrast, is slow, effortful thinking, driven by logic and reason. It accounts for considerably fewer of the decisions we make and tires easily. For this reason, it often relies on information supplied by system 1.


Heuristics to know for the pandemic

Framing effects. The way equivalent information is presented can dramatically alter how attractive it is perceived to be.


For example, prospect theory describes how we tend to be loss averse in our decision making: losses loom larger in our minds than gains of the same size. When the case-fatality ratio from coronavirus in the UK is presented as a death rate of 2.9%, people perceive their risk of death as higher than when the same information is described as a survival rate of 97.1%. The difference in perceived risk could affect subsequent behaviour, such as how closely people decide to follow safety guidelines.
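
To make loss aversion concrete, here is a minimal sketch of the value function from Kahneman and Tversky’s prospect theory. The parameter values (alpha = beta = 0.88, lambda = 2.25) are the median estimates from Tversky and Kahneman’s 1992 paper, not figures from this book, so treat this as an illustration rather than a definitive model.

```python
# Illustrative sketch of the prospect-theory value function.
# Parameters are median estimates from Tversky & Kahneman (1992),
# assumed here for illustration; they do not appear in the book itself.

ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss aversion: losses loom ~2.25x larger than equal gains

def subjective_value(x: float) -> float:
    """Perceived value of a gain (x > 0) or loss (x < 0) from a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

# A 100-unit loss hurts far more than a 100-unit gain pleases:
print(subjective_value(100))   # ~57.5
print(subjective_value(-100))  # ~-129.5
```

The asymmetry between the two printed values illustrates why framing the same statistic as a loss (a 2.9% death rate) feels more alarming than framing it as a gain (a 97.1% survival rate).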


Availability heuristic. We often ignore statistics in favour of basing our decisions on our own experiences or those of people we know.


The availability heuristic leads us to perceive something as more common or significant when examples of it readily spring to mind. For instance, when asked, “How many people do you know personally who have tested positive for coronavirus?”, your answer can shape your estimate of how prevalent you believe the virus to be.


The number of people we know who have been personally affected by the pandemic could inflate our perception of how widespread the virus is and affect our decisions. To mitigate this bias, people should focus on objective data about the virus’s prevalence rather than relying on anecdotal evidence.


Confirmation bias. People aim to surround themselves with others who present information consistent with their views and opinions.


As humans, we seek to avoid cognitive dissonance – the uncomfortable psychological state we experience when we encounter new information that is inconsistent with our current beliefs. Confirmation bias is perpetuated by social media platforms, which maintain user engagement by connecting users with like-minded individuals.


This may have contributed to the ‘infodemic’ recently declared by both the UN Secretary-General and the Director-General of the World Health Organization. An infodemic is a situation in which people are bombarded with information, some of it inaccurate, as has occurred during the coronavirus pandemic. The spread of niche opinions could have damaging consequences for the pandemic response, such as decreasing trust in scientific advice.


Peak-end rule. People judge an event based on how they felt during the peak (the most intense moment) and at the end of the event, rather than on the experience as a whole. Our judgement of an experience is also barely affected by how long it lasts, a phenomenon known as duration neglect, as the sketch below illustrates. Easing lockdown restrictions gradually, for example, could lead to the end of a lockdown being experienced as relatively pleasant. People may, without realising it, remember the lockdown as less unpleasant than it actually was.
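
As a toy illustration of the peak-end rule and duration neglect, the sketch below scores two hypothetical discomfort series by averaging their peak and final values. The series and the simple peak-end average are illustrative assumptions, not data or a formula from the book.

```python
# Toy illustration of the peak-end rule: remembered unpleasantness is
# approximated by the mean of the worst moment and the final moment.
# The discomfort series (0 = fine, 10 = awful) are invented for illustration.

def peak_end_score(discomfort: list[float]) -> float:
    """Peak-end approximation of how an episode is remembered."""
    return (max(discomfort) + discomfort[-1]) / 2

abrupt_end = [4, 6, 8, 8]            # restrictions lifted at their harshest
gradual_easing = [4, 6, 8, 8, 5, 2]  # longer lockdown, but a gentle ending

print(peak_end_score(abrupt_end))      # 8.0 - remembered as harsh
print(peak_end_score(gradual_easing))  # 5.0 - remembered as milder, even
                                       # though it lasted longer (duration neglect)
```

The longer series is remembered as less unpleasant despite containing every moment of the shorter one, which is the signature of duration neglect.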


The book’s impact

Thinking, Fast and Slow inspired not only a wealth of new academic research but also new approaches to public policy, such as David Cameron’s Behavioural Insights Team (informally known as the Nudge Unit). The impact it has had on the way we view ourselves and our decisions has been profound, and its insights could offer useful policy suggestions for managing our response to the continuing COVID-19 pandemic.