Friday, November 05, 2021

Kahneman & Tversky - System 1&2, cognitive bias

Described in the New York Times as the Lennon and McCartney of social science, they worked together on the psychology of judgement and decision-making. Daniel Kahneman, Emeritus Professor at Princeton, won the Nobel Prize in Economics in 2002 for questioning the role of the rational agent in economic models. He worked on this with Amos Tversky and said, "I feel it is a joint prize. We were twinned for more than a decade." Their work on the nature of thinking and cognitive biases has had a significant impact on teaching and learning.

At the beginning of their careers, they worked in different branches of psychology: Kahneman studied vision, while Tversky studied decision-making; both became experts in human error and bias. Thinking, Fast and Slow (2011) is essential reading if you are interested in how bias works in the mind. The less academic book The Undoing Project (2017) by Michael Lewis explains it in more readable form, telling the story of the relationship between Kahneman and Tversky and their different contributions.

Thinking, Fast and Slow

Kahneman refers to System 1 thinking as fast and immediate, without reflection. This is the instinctive, emotional, unconscious reaction we have to learning experiences, identified by Damasio as important in learning. System 2 is slower: here you can rationalise, reflect, reinforce, recall and apply your learning. We use a mixture of fast and slow thinking in the learning process and in the learning journeys we take, with their sense of achievement, success and confidence, or of having overcome difficulties. There is also a lot of cold, rational thinking, understanding and application that has to take place.

Kahneman is right to remind us of the existence of fast and slow ways of thinking, but he also warns us against the bias and mistakes that System 1's emotional and instinctive thinking brings in its wake. He asks us to be aware of emotional and instinctive bias, not to see it always as a virtue. Emotion and vividness strongly influence what we do, so we must take this into account as both teachers and learners.

On the penultimate page of Thinking, Fast and Slow (2011), Kahneman addresses this issue of combating emotion and bias: "System 1 is not readily educable". The lesson is that we should not look to change System 1, assuming we can eliminate fast and unconscious bias. His recommendation is clear: "The way to block errors in System 1 is simple in principle: recognise the signs that you are in a cognitive minefield, slow down, and ask for reinforcement from System 2." This is good advice, so how do we do this? Kahneman suggests that organisations use 'process' and 'orderly procedures', such as 'useful checklists… reference forecasts… premortems'. In other words, much is to be gained through organisational checks and balances, not training based on unreliable diagnostic tools. Training has a limited effect on changing biases, so focus on process and procedures.

Cognitive biases - anchors, frames and availability

To err is human: the mind is an evolved coping mechanism that has to deal with whatever is thrown at it, so it has limitations and is fallible. This matters when trying to understand why and how people make mistakes, an important issue in learning, where avoiding and correcting mistakes is often the goal.

They showed that we often 'anchor' ourselves first, then make judgements based on that anchor. You anchor, for example, on the asking price when buying a house. Anchoring can be measured, and it distorts judgements. This occurs in marking, where knowledge of previously high or low scores affects the marking at hand. Anchoring has even been shown to affect judges' decisions on prison sentences. For Kahneman and Tversky, anchoring results from fast System 1 thinking that supplies memories (anchors) to the more rational System 2, and you are unaware that it happens. They call this WYSIATI: what you see is all there is. This priming is both useful and dangerous: priming in learning can be used to reinforce a consistent learning experience, but it can also distort learning.

We also 'frame' decisions and actions: simply rewording a proposition affects how we react to it, especially when calculating risks. In addition, the 'availability bias' means that the more readily you can call something to mind, the more available it is to you, and the higher the probability you attach to it being true. The lesson here is that System 1 thinking, an instinctive process of which we are unaware, often just a feeling that we are correct, can be seriously misleading. This explains why learners often 'feel' they have learnt things but, when tested, have not.

The 'conjunction fallacy' leads us to overestimate the probability of conjoined propositions:

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which is more probable?

   Linda is a bank teller.

   Linda is a bank teller and is active in the feminist movement.

People err towards the second, even though the conjunction of two conditions can never be more probable than either condition alone.
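The arithmetic behind the fallacy is simple, whatever numbers you choose. A minimal sketch, with invented probabilities for the Linda problem:

```python
# Invented, illustrative probabilities -- the point holds for any values.
p_teller = 0.05                # assumed P(Linda is a bank teller)
p_feminist_given_teller = 0.9  # assumed P(feminist | bank teller)

# Probability of the conjunction: P(A and B) = P(A) * P(B | A)
p_teller_and_feminist = p_teller * p_feminist_given_teller

# However well the description fits, the conjunction cannot be more
# probable than the single condition alone.
assert p_teller_and_feminist <= p_teller
print(round(p_teller_and_feminist, 3))  # 0.045
```

System 1 substitutes a judgement of representativeness ("she sounds like a feminist") for the probability calculation, which is why the second option feels more likely.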

We also tend to avoid losses more than we welcome equivalent gains, a bias called 'loss aversion'. We see this in the purchase of insurance, where people buy on the basis of an imagined and predicted loss, even when the cost of the premiums is greater than the expected cost of that loss. This is an important finding in both psychology and economics.
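The insurance case can be put in numbers. A sketch with invented figures, using an illustrative loss-aversion coefficient of the rough size reported in Kahneman and Tversky's prospect theory work (losses loom roughly twice as large as gains):

```python
# All figures invented for illustration.
p_loss = 0.01      # assumed annual probability of the insured loss
loss = 10_000      # assumed size of the loss
premium = 150      # assumed annual premium

# The "rational" expected cost of going uninsured.
expected_loss = p_loss * loss  # 100.0

# The premium costs more than the expected loss...
assert premium > expected_loss

# ...but loss aversion weights the imagined loss more heavily,
# so the certain small cost of the premium "feels" like the better deal.
loss_aversion = 2.25  # illustrative coefficient, not a universal constant
felt_loss = loss_aversion * expected_loss  # 225.0
print(felt_loss > premium)  # True
```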

Such cognitive biases are part of human nature; even trained scientists and statisticians make simple bias errors. In pilot training, for example, it is important to accept that pilots will make errors under stress, and that you can't 'train' this out of them. What Kahneman and Tversky did was change pilot training forever, by refocusing on the cockpit decision-making environment and encouraging criticism of the decisions of the person in charge.

Kahneman also identified the Peak-End rule: we remember an experience, positive or negative, not as a whole but by its peak and its end. This is an important principle when designing learning experiences.
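The rule can be sketched with invented moment-by-moment ratings: the remembered value tracks the peak and the end, not the average of the whole experience.

```python
# Invented ratings (1-10) of a learning experience over time.
ratings = [4, 5, 6, 9, 5, 3]

# What actually happened, on average.
mean_of_whole = sum(ratings) / len(ratings)

# What the Peak-End rule predicts we remember: average of peak and end.
peak_end = (max(ratings) + ratings[-1]) / 2

print(round(mean_of_whole, 2))  # 5.33
print(peak_end)                 # 6.0
```

On these numbers the remembered value diverges from the average, and a weak ending drags memory down more than the middle does, which is why the closing moments of a course or lesson matter so much.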


Kahneman and Tversky's remarkable work on cognitive bias, along with their characterisation of thinking in terms of fast and slow systems, has had a huge influence on psychology, economics and many other domains, as well as on learning theory and practice. Cognitive bias, for example, is now a significant theme in the processes of teaching, learning and assessment. Unfortunately, their lessons have not been wholly adopted: the focus is still on training people out of such biases, rather than accepting that biases exist and changing how we do things. This is where we may be going astray, in seeing biases as easily corrected through training. They are not. Their view is that emotion and instinct are poor masters, and that reason must be used to counter their distorting influence.


Kahneman, D. (2011) Thinking, Fast and Slow. Penguin.

Lewis, M. (2017) The Undoing Project. Penguin.

