I have long been fascinated with the research on human biases and how they can affect our decision making. I’ve read a bit of Kahneman’s work and have seen it cited in many other books. I never quite got around to finishing ‘Thinking, Fast and Slow’ though. Rolf Dobelli’s The Art of Thinking Clearly provides an easily digestible summary of the research on human biases, neatly packaged into a list of 100 biases, each with its own short description.
In the following, I collect some of the biases I find most useful to be aware of:
Survivorship bias leads us to overestimate the likelihood of success in an endeavour or profession. For instance, we see many successful musicians and professional athletes but far fewer failed ones, which leads us to grossly overestimate our own chances of becoming a successful musician or athlete. This bias is especially devious in the realm of finance. Fund managers who happen to be successful by sheer luck are more likely to stay in the business for a long time. Thus, the population of fund managers is tilted towards these survivors, giving us the illusion that it is easier than it really is to beat the market.
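A quick simulation makes the fund manager example concrete. The sketch below is my own illustration, not from the book: assume every manager simply "beats the market" in a given year with a coin-flip probability of 0.5, and drops out of the sample after a losing year. Even with zero skill anywhere, a handful of managers ends up with a flawless decade-long record.

```python
import random

random.seed(42)

N_MANAGERS = 10_000
N_YEARS = 10

# Each manager "beats the market" in a given year with probability 0.5 —
# pure luck, no skill involved. Losers drop out; winners stay visible.
survivors = N_MANAGERS
for year in range(N_YEARS):
    survivors = sum(1 for _ in range(survivors) if random.random() < 0.5)

print(f"Managers with a {N_YEARS}-year winning streak: {survivors} of {N_MANAGERS}")
# On average about N_MANAGERS / 2**N_YEARS ≈ 10 managers look like
# geniuses by luck alone — and they are the ones we read about.
```

If we only ever observe the survivors, their track record looks like evidence of skill when it is, at least partly, a selection effect.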
Social proof leads us to believe that when a lot of people are doing or believing something, it must be good for us to do or believe the same. Authority bias is a variation of this, expressing that we are even more likely to give excessive weight to what those in authority are doing and saying.
Confirmation bias leads us to see and rationalise observations in a way that confirms our beliefs. For instance, if my favourite sports team loses, it will be due to bad luck (or incompetent referees), but if they win, it will be due to the quality of their players, the competent coach and, not least of all, their fans. This bias is really the underlying cause of many of the other biases listed in The Art of Thinking Clearly.
Contrast effect leads us to judge something not on an absolute scale but in relation to other things presented to us in the same context. For instance, if we see an expensive car next to a cheap car, we will make different judgements than if we are presented with two cheap cars. Some real estate agents like to exploit this by showing prospective buyers particularly hideous properties before showing the property they think will be suitable for the buyer. A related bias is the anchoring effect, which causes our estimates to be pulled towards an initial reference value, even an arbitrary one. For instance, if we are given the population numbers of ten cities we are unfamiliar with, we will provide an estimate for an eleventh city within the same range.
Story bias makes us more likely to believe something to be true if it is presented in the form of a story. For instance, we are less likely to recall the information ‘Accidents are likely in construction sites’ than we are to remember ‘Stephen forgot his helmet when he rushed out in the morning after a heated argument with his wife. A pole that came loose did not care and connected fatally with his head. His wife never forgave herself.’
Regression to the mean leads to exceptional observations being followed by more mediocre ones. For instance, if we invest in a share and lose 80% of our investment, our next investment is likely to perform closer to average. The problem is that we attribute the improvement in performance to changes we may have introduced (e.g. I will invest in coal miners instead of technology companies) rather than pure chance, thereby easily overestimating their effectiveness. This of course is reinforced by the Confirmation bias mentioned above.
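This, too, is easy to demonstrate with a small simulation of my own (not from the book). Assume, purely for illustration, that every investor's yearly return is the same random noise around a 5% mean, with no skill differences at all. Pick out the worst performers of year one and watch them "improve" in year two without changing anything:

```python
import random

random.seed(0)

# Assumption for illustration: every yearly return is pure noise around
# the same mean — 5% expected return, 20% volatility, no skill anywhere.
def yearly_return():
    return random.gauss(0.05, 0.20)

n = 10_000
year1 = [yearly_return() for _ in range(n)]
year2 = [yearly_return() for _ in range(n)]

# Select the worst decile of year-one performers.
cutoff = sorted(year1)[n // 10]
losers = [i for i in range(n) if year1[i] <= cutoff]

avg_y1 = sum(year1[i] for i in losers) / len(losers)
avg_y2 = sum(year2[i] for i in losers) / len(losers)
print(f"Worst decile, year 1: {avg_y1:.1%}; same group, year 2: {avg_y2:.1%}")
# The group rebounds towards the 5% mean with no change in strategy —
# pure regression to the mean.
```

Anyone in that group who switched strategies between the two years would be sorely tempted to credit the switch for the rebound.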
Overconfidence simply expresses that we think ourselves way smarter than we actually are.
Halo effect leads us to believe that attractive people are more trustworthy and knowledgeable than those that are less attractive.
Action bias describes our tendency to take action rather than wait passively. This can result in us acting to our own detriment, in cases where a wait-and-see approach would have been better.
The Hedonic treadmill describes that we overestimate the impact of achievements and material gains on our happiness. A new job with better pay may make us happier temporarily, but according to extensive research, we are more likely than not to regress to our original happiness level.
Black swan events are events that have a very large effect on our lives, yet are very unlikely: a catastrophic stock market crash, for instance. Our bias is that we equate very unlikely with impossible.
In-group bias makes us more likely to help those that we perceive to be in the same group as us. This was also explored in detail in Blueprint: The Evolutionary Origins of a Good Society by Nicholas A. Christakis, as one of the key elements of any human society.
Planning fallacy – a favourite of mine as a software engineer, and certainly a close cousin of the overconfidence bias – leads us to believe that we can get more things done than we actually can.
What I liked best about The Art of Thinking Clearly is that it provides a clean and concise summary of vast reaches of behavioural research. I do think, though, that a lot of the descriptions should be taken with a grain of salt. Dobelli seems to have a tendency to exaggerate the importance of the biases and to suggest more dramatic corrections of our behaviour than necessary (‘never do this, always do that instead’). In fact, I think many of the biases can work to our advantage in the right circumstances. The key, though, is to be aware of them, and also to be aware that we are likely to succumb to them notwithstanding our efforts to the contrary.
I think our tendency to be overconfident in our abilities (Overconfidence bias) and see the world how we like to see it (Confirmation bias) is a danger to our own well-being and the well-being of others. Thus I think reading The Art of Thinking Clearly is a worthwhile endeavour in that it can teach us a bit of something often sorely lacking in our modern world: humility.
Picture credit: Sasha Freemind