Gerd Gigerenzer’s Risk Savvy: How to Make Good Decisions

Author: Jason Collins

Published: September 10, 2014

I should start this review of Gerd Gigerenzer’s least satisfactory but still interesting book, Risk Savvy: How to Make Good Decisions, by saying that I am a huge Gigerenzer fan and that this book is still worth reading. But there was something about this book that grated at times, especially against the backdrop of his other fantastic work.

In part, I continue to be perplexed by Gigerenzer’s ongoing war against nudges (as I have posted about before), despite his recommendations falling into the nudge category themselves. Nudges are all about presenting information and choices in different ways - which is the staple of Gigerenzer’s proposal to make citizens “risk savvy”. Gigerenzer’s use of evidence and examples throughout the book also falls well short of his other work, and this is ultimately the element of the book that left me somewhat disappointed.

The need to make citizens risk savvy comes from Gigerenzer’s observation (which matches that of most of Gigerenzer’s faux adversaries - the behavioural scientists) that people misinterpret risks when they are presented in certain ways. If I say that screening reduces the risk of dying from breast cancer by 20 per cent, most people will interpret it to mean that 200 of every 1,000 people will be saved, rather than understanding that it means screening reduces the risk of death from 5 in 1,000 to 4 in 1,000 - effectively saving one out of every 1,000.
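
To see the arithmetic behind the contrast, here is a minimal sketch using the figures quoted above (the 1,000-person framing is the one Gigerenzer favours; the script itself is my own illustration, not anything from the book):

```python
# Relative vs absolute risk reduction, using the screening figures quoted above.
deaths_without_screening = 5   # per 1,000 women not screened
deaths_with_screening = 4      # per 1,000 women screened
population = 1_000

relative_reduction = (deaths_without_screening - deaths_with_screening) / deaths_without_screening
absolute_reduction = (deaths_without_screening - deaths_with_screening) / population

print(f"Relative risk reduction: {relative_reduction:.0%}")  # 20% - the headline figure
print(f"Absolute risk reduction: {absolute_reduction:.1%}")  # 0.1%, i.e. 1 in 1,000
```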

Gigerenzer’s contribution to this area is to show that if risks are presented as natural frequencies (i.e. as counts out of, say, 1,000 people), people are better able to understand the actual risks. This includes doctors, who are just as confused by statistics as everyone else, and who Gigerenzer suggests need training to communicate risks in ways that their patients can understand.
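
As a rough sketch of what the natural-frequency translation looks like in practice (the numbers below are illustrative assumptions of mine, not figures from the book):

```python
# Translating conditional probabilities into natural frequencies for a
# hypothetical screening test. All figures below are illustrative only.
population = 1_000
prevalence = 0.01            # 1% of women in this group have the disease
sensitivity = 0.90           # chance a woman with the disease tests positive
false_positive_rate = 0.09   # chance a woman without the disease tests positive

with_disease = population * prevalence                                # 10 women
true_positives = with_disease * sensitivity                           # 9 women
false_positives = (population - with_disease) * false_positive_rate   # ~89 women

# The natural-frequency question: of the women who test positive,
# how many actually have the disease?
share_true = true_positives / (true_positives + false_positives)
print(f"{true_positives:.0f} of about {true_positives + false_positives:.0f} "
      f"positive results are true positives ({share_true:.0%})")
```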

This ability to make citizens and experts risk savvy leads Gigerenzer to argue that people do not always need to be at the mercy of their biases. People can be educated to understand risks, and experts can present them in ways that others understand. He advocates risk literacy programs in schools, showing that simple decision tools can dramatically increase understanding of probability and statistics, although he spends little time discussing how well this education sticks. In making his point, Gigerenzer takes aim at the behavioural science crowd by claiming that natural frequencies wouldn’t have helped if people were truly subject to cognitive illusions - a strawman argument. As he does at semi-regular intervals through the book, Gigerenzer clouds an interesting argument with an attempt to engage in a battle that doesn’t really exist.

That said, I did enjoy this part of the book and have found myself quoting a lot of the examples. His arguments about how to present risk are compelling. Further, it is enjoyable to read Gigerenzer’s evisceration of the presentation of risk by various high-profile cancer organisations.

There are parts of the book where Gigerenzer is more pessimistic about the ability to educate the masses, such as when he channels Nassim Taleb and berates the finance industry for not understanding the difference between risk and uncertainty. In a world of uncertainty – where we do not know the probability of events – simple rules often outperform more complex models that are overfitted to past data. This provides a natural entry point to Gigerenzer’s well-established work (and the subject of some of his better books) on the accuracy of heuristics. Risk Savvy has plenty of additional advocacy for their use, with Gigerenzer arguing that we can be trained to use heuristics to make better decisions. Gigerenzer covers areas from marriage (set your aspiration level and choose the first person who meets it) to business to the stability of financial institutions, building on decades of evidence he has accumulated on the accuracy of simple rules.
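
The marriage rule is an aspiration-level (satisficing) heuristic, and it is simple enough to express in a few lines of code. The sketch below is my own illustration; the scoring function and threshold are arbitrary assumptions, not anything from the book:

```python
# A sketch of an aspiration-level (satisficing) heuristic: set a threshold,
# then take the first option that clears it. Illustrative only.
from typing import Callable, Iterable, Optional, TypeVar

T = TypeVar("T")

def satisfice(options: Iterable[T], score: Callable[[T], float],
              aspiration: float) -> Optional[T]:
    """Return the first option whose score meets the aspiration level."""
    for option in options:
        if score(option) >= aspiration:
            return option
    return None  # no option cleared the bar

# Example: job offers scored by salary (in thousands); take the first one >= 80.
offers = [{"firm": "A", "salary": 72}, {"firm": "B", "salary": 85}, {"firm": "C", "salary": 95}]
print(satisfice(offers, score=lambda o: o["salary"], aspiration=80))  # firm B
```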

Gigerenzer’s heuristics don’t always match up with his optimism that we can make people risk savvy. One heuristic he suggests is: “If reason conflicts with a strong emotion, don’t try to argue. Enlist a conflicting and stronger emotion.” He also recognises the limits to education, with heuristics such as “don’t buy financial products you don’t understand.” But given that a lot of people don’t understand compound interest, we might need to rely on the Dunning-Kruger effect to allow people to follow this rule and still make any investments.

One interesting point made by Gigerenzer is that there is still a role for experts (and even consultants) in a world where we use simple heuristics. Suppose we replace our complex asset allocation models with a 1/N rule - allocate our assets equally across N choices. This still leaves questions such as the size of N, what to include in the N choices, and when to rebalance. For many heuristics, there may be more complex underlying choices - although I imagine heuristics could be developed for many of these too.
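
For what it’s worth, the 1/N rule itself fits in a couple of lines; the residual decisions noted above (which assets, how many, when to rebalance) are exactly what the code leaves open. The asset list and portfolio value are my own illustrative choices:

```python
# A minimal sketch of the 1/N rule: split a portfolio equally across N assets.
def one_over_n(portfolio_value: float, assets: list[str]) -> dict[str, float]:
    """Allocate the portfolio value equally across the chosen assets."""
    return {asset: portfolio_value / len(assets) for asset in assets}

allocation = one_over_n(100_000, ["domestic equities", "international equities",
                                  "bonds", "property"])
print(allocation)
# {'domestic equities': 25000.0, 'international equities': 25000.0,
#  'bonds': 25000.0, 'property': 25000.0}
```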

Gigerenzer is also a stout defender of gut instinct - again, as covered in his other books. Gigerenzer suggests (and I agree) that data is often gathered due to a culture of defensive decision-making, not because it is the main basis for the decision. This is, however, the weakest area of the book, as Gigerenzer’s stories reek of survivorship bias. Gigerenzer notes that leading figures in business reveal in surveys that they rely on gut instinct and not data in making major decisions. But how many corpses of those who relied on gut instinct are strewn along the road of entrepreneurship?

As another example, Gigerenzer talks of a corporate headhunter who had put a thousand senior managers and CEOs into their positions. The headhunter said that he based his selections on a gut decision nearly all of the time. He was now being replaced by psychological testing. Gigerenzer puts this down to a negative error culture, with the procedures designed to protect the decision makers. But what is the evidence that the headhunter was good at his job and could outperform the psychologists armed with tests? Similarly, Gigerenzer suggests listening to those with good track records in business. Again, survivorship bias could make this a useless exercise. When talking of predictions of exchange rates in other parts of the book, Gigerenzer effectively makes this very same point - the successful people you see in front of you could simply be the lucky survivors.

However, the evidence Gigerenzer has developed over the years would make it folly for anyone in business to throw gut instinct out the window - or to completely discard Gigerenzer’s arguments. But the way he makes the case throughout Risk Savvy feels built on anecdote and weak examples.

There is one rule I am going to take away from the book - an extension of my usual habit of flipping a coin for decisions about which I’m indifferent. Gigerenzer suggests flipping a coin and, as it spins, considering which side you don’t want to come up. He used this example in the context of choosing a partner, but it’s not a bad way to elicit the gut instinct that you can’t otherwise hear.