Budget week might be a good time to remind ourselves of the fallacies on which bad policies feed. Last year the University of Michigan’s Professor Richard Nisbett wrote a short book called Mindware, about the ways in which people deceive themselves and others about statistical reasoning. Since reading it, I have been noticing examples of the art everywhere.
Think of Nisbett’s book as a field guide to a nature reserve. Keep an eye out for the Sunk Cost fallacy, wherein you argue that a nuclear power station or a supersonic airliner must be built because you have spent a fortune on it already. It should never matter how much cost has already been sunk into a project: it is only worth spending more if it is cost-effective.
Harder to spot is the Opportunity Cost. Money spent on one thing cannot be spent on another. The US Department of Homeland Security assesses the cost of Donald Trump’s wall on the Mexican border at $21.6 billion, which could buy quite a few bridge repairs, warships, or tax cuts instead – just to name the president’s own priorities. In the 2010 Comprehensive Spending Review, the British government deferred strategic road schemes with an average benefit-cost ratio of 6.8, while pressing ahead with High-Speed 2, whose estimated benefit-cost ratio at the time was 1.2.
A common but shy animal is Loss Aversion. This is the peculiar fact that people mind more about losing something than they are pleased by gaining something of equal value. If you suggest tossing a coin on the terms that your friend gives you £100 if it is heads and you give him £101 if it is tails, you’ll find he is not very interested. In general the reward has to be twice as great as the loss before people are keen to take on such risks. The beneficiaries of business rate changes [or National Insurance changes!] are more numerous than the losers, but it is the latter we hear from.
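To see why the friend turns the bet down, it helps to do the sums with losses weighted about twice as heavily as gains, the rule of thumb just mentioned. This is only a back-of-the-envelope sketch of mine, not Nisbett’s or Kahneman’s own model; the 2.0 weight and the felt_value helper are illustrative assumptions.

```python
# Rough prospect-theory-style sketch: losses are felt about twice as
# strongly as gains of the same size (the "twice as great" rule above).
LOSS_WEIGHT = 2.0

def felt_value(pounds):
    """Psychological value of a cash gain or loss (illustrative only)."""
    return pounds if pounds >= 0 else LOSS_WEIGHT * pounds

# The coin toss from the friend's point of view: win 101 on tails, lose 100 on heads.
expected_cash = 0.5 * 101 + 0.5 * (-100)                        # +0.50: a good bet on paper
expected_felt = 0.5 * felt_value(101) + 0.5 * felt_value(-100)  # -49.50: feels like a bad one

print(f"expected cash value: {expected_cash:+.2f}")
print(f"expected felt value: {expected_felt:+.2f}")
# Only when the prize is roughly doubled (about 200 or more on tails)
# does the felt value stop being negative and the bet become tempting.
```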
A close cousin is the Endowment Effect – that you are reluctant to give up something you already have. The misery of many Remainers about leaving the European Union is partly explained thus: the loss of membership looms larger in their minds than the possible opportunities outside. The average Leave voter perhaps never felt that the European Union was much of an endowment anyway.
In the same habitat lives the Status Quo Bias: our reluctance to embrace change. The MEP Daniel Hannan tells me that he could find few hedge-fund or private-equity firms in the City of London that thought the Alternative Investment Fund Managers Directive of 2011 was a good idea before it came into force. Now that it exists, many say they would be reluctant to lose AIFMD’s clammy embrace. Clever public policy can exploit this. In Germany only 12% of people are organ donors, while in Austria 99% are – because Germans have to opt in to organ donation, while Austrians have to opt out.
Over there is Confirmation Bias – the tendency to look only for evidence that supports your hunch and ignore that which challenges it. We’re all guilty. And try to spot the Availability Heuristic (first identified by Daniel Kahneman and Amos Tversky): the more easily an example comes to mind, the more frequent or plausible the phenomenon seems.
Listen for the call of the Spurious Correlation, so beloved of university press officers. The idea that second-hand smoke bans could drastically reduce heart attacks started with a study in Helena, Montana, which saw a 60% fall in heart attacks in the six months after it banned smoking in 2003, and a rebound when the law was struck down. No such dramatic effect has ever been seen since, and the vast majority of studies find no evidence that second-hand smoke causes heart attacks. The Helena effect was a fluke (I favour smoking bans, but on other grounds: smoke is unpleasant). A website called Spurious Correlations has fun with this: did you know that America’s crude oil imports eerily track its consumption of chickens?
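The chickens-and-oil kind of coincidence is easy to manufacture: any two series that merely drift over time will often correlate strongly by pure chance. A small illustration of my own (the helpers below are mine, nothing to do with the Spurious Correlations site’s data):

```python
import random

def random_walk(steps):
    """A series that just drifts: the running total of coin-flip moves."""
    total, series = 0, []
    for _ in range(steps):
        total += random.choice([-1, 1])
        series.append(total)
    return series

def correlation(a, b):
    """Plain Pearson correlation coefficient."""
    n = len(a)
    mean_a, mean_b = sum(a) / n, sum(b) / n
    cov = sum((x - mean_a) * (y - mean_b) for x, y in zip(a, b))
    sd_a = sum((x - mean_a) ** 2 for x in a) ** 0.5
    sd_b = sum((y - mean_b) ** 2 for y in b) ** 0.5
    return cov / (sd_a * sd_b)

# Two series with nothing whatever in common...
for trial in range(5):
    r = correlation(random_walk(200), random_walk(200))
    print(f"trial {trial + 1}: r = {r:+.2f}")
# ...yet the correlation frequently comes out impressively large.
```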
In the grassland, keep an eye out for the False Positive. A 99%-accurate way of identifying terrorists in London will flag some 100,000 people, all but (say) ten of whom are innocent: because genuine terrorists are vanishingly rare, even a 1% error rate applied to millions of innocent people swamps the handful of real culprits (a sketch of the arithmetic follows below). In the swamp live two species of Hoc: Ad and Post Ergo Propter. Justifying the failure of predictions with new excuses is a favourite of climate science (ad). Nisbett overheard somebody urging a friend not to quit smoking on the advice of a doctor, because he knew of two people who did just that and died (post).
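Back to the false positives for a moment: the arithmetic is worth doing once. In the sketch below the population screened and the number of genuine terrorists are my own illustrative assumptions; the column gives only the 100,000 and the “say, ten”.

```python
# Base-rate arithmetic behind the False Positive. The population and the
# number of real terrorists are illustrative assumptions, not the column's.
population = 10_000_000      # say, roughly everyone screened in London
real_terrorists = 10         # say, ten genuine targets
accuracy = 0.99              # "99%-accurate": 1% false alarms, 99% detection

innocents = population - real_terrorists
false_positives = (1 - accuracy) * innocents   # ~100,000 innocent people flagged
true_positives = accuracy * real_terrorists    # ~10 genuine ones

flagged = false_positives + true_positives
print(f"people flagged:          {flagged:,.0f}")
print(f"share of those innocent: {false_positives / flagged:.2%}")  # about 99.99%
```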
The nature reserve contains Hippos. This is the fallacy that the best way of deciding what to do within a business or a government is to take the Highest-Paid Person’s Opinion. Firms like Google, and political campaigns beginning with Barack Obama’s in 2008, use a different and much more effective approach, called A/B testing: try pairs of options and see which works best.
When Dan Siroker of Google joined Candidate Obama’s campaign, instead of guessing whether to put “learn more”, “sign up now” or “join us now” on the button to be clicked, he tested the various options. “Learn more” was the clear winner. Thanks to the internet, this is a lot easier than it used to be, though of course it is not practical for choosing the designs of nuclear power stations.
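For the curious, this is roughly what such a test amounts to in code. The click counts below are invented for illustration, not the campaign’s actual numbers, and the two-proportion z-test is just the standard textbook comparison, not necessarily the method the campaign used.

```python
from math import sqrt

# Invented counts for illustration: visitors shown each button, and clicks.
variants = {
    "learn more":  (10_000, 1_100),
    "sign up now": (10_000,   900),
    "join us now": (10_000,   850),
}

def z_two_proportions(n1, x1, n2, x2):
    """Two-proportion z-statistic for the difference in click-through rates."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

for name, (shown, clicked) in variants.items():
    print(f"{name:12s} click rate {clicked / shown:.1%}")

n1, x1 = variants["learn more"]
n2, x2 = variants["sign up now"]
z = z_two_proportions(n1, x1, n2, x2)
print(f"z = {z:.2f}")   # |z| above about 2 means the gap is unlikely to be luck
```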
It need not be confined to internet operations. Supermarkets use it to try out different store layouts. One of the coalition government’s best initiatives, from its “nudge unit”, was the plan to run randomized controlled trials of policies, based on Ben Goldacre’s argument that the efficacy of a policy was rarely tested with the same rigour as the efficacy of a drug. Sadly, since the departure of Sir Oliver Letwin from government and the privatization of the nudge unit, not much more has been heard of this idea.
One intriguing suggestion in Professor Nisbett’s book is that there is evidence people can be fairly easily taught about these fallacies, so that they learn to avoid them or call them out. Perhaps, in a world where there’s less need to teach people mental arithmetic (because of calculators), or facts (because of Google), or Latin (because it’s a waste of time – that’ll get the letters coming!), room could be found in the curriculum for teaching people how to handle statistical reasoning with less naivety.
Matt Ridley, a member of the British House of Lords, is an acclaimed author who blogs at www.rationaloptimist.com.