This is probably my first post on the psychological side of economics. I came across this ad in London and couldn’t stop thinking about it.
“If it doesn’t feel right, it probably isn’t”
On July 7th 2005, four bags containing bombs were left on London public transport. They exploded, killing 52 people. Since then, some 250,000 bags have been left on London Transport. None have turned out to be bombs. The probability of a left bag on London Transport being a bomb is therefore something like 4 in 250,000, or 1 in 62,500. If we consider all the bags left in all the large metropolises of Europe and North America over a decade, we’re talking about microscopic chances. So it’s more accurate to say:
“If it doesn’t feel right, there’s a 0.000016 chance it isn’t.”
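The back-of-the-envelope arithmetic above can be made explicit. A minimal sketch, using only the two figures quoted in this post (the bag count is the post's rough estimate, not an official statistic):

```python
# Estimate P(bomb | unattended bag) from the figures quoted above:
# 4 bomb-carrying bags in roughly 250,000 bags left on London Transport.
bomb_bags = 4
left_bags = 250_000

p_bomb = bomb_bags / left_bags
print(f"P(bomb | left bag) ≈ {p_bomb:.6f}, about 1 in {round(1 / p_bomb):,}")
```

Running this prints a probability of 0.000016, i.e. about 1 in 62,500 — orders of magnitude away from “it probably is.”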
We can imagine the policy-maker’s thinking when they came up with this campaign. “If we had reports of all the suspicious bags, we’d be able to stop some of these bombs from going off. But when people see a left bag, they probably think it’s nothing, and so they don’t report it. So, let’s just lie. That way, they’ll be so scared that they’ll report every bag. We’ll stop some bombs. And that justifies the lying.”
The same goes for mortgage policy. To prevent another meltdown, some policy-makers have suggested exaggerating the risk of foreclosure to scare people into choosing less expensive houses.
And I’m not against scaring people. Emotions drive decisions, and getting people to think about possible consequences can improve long-run decision making. But you can present truthful, graphic depictions of outcomes without lying about the probabilities of these outcomes. It’s the product of the probability and the outcome that counts. Improved risk literacy will help people lead happier, healthier and safer lives. But for people to become risk literate, they need accurate risk estimates, not phony probabilities that cry wolf.
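The “product of the probability and the outcome” is just an expected value, and it shows why lying about probabilities is not a harmless nudge. A minimal sketch with hypothetical numbers (the cost figure is made up for illustration):

```python
# Expected cost = probability of the bad outcome times its cost.
# The cost value below is a hypothetical placeholder, not real data.
def expected_cost(probability: float, cost: float) -> float:
    """Return the probability-weighted cost of a bad outcome."""
    return probability * cost

COST_OF_BOMB = 1_000_000  # hypothetical cost, for illustration only

honest = expected_cost(1 / 62_500, COST_OF_BOMB)  # accurate risk estimate
phony = expected_cost(1 / 2, COST_OF_BOMB)        # "it probably is a bomb"
print(honest, phony)  # 16.0 500000.0
```

The inflated probability distorts the expected cost by four orders of magnitude, which is exactly the kind of miscalibration that makes risk-illiterate decisions.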