An extraordinary manslaughter trial in Italy provides a tragic example of what I call the all-or-nothing fallacy. This is the tendency to think of proof, knowledge, belief, and prediction in binary terms: either you prove/know/believe/predict something or you don't, with no shades of gray in between. The all-or-nothing fallacy leads people to think predictions must be categorical rather than probabilistic, but that would mean we could never predict anything unless we were clairvoyant.
Seven scientists are now on trial in Italy for giving undue reassurance to the public that a major earthquake was not on its way. On 31 March 2009 Bernardo De Bernardinis, then deputy chief of Italy’s Civil Protection Department, told people in and around the medieval Italian city of L’Aquila that a series of tremors which had been felt in the area over the past four months posed “no danger.” Six days later an earthquake of magnitude 6.3 struck the city, destroying thousands of buildings and killing 308 people. If the scientists had not been so reassuring, the prosecution claims, people would have left their homes following the earlier tremors, and many would have been saved.
Thomas Jordan, an Earth scientist at the University of Southern California, in Los Angeles, who chaired an international review of earthquake forecasting in Italy in the wake of the tragedy at L'Aquila, argues that the scientists in charge of analyzing the risk posed by the ongoing swarm of tremors got sidetracked by the predictions of another forecaster. Public warnings by Gioacchino Giuliani, a laboratory technician at the National Institute of Nuclear Physics, reportedly caused panic in the nearby city of Sulmona two days before the scientists met to discuss the risks to L'Aquila (no big earthquake followed on that occasion). The scientists therefore, according to Dr Jordan, "got trapped into a conversation with a yes/no answer." Since they could not give a definite "yes," they ended up giving the impression that there would be no quake.
The all-or-nothing fallacy leads people to assume that, when something can't be predicted with 100 percent certainty, it cannot be predicted at all. Indeed, until recently policymakers hoped that research in seismology would identify precursor signals that could predict earthquakes with near certainty. As a result, they have tended to ignore the less categorical, but still valuable, information that is expressed in terms of probabilities rather than certainties.
Seismologists have known for at least 20 years that small earthquakes increase the likelihood that a powerful event will happen in the near future, even if the absolute probability of such an event remains low. Indeed, Warner Marzocchi and Anna Maria Lombardi of the National Institute of Geophysics and Vulcanology (INGV) showed that, a few hours before the earthquake actually struck in L'Aquila, modelling would have suggested that the chance of a powerful quake occurring within 10 kilometres of the city within three days rose from one in 200,000 (the background level) to about one in 1,000.
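The distinction between relative and absolute risk in those figures is easy to make concrete. A minimal sketch, using only the two probabilities reported above (the numbers come from the Marzocchi and Lombardi model as cited here, not from any calculation of my own):

```python
# Background 3-day probability of a powerful quake near L'Aquila,
# and the model's estimate a few hours before the event.
background = 1 / 200_000
elevated = 1 / 1_000

# The relative risk rose enormously...
relative_risk = elevated / background
print(f"Relative risk: {relative_risk:.0f}x higher than background")

# ...yet the absolute probability remained small.
print(f"Absolute probability: {elevated:.1%}")
```

The point the two print statements make is exactly the one the seismologists needed to communicate: a 200-fold jump in relative risk is alarming and actionable, even though the absolute chance of a major quake was still only about 0.1 percent.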
If the scientists had been comfortable with speaking about probabilities, they might have informed the public about the increase in relative risk, while maintaining that the absolute probability of an earthquake remained low. However, with the exception of an informal system in California, no country in the world has yet set up regular probabilistic earthquake forecasting that can be used to guide emergency actions.
Recently, a director at the UK Met Office emailed me to say that, in his experience, the public seemed to hate probabilistic forecasts. This is too defeatist; rural farmers in Bangladesh have been trained to understand and interpret probabilistic forecasts of floods without much difficulty. Local officials played a key role in this process. For example, an imam introduced the concept of probability during prayer time at his mosque. As a result, not only were the flood warnings heeded, but the concept of risk was better understood and widely discussed. It seems that people living on the edge understand risk very well, and are happy to accept and use probabilistic forecasts.