‘Towards a zero-risk society’ might be thought of as the manifesto of a dishonest politician trying to win voters’ support by promising to free them from any possible risk. It would be a deceitful manifesto, however: although it might attract plenty of supporters and gather widespread consensus, the promise itself would be absurd.
The relevance of the issue emerged during the public debate concerning the use of Covid-19 vaccines, in particular the one produced by AstraZeneca, which currently ranks first in terms of global reach.
The debate made headlines in highly developed economies. It was probably fed by the way individuals (mis)perceive risks and by a genuine desire to avoid risk altogether.
What are the reasons behind the aspiration to a zero-risk society?
Individuals and human societies as a whole are always exposed to situations that may have negative outcomes. For centuries, this simple truth led human beings to devise acceptable ways of dealing with risk. In recent times, however, people have come to believe they can live a life without risks. The promise of building a zero-risk society, therefore, might be the winning pledge of unscrupulous politicians.
Many elements can explain the ongoing quest for a riskless life. First, it is quite likely that in affluent societies population aging goes hand in hand with an overall increase in risk aversion: older people dislike risks more than the rest of the population because they are more vulnerable to various categories of shocks. Second, fast technological development has fostered the (wrong) belief that everything can be brought under control. This is not always the case. In the famous movie Armageddon, a nuclear weapon is detonated to split in two an asteroid on its way to Earth. Unfortunately, not all problems can be solved by resorting to nuclear weapons: fighting viruses and tackling antibiotic-resistant bacteria are two examples. A third reason relates to the fact that individuals are exposed to an increasing amount of information. This may be a mixed blessing. Some information may be incomplete or inaccurate, leading to wrong beliefs about the plausible remedies to risk. Moreover, large amounts of information may create false cognitive perceptions about the magnitude – hence the relevance – of risks, and our brain tends to overreact. There are reasons to believe that these mechanisms were at work during the recent debates about the use of Covid-19 vaccines.
How our brain perceives risk
Human beings use methods of thought that tend to generate quick, approximate answers. These answers are good in most cases, but they can also be marked by systematic errors called biases. Biases affect the assessment of risk.
The psychological literature documents that people generally know the sources of a number of undesirable events, such as death. Yet, when asked to quantify the risks more precisely, they significantly overestimate the frequency of the less likely causes of death and underestimate the frequency of the common ones. More generally, information about the occurrence of a given harmful event may modify the perception of risk: individuals tend to attribute a greater probability to an event that has just occurred. This has something to do with the so-called hindsight bias. After learning that a given event has indeed taken place, individuals assign a relatively high estimate to the probability that it might occur again – higher than the estimate assigned by those who do not know that the outcome has already occurred. The hindsight bias is sometimes called the I-knew-it-all-along effect.
Clearly, this mechanism has potentially negative consequences for how human societies deal with risks. If minor hazards are believed to occur more frequently than they actually do, the misperception of risk may lead to demands for protection against those minor hazards, while major risks receive less than adequate attention. This applies to the recent discussion about whether to use vaccines approved by regulatory authorities after stringent checks, in particular the Covid-19 vaccine by AstraZeneca.
Because of humans’ systematic tendency to overestimate known events with very small probabilities, a few suspicious cases were enough to overstate the risks of adverse reactions. At the same time, public opinion neglected the danger of being infected in the absence of a vaccine.
Two lessons can be learned from this story. First, when the authorities take action, they may be conditioned by the same biased assessment mechanism that affects individuals’ perceptions and decision-making. As the abrupt interruption of the AstraZeneca inoculation campaign has shown, politicians find it difficult to explain the terms of the question to the general public, and more rewarding to follow voters’ instincts. In turn – and this is the second lesson – individuals are easy prey for those who promise no-risk solutions, even when such solutions do not exist.