Pascal Diethelm, OxyRomandie, Geneva, Switzerland
Christmas is a time when many entirely rational people whose views are based solidly on empirical evidence the rest of the year suspend their critical faculties and say things they know to be untrue. Just in case any young children have picked up their parents’ copy of the BMJ, we won’t go into detail except to say that the subject of these falsehoods traditionally originates in the far north. Such stories are harmless and those telling them will, when their children reach an appropriate age, abandon the pretence. Yet other people hold views that are equally untrue and do so with an unshakeable faith, never admitting they are wrong however much contradictory evidence they are presented with.
Some of these views are harmless, but others cost lives. It is easy to think of contemporary examples. “HIV is not the cause of AIDS.” “The measles, mumps, and rubella vaccine cannot be considered safe.” “Second hand smoke is simply an irritant and there is no conclusive evidence that it is dangerous.” And, with potentially the greatest consequences for our species, “the evidence that the world is warming is inconclusive, and, even if it is warming, the evidence that this is caused by anthropogenic carbon emissions is unproven.”
Denialism and its history
The term “denialism” has been coined to describe this phenomenon. First popularised by the American Hoofnagle brothers, one a lawyer and the other a physiologist, it involves the use of rhetorical arguments to give the appearance of legitimate and unresolved debate about matters generally considered to be settled. The term can be traced to people who deny the existence of the Holocaust, but it has subsequently been applied much more widely. Denialism can be recognised by the presence of six key features (box). It is, however, important not to confuse denialism with genuine scepticism, which is essential for scientific progress. Sceptics are willing to change their minds when confronted with new evidence; deniers are not. Unfortunately, confusion is encouraged by loose use of the term, such as when the current British government applies the label “deficit deniers” to critics of its economic policy, a group that now includes large numbers of distinguished economic researchers, among them several Nobel laureates.
Although contemporary usage of the term is relatively recent, the concept of denialism has been recognised for several decades. A chapter entitled “Denial of reality” in a 1957 book describing the phenomenon of cognitive dissonance notes how “. . . groups of scientists have been known to continue to believe in certain theories, supporting one another in this belief in spite of continual mounting evidence that these theories are incorrect.” It highlights, in particular, the importance of selectivity, whereby “one aspect of the process of dissonance reduction [is] obtaining new cognition which will be consonant with existing cognition and avoiding new cognition which will be dissonant with existing cognition.” The extent to which selectivity influences our views is now widely recognised, not least as a result of a bestselling book containing many examples of what is termed “confirmation bias.” One explanation is that confirmation bias is a means of dealing with evidence that challenges our strongly held beliefs and would otherwise threaten our self-perceived status as intelligent and moral individuals.
Approaches to denialism
Recent cognitive research, some taking advantage of advances in brain scanning, has shed light on the neurological processes by which individuals interpret a message according to who the messenger is. People subconsciously suppress recognition of clearly contradictory messages from politicians they support, yet easily identify contradictions from those they oppose.11 However, this is not simply a matter of ignoring inconvenient evidence. Evidence, including authoritative corrections, that contradicts strongly held views can, paradoxically, reinforce those views.12 Thus, research in the United States has found that registered Republicans who are exposed to evidence on the importance of social determinants of health are less likely to support collective action to address them than are those not exposed.13
Yet denialism involves more than someone accumulating a collection of individual errors in information processing. Increasingly, it takes on the form of social movements in which large numbers of people come together and propound their views with missionary zeal.14 These views combine exploitation of the genuine uncertainty that characterises scientific research with the use of simple falsehood.
Denialists emphasise the limitations of statistical associations for establishing causality, which are well recognised by aetiological epidemiologists, yet ignore the other criteria used to judge whether a relationship is likely to be causal, such as biological plausibility, consistency, and strength of association. They may also try to change “the rules of the game,” as in the now notorious example of the tobacco industry sponsoring efforts to define “good epidemiology practice.” The initiative would have redefined any relative risk of less than two as statistically unsound because of the potential for unrecognised confounding; it was designed to exclude research on the risks associated with passive smoking, which typically yields relative risks of 1.3-1.6. Other efforts seek to redefine concepts as essentially unresearchable, such as an industry funded report on alcohol that stated: “violence is a nebulous concept.”
Selective use of the scientific literature is another approach used by denialists, who either promote methodologically flawed research that supports their world view over more methodologically sound papers or subject papers they oppose to intensive scrutiny for anything that might cast doubt on the quality of the science. A now notorious example is “Amazongate,” in which a report by the Intergovernmental Panel on Climate Change inappropriately referenced a statement about the sensitivity of the rainforest to changes in rainfall to a secondary report rather than to the relevant primary research. This inconsequential referencing error, in a report of more than 900 pages, was then used to undermine the entire report.
Deliberate falsehoods are rarely used to convince people that something is true; rather, they are used to seed doubt about the actual truth. For example, although only 18% of Americans believe that President Barack Obama, a churchgoing Christian, is a Muslim, an additional 43% are unsure. Media commentators don’t actually say that Obama is a Muslim; they just say that they don’t know whether he is or he isn’t, while consistently using the president’s full name: “Barack Hussein Obama.” In the health arena, this approach is commonly found in debates about vaccines, where denialists play on the argument that “you can never be sure” when it comes to the very small risk of complications of vaccination.
The spread of denialism
Of course, there have always been people who have held strong views in the face of overwhelming evidence to the contrary. Indeed, the Flat Earth Society, although a shadow of its former self, still exists. However, the world has changed in recent decades in three important ways, each facilitating the spread of denialism.
The first is the birth of web 2.0, which has transformed the internet from a closed publishing platform into an interactive tool allowing intensive exchange of ideas. People who might once have clung to dissenting views in isolation can now locate individuals with similar views within seconds. Social media enable communities of denialists to grow by feeding each other’s feelings of persecution by a corrupt elite. This is encouraged by cynicism about existing political systems. In one study, for example, the people who were most likely to believe in 9/11 conspiracy theories were those who were disaffected with and disengaged from the political system. Such cynicism is growing, a development that should not be surprising given how politicians feel able to take their countries to war on the basis of dubious evidence.
A second issue, in some countries, is the espousal of denialism by an increasingly partisan media, which expends considerable energy identifying supposed conspiracies and then promoting them to the general public.
The third is the growing exploitation of the first two issues by corporate interests. Although the tobacco industry has been at the forefront of such tactics, there are now examples from many other sectors, including the food and drink, asbestos, oil, and alcohol industries. Such activities received considerable official support during the administration of George W Bush, under whose aegis there were widespread attempts to politicise scientific research and advice.
Tackling denialism
So how should scientists respond to denialism? The first step is to recognise when it is present. Denialism changes the rules of the game. Conventional approaches to scientific progress—such as hypothesis generation and testing, and argument and counterargument—that seek to elicit the underlying truth no longer apply.
In some cases, nothing can or needs to be done. The persisting belief among many people that Princess Diana may have been murdered by the security services (32% of the British public in one poll), for example, has enabled some tabloid newspapers to fill many pages and has wasted much police time, but it has no lasting implications for public policy.
In other areas, especially where the views reflect longstanding cultural beliefs, it may be necessary to accept that these views exist and adapt messages to take account of them when developing policies and practices. Examples include the development of health promotion campaigns to prevent the spread of HIV or to encourage the uptake of immunisation. Such campaigns are based on a detailed assessment of the beliefs that would undermine them if not confronted. For example, early programmes to tackle HIV/AIDS in east Africa had to address concerns that promotion of condoms was a covert attempt to control the population. It may be necessary to accept that there are some people who cannot be convinced, but there will be many who can.
This leaves those cases where denialist views are being promulgated actively by powerful vested interests. Here, we argue, health professionals have a responsibility to confront the denialists, exposing the tactics they use and the flaws in their arguments to a wide audience. Again, the first step is recognition. When a seemingly bizarre story appears in the media that risks undermining public health, health professionals should ask: “why is this story appearing now?” Many will, however, find this approach uncomfortable because it conflicts with the common tendency to seek compromise and avoid conflict.
Confronting denialism may also require the use of less usual methods of communication, such as analogy and narrative. Crucially, it demands speed of response. However, health authorities and non-governmental organisations are rarely able to respond rapidly, especially at weekends when, in our experience, misleading stories tend to appear in the media. Equally, editors of medical journals (with a few exceptions) often seem unable to appreciate the need to counter denialist stories.
In this paper we have looked at some of the most outrageous examples of denialism. Yet denialism is often much more subtle, and researchers are far from immune to its effects. There is a wealth of evidence on how reviewers find real or imagined flaws in papers whose messages they disagree with while discounting real errors in those they agree with. Perhaps, during the Christmas break, we, as reviewers and editors, might all take some time out to reflect on our own innate cognitive biases as well as on how to overcome those of others.
Characteristics of denialism
- Identification of conspiracies: Denialists argue that scientific consensus arises not as a result of independent researchers converging on the same view but instead because researchers have engaged in a complex and secretive conspiracy. They are depicted as using the peer review process to suppress dissent rather than fulfil its legitimate role of excluding work that is devoid of evidence or logical thought.
- Use of fake experts: It is rarely difficult to find individuals who purport to be experts on some topic but whose views are entirely inconsistent with established knowledge. The tobacco industry coined the term “Whitecoats” for those scientists who were willing to advance its policies regardless of the growing scientific evidence on the harms of smoking.
- Selectivity of citation: Any paper, no matter how methodologically flawed, that challenges the dominant consensus is promoted extensively by denialists, whereas any minor weaknesses in papers that support the dominant position are highlighted and used to discredit their messages.
- Creation of impossible expectations of research: This may involve corporate bodies sponsoring methodological workshops that espouse standards in research that are so high as to be unattainable in practice.
- Misrepresentation and logical fallacies: An extreme example of this characteristic is the phenomenon of reductio ad Hitlerum, in which anything that Hitler supported (especially restrictions on tobacco) is tainted by association. Other methods of misrepresentation include using “red herrings” (deliberate attempts to divert attention from what is important), “straw men” (misrepresentation of an opposing view so as to make it easier to attack), false analogies (for example, because both a watch and the universe are extremely complex, the universe must have been made by some cosmic watchmaker), and excluded middle fallacies (in which the “correct” answer is presented as one of two extremes, with no middle way: thus passive smoking either causes all forms of cancer or none at all, and since it can be shown not to cause some, it is argued that it must cause none).
- Manufacture of doubt: Denialists highlight any scientific disagreement (whether real or imagined) as evidence that the entire topic is contested, and argue that it is thus premature to take action.