Humans Are Hardwired To Dismiss Coronavirus Facts That Don’t Fit Their Worldview

October 9, 2020

Anthony Fauci, the U.S. government’s senior advisor on COVID-19, recently bemoaned the uneven individual and state compliance with public health recommendations, blaming the United States’ ineffective pandemic response on an “anti-science bias.” He called this bias “unthinkable” because “science is truth.” Fauci compares those who discount the importance of masks and social distancing to “anti-vaxxers” who “spectacularly” refuse to listen to science.

What is surprising is that Fauci, an expert in the science of coronaviruses, seems to overlook the science behind the very “anti-science bias” he laments: the well-established science of science denial.

Americans increasingly exist in highly polarized, information-insulated ideological communities that occupy their own information universe.

In the political blogosphere, global warming is considered a hoax, or so uncertain as to be unworthy of a response. In other geographic or online communities, the science of vaccine safety, fluoridated drinking water, and genetically modified foods is misrepresented or ignored. Concern about the coronavirus splits sharply along party lines, apparently based in part on partisan disagreements about factual questions, such as the effectiveness of social distancing or the actual death rate from COVID-19.

In theory, resolving factual disputes should be relatively easy: all that is required is the presentation of strong evidence, or evidence on which there is strong expert consensus. This approach works most of the time when the issue is, say, the atomic weight of hydrogen.

But when the scientific advice presents a picture that threatens someone’s perceived interests or ideological worldview, that’s not how things work. In fact, it turns out that a person’s political, religious, or ethnic identity quite effectively predicts his or her willingness to accept expertise on any given politicized issue.

“Motivated reasoning” is what social scientists call the process of deciding which evidence to accept based on the conclusion one prefers. As I explain in my book The Truth About Denial, this very human tendency applies to facts about the physical world, economic history, and current events alike.

Denial does not stem from ignorance.
Interdisciplinary research into this phenomenon has made one thing clear: the failure of various groups to acknowledge the truth about issues such as climate change is not due to a lack of information about the scientific consensus on the subject.

Rather, what strongly predicts the denial of expertise on many controversial topics is simply one’s political persuasion.

A 2015 meta-study showed that ideological polarization over the reality of climate change actually increases with respondents’ knowledge of politics, science, and/or energy policy. The chances that a conservative is a climate science denier increase significantly if he or she is college-educated. And conservatives who score highest on tests of cognitive ability or quantitative reasoning are the most susceptible to motivated reasoning about climate science.

Denialism isn’t just a problem for conservatives. Studies have found that liberals are less likely to accept a hypothetical expert consensus about the likelihood of safe storage of nuclear waste, or about the impact of concealed carry laws.

Denial is natural.
The human gift of rationalization is the product of hundreds of thousands of years of adaptation. Our ancestors evolved in small groups, where cooperation and persuasion had at least as much to do with reproductive success as holding accurate factual beliefs about the world. Assimilation into one’s tribe required assimilation into the group’s ideological belief system – whether it was grounded in science or superstition. An instinctive bias in favor of one’s “in-group” and its worldview is deeply rooted in human psychology.

A person’s sense of self is closely tied to the status and beliefs of his or her identity group. Thus, it is not surprising that people automatically react defensively to messages that threaten the worldview of the group with which they identify. We react by rationalizing and selectively evaluating evidence – that is, we engage in “confirmation bias,” giving credence to expert testimony that we like and finding reasons to reject other evidence.

Unpopular information can be threatening in other ways as well. “System justification” theorists such as psychologist John Jost have shown how situations that pose perceived threats to established systems can trigger inflexible thinking. For example, populations experiencing economic distress or external threats often turn to authoritarian leaders who promise security and stability.

In ideologically charged situations, a person’s prejudices end up affecting his or her factual beliefs. If you define yourself by your cultural affiliation, your attachment to the social or economic status quo, or a combination of both, then information that threatens your belief system – for example, about the negative impact of industrial production on the environment – threatens your sense of identity itself. If trusted political leaders or partisan media tell you that the COVID-19 crisis is overblown, then factual information that contradicts the scientific consensus can feel like a personal attack.

This emotionally charged, motivated thinking explains a wide range of examples of extreme, evidence-resistant refusal to accept historical facts and scientific consensus.

Have tax cuts been proven to pay for themselves through economic growth? Do communities with high levels of immigration have higher rates of violent crime? Did Russia interfere in the 2016 U.S. presidential election? Predictably, partisan media outlets treat expert opinion on such questions as though it were just another matter of partisan opinion.

The phenomenon of denialism is varied, but the story behind it is ultimately quite simple. Human cognition is inseparable from the unconscious emotional responses that accompany it. Under the right conditions, universal human traits such as in-group favoritism, existential anxiety, and a desire for stability and control combine to create a toxic, system-justifying form of identity politics.

Science denial is notoriously resistant to facts because it is not about facts in the first place. It is an expression of identity – usually in the face of perceived threats to the social and economic status quo – and it typically manifests in response to elite messaging.

I would be very surprised if Anthony Fauci were not, in fact, aware of the strong influence of politics on COVID-19 attitudes, or of what is being signaled by statements from Republican state officials, the partisan rejection of masks in Congress, and the recent Trump rally in Tulsa. Effective science communication is critical because partisan messaging can have a profound impact on public attitudes. Vaccination, resource depletion, climate change, and COVID-19 are life-or-death issues. To address them successfully, we must not ignore what science tells us about science denial.