The Psychology Behind Coronavirus Denial

Image: Pixabay

Like a lot of people, I've lately been joining group Zoom calls with people from my past. One that's especially engaging is a weekly call with members of my high school class. It's interesting to hear people from very different parts of the country share their experiences with the kind of candor that only old acquaintances can manage.

Unfortunately, a number of my classmates were infected with the virus. One, an executive at a paramedic services company, was filling in on the front lines when he got it, even though he had on full protective gear. He believes he got it through his eyes. Others spoke of weeks in bed, unable to move.

So I find it bizarre when I see other people I know deny the severity of the crisis. In most cases, these are otherwise reasonable people who watch YouTube videos or consume other alternative media that minimize the crisis or portray it as a hoax. As hard as it is to believe, these people aren't crazy. Rather, they are falling prey to very common cognitive biases.

As of the first week of May, there were 1.2 million coronavirus cases in the US. That sounds like a lot, but it's only about a third of 1% of the population, and half of those cases have been in the Northeast. In many parts of the country, people have no direct experience with the coronavirus epidemic.

There’s a natural human tendency to overweight information that is most available to us. For example, reading statistics about automobile fatalities will do little to affect our driving behavior. However, when we pass a fatal accident on the road, we will naturally slow down and be more cautious. Direct observation feels real. Statistics don’t.

Psychologists call this tendency availability bias, and it is amazingly common, even in professional settings where you would expect more deliberative decision making. Researchers have found that it affects how investors react to analysts' reports, how corporations invest in research, and how jurors evaluate witness testimony. Other studies find that availability bias even affects medical judgments.

For many people, shelter-in-place restrictions seem like an outside encroachment into their communities. After all, they aren’t sick and they don’t know anybody who is. So why should they be denied the opportunity to go to work and visit family and friends?

A big part of our everyday experience — and the information that is most available to us — is the people who surround us, who have a major influence on what we perceive and how we think. In fact, a series of famous experiments done at Swarthmore College in the 1950s showed that we will conform to the opinions of those around us even when they are obviously wrong.

It shouldn't be surprising that those closest to us influence our thinking, but more recent research has found that the effect extends to three degrees of social distance. So it is not only those we know well, but even the friends of our friends' friends who affect how we think and behave, even on health issues like obesity and smoking.

The effect is then multiplied by our tendency toward tribalism, even when the source of division is arbitrary. For example, in a study where young children were randomly assigned to a red or a blue group, they preferred pictures of other kids wearing t-shirts in their own group's color. In another study, adults randomly assigned to “leopards” and “tigers” showed hostility toward outgroup members in fMRI scans, regardless of the members' race.

So it isn’t surprising that people will be more willing to believe, say, a conspiracy theory floated by a high school friend rather than information from a government agency or recognized news media. If the majority of people around you believe something, you’re likely to believe it too.

The machinery of our brains is naturally geared toward making definitive judgments, even when we have a dearth of information. We tend to lock onto the first information we encounter (priming), which then shapes how we interpret subsequent data (framing). Sometimes, we simply get bad information from a seemingly trustworthy but unreliable source.

In any case, once we come to believe something, we will tend to look for information that confirms it and discount contrary evidence. We will also interpret new information differently according to our preexisting beliefs. When presented with a relatively ambiguous set of facts, we are likely to see them as supporting our position.

This dynamic plays out in groups as well. We tend to want to form an easy consensus with those around us; dissent and conflict are uncomfortable. In one study that asked participants to solve a murder mystery, the more diverse teams came up with better answers, but reported more doubt and discomfort. The more homogeneous teams performed worse, but were more confident in their judgments.

People who doubt the severity of the pandemic, whether because they lack direct experience with it or because their social networks are dominated by doubters and conspiracy theorists, will find no shortage of support. The internet is awash with half-baked quacks willing to lend credence to any story, no matter how unlikely or bizarre.

The range of conspiracy theories surrounding the coronavirus pandemic is truly breathtaking. In America, rumors circulate that the coronavirus was engineered in a Chinese lab, while in China and Iran, rumors accuse America of inventing it. Others say it was caused by 5G mobile networks. Still others insist it doesn't exist at all and is a massive fraud cooked up by Bill Gates and Anthony Fauci.

Yet there are ways to circumvent the faulty machinery in our brains. Availability bias can be countered by communicating true facts effectively. The same research that showed how local majorities influence us also found that just one or two dissenters can break the spell. Rigorous fact checking can help to mitigate confirmation bias.

The lesson here is that we are all prone to bias. We need to be careful and promote a healthy skepticism, especially with opinions that we are predisposed to agree with. Just as we shouldn’t believe everything our government tells us, we shouldn’t put our trust in YouTube videos. In the end, everything needs to be traced back to an authoritative, primary source.

The truth is that it’s incredibly hard to get facts right. I used hundreds of sources in researching my two books and still it took weeks of fact checking, by both my publisher and me, to correct inaccuracies. In at least one case, I had held a false notion for years which could have been corrected by a simple Google search.

As the physicist Richard Feynman once put it, “The first principle is that you must not fool yourself — and you are the easiest person to fool.”

Greg Satell is an international keynote speaker, adviser and bestselling author of Cascades: How to Create a Movement that Drives Transformational Change. His previous effort, Mapping Innovation, was selected as one of the best business books of 2017. You can learn more about Greg on his website, GregSatell.com and follow him on Twitter @DigitalTonto
