The Algorithm exploits multiple inherent cognitive biases, and it is contributing to the COVID infodemic.
Medical misinformation is nothing new, but I think we can all agree that the coronavirus pandemic has added fuel to the misinformation fire. For the first time in modern memory, we have a medical issue that literally affects everyone, and it’s a particularly scary one — emerging out of nowhere, with a strange range of effects, from asymptomatic illness to disturbing deaths to bizarre long-haul symptoms.
But there’s another culprit besides COVID-19 itself that has fueled this so-called infodemic: social media.
But how? How exactly does social media lead us to bad inference? I don’t have a huge social media presence, but I have enough that I’ve seen the dark side of things.
Here’s someone who opens by calling me a quack, but it gets quite a bit worse from there.
Here’s someone saying that a video I made about blood type and COVID-19 was a deliberate fraud.
And here’s someone who is… well… farther out there.
In any case, it’s not magic how social media leads us away from what is true. Social media actually exploits some really well-researched cognitive biases that affect us all. At the heart of the issue is the algorithm.
Social media algorithms are all designed to show you stuff similar to stuff you’ve engaged with before; they are designed to maximize engagement with the service, not to provide you with truth, or what is best for you, or what will make you happy. The goal is to keep your eyes on the website. All the problems I will discuss spring from that one dark well. But maybe if we start to recognize the biases being exploited, it will help solve the problem.
So — here are four cognitive biases that social media exploits to lead us all to false beliefs. There are more, of course, but these are the big ones.
1) The Illusory Truth Effect. This is the tendency for people to believe statements that are familiar, regardless of whether they are true or not.
The more a statement is repeated, the more likely we are to believe it. Did you ever hear that the average human eats 8 spiders in their sleep per year? This is completely untrue — but people are likely to believe it because, well, they’ve heard it so often.
Like many biases, this was adaptive at some point — multiple tribe-members telling you not to eat that red fruit probably saved some lives in prehistory. But in the social media era, this is problematic because we are NOT presented with information at a rate consistent with its accuracy. In fact, social media algorithms ensure that we will be shown information similar to what we’ve engaged with before. This is the “echo chamber” effect of social media, where the algorithm quickly optimizes itself to a worldview that feels familiar to you — everyone is talking about the same thing! But that thing may well be wrong.
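To make the mechanism concrete, here is a toy sketch in Python. The topics, the doubling rule, and every number here are illustrative assumptions on my part; a real recommender system is vastly more complicated. But the core feedback loop, boost whatever the user engaged with, is enough to narrow a feed:

```python
# Toy model of an engagement-driven feed converging on one topic.
# Topic names and the "double on engagement" rule are illustrative
# assumptions, not any real platform's algorithm.

TOPICS = ["cats", "politics", "health", "sports", "cooking"]

def feed_distribution(weights):
    """Convert raw topic weights into the share of the feed each topic gets."""
    total = sum(weights.values())
    return {topic: w / total for topic, w in weights.items()}

def engage(weights, topic, boost=2.0):
    """The algorithm's only rule: boost whatever the user engaged with."""
    updated = dict(weights)
    updated[topic] *= boost
    return updated

# Start with a perfectly balanced feed.
weights = {topic: 1.0 for topic in TOPICS}
print("before:", feed_distribution(weights)["health"])  # 0.2

# The user clicks one "health" post per session, eight sessions in a row.
for _ in range(8):
    weights = engage(weights, "health")

print("after:", round(feed_distribution(weights)["health"], 3))  # 0.985
```

Eight clicks on a single topic take it from 20% of the feed to roughly 98% — and at no point does the update rule ask whether any of those posts are true.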
2) Framing Effect.
Given the exact same data, I can influence your choices based just on how I present it. In medicine, a surgeon might tell a patient that they have a 95% chance of surviving an operation. Or they could say they have a 5% chance of dying. And that framing of the same data will drastically influence what the patient chooses to do. So it is with COVID statistics. Do we extol the fact that 40% of US adults are fully vaccinated, or lament that 60% are not? Whatever you engage with will be presented to you again.
3) Salience Bias.
This is the tendency to judge things as more likely or more important when they are emotionally resonant, even when they are not. Remember the summer of the shark?
In the summer of 2001, major news networks breathlessly covered every shark attack that occurred on American beaches — leading many people to avoid going in the ocean, despite the actual statistics: there were fewer shark attacks and fewer fatalities (5 total worldwide) than in the year prior. On social media, salience bias causes us to retweet, share, or like a statement that has a high emotional valence. There may honestly have been no better practitioner of this than former President Trump, with his well-characterized tendency to include short, punchy, emotional words in his tweets. That emotion drives… you guessed it… engagement.
4) Motivated Reasoning. Put simply, motivated reasoning is the tendency to search out facts that support what you want to be true rather than letting facts lead you to truth.
This one is really hard to avoid, even for scientists. As a researcher, I can tell you that when I do a clinical trial of some intervention, I really want it to work. I want to help people. But if the data doesn’t show it? Well, you swallow your pride and write up your negative study. I wish this were hypothetical.
But this is happening all over social media in the COVID era. People are engaging with statements that they want to be true, not statements that are true. We want COVID to end, we want to go back to our normal lives, we want to believe that we, and our loved ones, are safe. We want to believe there is a cure out there, and moreover that the cure is cheap, widely available, and side-effect free. Any statement on social media that supports those conclusions will be strongly reinforced.
The problem is, again, the algorithm. The volume of statements out there is nearly infinite, so when you find a data point that supports what you want to be true, you engage with it and then you are shown more and more posts that are similar, further reinforcing your belief.
I don’t know exactly how to say this, but if you are doing your research on social media, you’re doing it wrong. It is designed to reinforce your beliefs, not challenge them.
I want to be clear here — no one is immune to these effects. These cognitive biases are hard-wired into our brains through millennia of evolution. In many cases, they are beneficial. But social media is doing to our brains what fast food did to our bodies. For thousands of years, our ability to store excess energy as fat was literally life-saving. Now it is maladaptive. For thousands of years, our quick-thinking heuristics led to a flourishing of human culture — now social media has made this maladaptive.
I’m not saying you need to quit Facebook, or Twitter, or Parler or Gab. I’m saying we need to realize that these platforms aren’t looking out for our best interests — they are hacking our brains to keep us engaged. If you recognize that, you’ll be much harder to fool.
A version of this commentary first appeared on medscape.com.