Can people be ‘inoculated’ against misinformation?

A version of this story appeared in Science, Vol 386, Issue 6721.

This story was supported by the Pulitzer Center.

As a young boy growing up in the Netherlands in the 1990s, Sander van der Linden learned that most of his mother’s relatives, who were Jewish, had been killed by the Nazis, in the grip of racist ideology. At school, he was confronted with antisemitic conspiracy theories still circulating in Europe. It all got him wondering about the power of propaganda and how people become convinced of falsehoods.

Eventually, he would make studying those issues his career. As head of the Social Decision-Making Lab at the University of Cambridge, Van der Linden is studying the power of lies and how to keep people from believing them. He has become academia’s biggest proponent of a strategy pioneered after the Korean War to “inoculate” humans against persuasion, the way they are vaccinated against dangerous infections.

The recipe only has two steps: First, warn people they may be manipulated. Second, expose them to a weakened form of the misinformation, just enough to intrigue but not persuade anyone. “The goal is to raise eyebrows (antibodies) without convincing (infecting),” Van der Linden and his colleague Jon Roozenbeek wrote recently in JAMA.

Inoculation, also called “prebunking,” is just one of several techniques researchers are testing to stop people from falling for misinformation and spreading it further. Others have focused on fact checking and debunking falsehoods, educating people about news sources’ trustworthiness, or reminding people periodically to consider that what they’re reading may be false. But Van der Linden has captured public imagination in a way few others have, perhaps because the concept is so seductively simple. “It’s definitely the one that has gotten most attention,” says Lisa Fazio, a psychologist at Vanderbilt University.

Van der Linden’s 2023 book, Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity, has won many awards, and Google’s research arm, Jigsaw, has rolled out the approach to tens of millions of people via YouTube ads. “My reading of the literature is that it’s probably the most effective strategy,” says Jay van Bavel, a psychologist at New York University.

But others say inoculation is an analogy gone awry that wrongly focuses on recipients of misinformation instead of on its sources and the social media companies—such as X (formerly Twitter), Facebook, and TikTok—that enable and profit from its spread. “I think this metaphor is very limiting in how we understand where the problem really lies,” says Sandra González-Bailón, a social scientist at the University of Pennsylvania. “It’s easier to do than dealing with the systemic issues, but it puts all the pressure on the individual.”

The idea of inoculation dates back to the Cold War era. In 1954, shortly after the Korean War ended, 21 U.S. prisoners of war decided to move to communist China instead of coming home, a choice that shocked the nation. Many assumed the soldiers were victims of “brainwashing,” a term invented a few years earlier. To resist this kind of manipulation, experts declared, young people in the United States needed to be taught more about “American” ideals at home, at school, and in the Army.

But psychologist William McGuire, who spent most of his career at Yale University, had a different idea. He argued that the soldiers were vulnerable to the endless propaganda they encountered because it was their first exposure to those ideas. Such prisoners, McGuire argued, were like someone brought up in an “aseptic” environment who, “although appearing in very good health, proves quite vulnerable when suddenly exposed to a massive dose of an infectious virus.” The remedy seemed obvious: Pre-expose people to “weakened, defense-stimulating forms of the counterarguments.”

McGuire tested the hypothesis by seeing whether inoculation could preserve students’ belief in four cultural truisms, including that they should brush their teeth after every meal and that antibiotics had been a huge benefit to humankind. Confronting students with counterarguments—for example, that antibiotics led to the development of deadly resistant strains—could drastically decrease their belief in these truisms, McGuire found. But if the students first read an essay that laid out the counterarguments and refuted them, their belief didn’t erode nearly as much.

The idea fascinated Van der Linden, who first read McGuire’s papers as a Yale student working on the public’s perception of climate change. But McGuire believed inoculation only worked if people’s beliefs had never been challenged before. Van der Linden thought that assumption was mistaken. “In the real world, we are dealing with people in various stages of infection, and inoculation can both work as a preventative and therapeutic agent,” he wrote in his book. “If I ever had a ‘lightbulb’ moment, this was it.”

Cold War–era research inspired Sander van der Linden to test inoculation as a weapon against 21st century misinformation. (Photo: Justin Griffiths-Williams)

Van der Linden put the theory to the test in an online study of more than 2000 participants that probed their views of climate change—and tried to change them. Respondents on average estimated 70% of scientists agree human-caused global climate change is real. When the subjects were told that the actual number is 97%, their estimates rose accordingly. But exposing participants to a misinformation trope that “over 31,000 American scientists have signed a petition” countering the consensus completely erased that increase.

However, if participants were warned of exactly this kind of falsehood before being exposed to both the consensus message and the misinformation, the net effect was an increase to about 84%. A preemptive warning about politically motivated attempts to spread misinformation worked, Van der Linden concluded. The effect was seen even in those who were more skeptical of climate change to begin with.

The study, published in Global Challenges in January 2017—just after Donald Trump had been sworn in as U.S. president—created a flood of media attention. “My phone started ringing non-stop,” Van der Linden writes in Foolproof. “News media, government officials, corporations, everyone wanted me to explain the basic idea behind psychological inoculation.”

Since then, Van der Linden has continued to study and promote the idea, with one twist: Instead of targeting one specific lie, he is trying to prebunk misinformation more generally. For example, with Roozenbeek, a former member of his group who’s now at King’s College London, he has created online games in which players become propagandists and learn about techniques used to spread falsehoods.

The first one, Bad News, which came out in 2018, shows users how a fake profile that looks official can make misinformation more persuasive. (“I have issued an executive order to rename Canada North North Dakota,” a certain Joe Bideñ posts in the game.) In a 2020 study in the Journal of Cognition, Van der Linden and colleagues showed that playing Bad News for 15 minutes increased people’s ability to spot misinformation techniques in 18 fictitious Twitter posts. (A control group played Tetris instead.) In a game called Harmony Square, developed with the U.S. Department of Homeland Security, players set out to destroy the idyllic harmony in a town, deploying propaganda to pit its residents against each other over a statue of a pineapple.

The games have reached millions of people and have been used in schools. Other researchers have designed similar games, such as Cranky Uncle, developed by University of Melbourne psychologist and part-time cartoonist John Cook. Now, “This could be implemented at a nationwide scale as part of a media literacy curriculum,” Van der Linden says. But he is ready to move on himself. “We feel we’re all gamed out,” he says. “Also, not everyone wants to play a game, so it’s time to think of the next virtual needle!”

In 2021, Van der Linden says, Jigsaw contacted him with the idea of making another kind of inoculation tool: videos. The project resulted in five short cartoon videos—referencing popular movies such as Star Wars and Anchorman—that explain some of the techniques used in misinformation, such as presenting false dilemmas or eliciting emotions such as fear, anger, and contempt. The researchers tested the videos in lab studies but also in an ad campaign on YouTube that reached about 1 million people.

A 2022 paper by Van der Linden and researchers at Jigsaw in Science Advances reported the videos helped people spot misinformation techniques and better discern trustworthy and untrustworthy content. More recently, Google created a set of three German-language videos about misinformation tactics and showed them to YouTube, Facebook, and Instagram users in Germany. The videos have reached more than half of the country’s population, Van der Linden says.

Cornell University psychologist Gordon Pennycook, however, thinks this type of general inoculation may well do more harm than good. The problem, he says, is there is no simple, easy way to spot misinformation, and techniques such as emotional language are used in reliable information, too. Because most people still encounter far more real information online than misinformation, even a slight increase in their distrust of truthful content could outweigh the positive effects of inoculation.
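To see why those base rates matter, consider a back-of-envelope calculation. The sketch below uses purely illustrative numbers (none come from Pennycook's studies or any other source) to show how a small loss of trust in accurate content can outweigh a larger relative gain against falsehoods when true items dominate a feed:

```python
# Back-of-envelope sketch of the base-rate argument.
# All numbers here are illustrative assumptions, not figures from any study.

feed = {"true": 95, "false": 5}  # assumed mix of items a user sees, per 100

# Assumed effects of a general "watch for manipulation techniques" inoculation:
drop_in_trust_of_true = 0.05      # 5% of true items now wrongly doubted
drop_in_belief_in_false = 0.30    # 30% of false items now rightly doubted

wrongly_doubted_true = feed["true"] * drop_in_trust_of_true      # 4.75 items
rightly_doubted_false = feed["false"] * drop_in_belief_in_false  # 1.50 items

print(f"True items wrongly doubted:  {wrongly_doubted_true:.2f}")
print(f"False items rightly doubted: {rightly_doubted_false:.2f}")
# Under these assumed numbers, the collateral damage to trust in accurate news
# exceeds the benefit against misinformation, which is the core of the objection.
```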


Pennycook offered an example during a debate with Van der Linden at the Harvard Kennedy School in December 2023: the 24 May 2020 front page of The New York Times. It was filled entirely with names of people who had succumbed to COVID-19, under the headline “U.S. deaths near 100,000, an incalculable loss.” “I think some people probably cried when they saw the headline, and it’s true,” he told the audience. “Now, whether you think this is a manipulation technique really depends on your perspective, right? If you are somebody who is skeptical about COVID, you can see that maybe as manipulative.”

That’s a valid concern, Van der Linden says, but he points out that as people learn about several different markers of misinformation, the accuracy of their judgment goes up. Moreover, if a reputable newspaper runs an alarmist, emotionally charged headline, people would be right to find it a little less credible, he says.

There have been other criticisms of the inoculation idea. The metaphor itself medicalizes the problem in a way that is not helpful when trying to speak to neighbors, friends, or family, says Danielle Lee Tomson of the University of Washington: “I can’t look at them as like disease vectors, because then I can’t have a conversation with them.”

Pennycook has focused instead on a different type of intervention: so-called accuracy nudges, short messages that get people to think about the information they’re seeing, such as: “Is the news you’re sharing accurate?” They are based on the assumption that most people have a good sense of what may not be trustworthy but rarely stop to think before sharing. “If people are thinking more about accuracy they are more conservative in sharing things that seem dubious,” Pennycook says.

Other researchers have their own alternatives. Some would add comments about sources’ credibility to social media posts—as Twitter did for a while during the COVID-19 pandemic. Others would give media literacy tips to the public, to help them identify credible sources. “Everybody has a pet theory and the reward structure of publishing papers is based on finding evidence for your theory,” Van Bavel says. It reflects “our immaturity as a field,” says Thomas Wood, a political scientist studying misinformation at Ohio State University.

In 2023, researchers staged a head-to-head comparison. “We had these different camps that were arguing about what they thought was best and what would be most appropriate,” says Fazio, who ran the study with David Rand at the Massachusetts Institute of Technology and Stephan Lewandowsky at the University of Bristol. “I think it was kind of a necessary next step … to start to get some consensus.” The Mega Study, as it was known internally, ended up randomizing more than 33,000 online participants to one of nine different interventions or no intervention, and then presented them with a series of true, false, and misleading headlines. Participants had to rate the headlines’ accuracy and how likely they were to share them. (The study was part of a larger project on health misinformation funded by the Social Science Research Council, a U.S. nonprofit.)

The results, published only as a preprint so far, show there is no silver bullet. “All of them work a little bit, but none of them work that well,” is how Wood summarizes it. “I thought it was actually hilarious,” Pennycook says. “We have all these debates about different interventions, and then it is like, oh, well, they’re all kind of the same.”

As for inoculation, “It’s better than about half of this stuff” at preventing people from sharing misinformation, Fazio says. But when it comes to helping people discriminate between true and false headlines, “it’s pretty similar to the other things.” Pennycook says it did better than he had expected, but he remains skeptical that the intervention will work as well in the real world.

Van Bavel, too, was not particularly impressed by any of the interventions alone, “But if you can find four or five of the best ones and find out how to layer them in some real way,” he says, “then you can probably make a dent.” In fact, a recent preprint from Pennycook’s group showed that an inoculation video meant to help people recognize emotional manipulation did help them spot the technique but did not make them better at discerning real from fake news. Combining such videos with an accuracy prompt improved people’s ability to tell apart real and fake headlines, however.

Almost all of the approaches shift the burden of fighting misinformation from social media platforms, which often profit from spreading it, to the individual user. It’s a problem researchers are acutely aware of now that X owner Elon Musk is sending out a steady stream of pro-Trump misinformation in the run-up to the U.S. presidential election. “You cannot use psychological interventions to resolve this problem. There are structural, systematic, underlying problems that need to be dealt with,” Pennycook says. “This is kind of like Band-Aid stuff.”

Van der Linden agrees—and his collaborations with social media companies are a balancing act, he acknowledges. “When you work with companies like Meta, the ask often is: ‘Can you fix this problem without hurting engagement on the platform?’” he says. “You are working with them as part of their business model, and their business model sucks.”

Still, he argues, researchers can target both the spreaders and receivers of misinformation. “I’m a psychologist, I’m biased towards individual solutions,” Van der Linden says. “But I’m under no illusion. We are also going to need, you know, hard, structural solutions.”

