Is there a “backfire effect”?

Author: Jason Collins

Published: January 17, 2018

I saw the answer hinted at in a paper released in the middle of last year (covered on WNYC), but Daniel Engber has now put together a more persuasive case:

Ten years ago last fall, Washington Post science writer Shankar Vedantam published an alarming scoop: The truth was useless.

His story started with a flyer issued by the Centers for Disease Control and Prevention to counter lies about the flu vaccine. The flyer listed half a dozen statements labeled either “true” or “false”—“Not everyone can take flu vaccine,” for example, or “The side effects are worse than the flu” —along with a paragraph of facts corresponding to each one. Vedantam warned the flyer’s message might be working in reverse. When social psychologists had asked people to read it in a lab, they found the statements bled together in their minds. Yes, the side effects are worse than the flu, they told the scientists half an hour later. That one was true—I saw it on the flyer.

This wasn’t just a problem with vaccines. According to Vedantam, a bunch of peer-reviewed experiments had revealed a somber truth about the human mind: Our brains are biased to believe in faulty information, and corrections only make that bias worse.

These ideas, and the buzzwords that came with them—filter bubbles, selective exposure, and the backfire effect—would be cited, again and again, as seismic forces pushing us to rival islands of belief.

Fast forward a few years:

When others tried to reproduce the research [Ian Skurnik’s vaccine research], though, they didn’t always get the same result. Kenzie Cameron, a public health researcher and communications scholar at Northwestern’s Feinberg School of Medicine, tried a somewhat similar experiment in 2009. … “We found no evidence that presenting both facts and myths is counterproductive,” Cameron concluded in her paper, which got little notice when it was published in 2013.

There have been other failed attempts to reproduce the Skurnik, Yoon, and Schwarz finding. For a study that came out last June, Briony Swire, Ullrich Ecker, and “Debunking Handbook” co-author Stephan Lewandowsky showed college undergrads several dozen statements of ambiguous veracity (e.g. “Humans can regrow the tips of fingers and toes after they have been amputated”). … But the new study found no sign of this effect.

And on science done right (well done Brendan Nyhan and Jason Reifler):

Brendan Nyhan and Jason Reifler described their study, called “When Corrections Fail,” as “the first to directly measure the effectiveness of corrections in a realistic context.” Its results were grim: When the researchers presented conservative-leaning subjects with evidence that cut against their prior points of view—that there were no stockpiled weapons in Iraq just before the U.S. invasion, for example—the information sometimes made them double down on their pre-existing beliefs. …

He [Tom Wood] and [Ethan] Porter decided to do a blow-out survey of the topic. Instead of limiting their analysis to just a handful of issues—like Iraqi WMDs, the safety of vaccines, or the science of global warming—they tried to find backfire effects across 52 contentious issues. … They also increased the sample size from the Nyhan-Reifler study more than thirtyfold, recruiting more than 10,000 subjects for their five experiments.

In spite of all this effort, and to the surprise of Wood and Porter, the massive replication effort came up with nothing. That’s not to say that Wood and Porter’s subjects were altogether free of motivated reasoning.

The people in the study did give a bit more credence to corrections that fit with their beliefs; in those situations, the new information led them to update their positions more emphatically. But they never showed the effect that made the Nyhan-Reifler paper famous: People’s views did not appear to boomerang against the facts. Among the topics tested in the new research—including whether Saddam had been hiding WMDs—not one produced a backfire.

Nyhan and Reifler, in particular, were open to the news that their original work on the subject had failed to replicate. They ended up working with Wood and Porter on a collaborative research project, which came out last summer, and again found no sign of backfire from correcting misinformation. (Wood describes them as “the heroes of this story.”) Meanwhile, Nyhan and Reifler have found some better evidence of the effect, or something like it, in other settings. And another pair of scholars, Brian Schaffner and Cameron Roche, showed something that looks a bit like backfire in a recent, very large study of how Republicans and Democrats responded to a promising monthly jobs report in 2012. But when Nyhan looks at all the evidence together, he concedes that both the prevalence and magnitude of backfire effects could have been overstated and that it will take careful work to figure out exactly when and how they come into play.

Read Engber’s full article. It covers a lot more territory, including some interesting history on how the idea spread.

I have added this to the growing catalogue of readings on my critical behavioural economics and behavioural science reading list. (Daniel Engber makes a few appearances.)