When A Correction May Not Be Helpful
As information—and misinformation—about the new coronavirus outbreak continues to spread, how can public health officials ensure that correct information is out there? New work relating to messages about the Zika virus and yellow fever published this week in the journal Science Advances indicates that delivering accurate messaging may be harder than you think.
The researchers studied how participants in two online surveys responded to ‘corrective information’ about Zika—facts written to challenge incorrect ideas about how the Zika virus was transmitted and its effects. They found that the corrective campaigns didn’t significantly reduce misconceptions about Zika—and, in fact, may have made it less likely for the participants to believe correct information. The results, the researchers say, are consistent with a “tainted truth” effect, in which being told that specific information is unreliable can reduce trust in related information that wasn’t included in the warning at all.
Brendan Nyhan, a professor of government at Dartmouth College and one of the authors of the report, joins Ira to talk about the study and what lessons it might hold for educating people about other public health risks.
Brendan Nyhan is a professor of government at Dartmouth College in Hanover, New Hampshire.
IRA FLATOW: Now it’s time to play Good Thing, Bad Thing. Because every story has a flip side. And as information and misinformation about the new coronavirus outbreak continues to spread, how can public health officials ensure that correct information, the correct stuff, is out there?
It turns out it may be harder than you think. Writing this week in the journal Science Advances, researchers say that providing corrective information to counter misconceptions can sometimes have unintended consequences. Brendan Nyhan is one of the authors of this study. He’s a professor of government at Dartmouth College. Welcome back to Science Friday, Brendan.
BRENDAN NYHAN: Thanks for having me.
IRA FLATOW: All right, let’s talk about some of the misconceptions that people have in your study. What did they have about the Zika virus?
BRENDAN NYHAN: Well, we looked at three specific misperceptions that had been identified as being prevalent in the area. One was that genetically modified mosquitoes were spreading the outbreak. And there were two about the causes of the seeming surge in cases of microcephaly among infants. One story was that it was actually caused by larvicides and another that it was actually caused by vaccines. When in both cases, actually, it seems to have been linked to the Zika virus itself.
IRA FLATOW: Now, you tried to counter these misconceptions with corrective information. What happened?
BRENDAN NYHAN: That’s right. My co-authors and I wanted to see if providing corrective information in the form that the World Health Organization presented on its own website would be effective. People had said in reporting from the region that rumors and misconceptions seem to be hindering the response to the outbreak.
So what would happen if we debunked those rumors and told people actually there’s no evidence for any of these claims? There’s no evidence that genetically modified mosquitoes are responsible for the outbreak; instead, it’s the regular mosquitoes that are endemic in the region. There’s no evidence that vaccines or larvicides cause microcephaly. Would that change their minds about these factual beliefs, and might that, in turn, cause them to be more likely to take measures to protect themselves or to support public policies that would limit the spread of the disease?
IRA FLATOW: So if– did it have just the opposite effect of what you were hoping?
BRENDAN NYHAN: Well, the results were disappointing. We found generally no measurable effect on people’s levels of misperception. So the corrective information didn’t seem to be effective at reducing those misperceptions. And then, even more surprisingly– and this is the bad news that you warned your listeners about– we found some spillover effects on other factual beliefs people had about Zika. We asked in our survey– and then we repeated it again to be more confident we were right– a number of other factual belief questions about Zika, claims people had made about Zika that were either true or false that weren’t covered in the corrective information we provided.
And what we found was, for a number of those factual beliefs, people’s beliefs actually became less accurate when they were exposed to that corrective information. So it seemed to have this kind of negative spillover effect, where people were becoming more skeptical of other beliefs they held about Zika. In the process, their beliefs were becoming less accurate, because many of those beliefs they were questioning were actually true.
IRA FLATOW: Wow. What do you attribute this to? Are people just sort of giving up on knowing the truth?
BRENDAN NYHAN: Well, our worry is that people are becoming more skeptical of all the beliefs they hold. This is something called the tainted truth effect. When I’ve heard that these sources that I get information from might have told me something that isn’t true, I come to be more skeptical of all the things that I’ve heard. And in the process, I might not just disbelieve false information I’ve heard, I might disbelieve true information I’ve heard, too. You know, this is something I worry about with all the warnings in the United States about fake news, that people may actually apply those indiscriminately and come to disbelieve reputable news, as well as untrustworthy news.
IRA FLATOW: You think?
BRENDAN NYHAN: Not your show though, I promise.
IRA FLATOW: No, it’s true. We deal in the research, but you have real research showing that people are going to give up on the truth and just not believe anything.
BRENDAN NYHAN: Well, it’s not that they don’t believe anything. But it does seem to move people in the wrong direction. And I think this should make us reexamine how we communicate about emerging diseases. There is a real challenge going on right now with risk communication. And sometimes people assume if we just tell people the truth about the misinformation they’ve heard, that will turn them around, not just in their factual beliefs but in their attitudes and policy preferences. And that’s not always the case.
IRA FLATOW: We found that out when we tried to talk about vaccination; it didn’t help. You know? Vaccination is tough. But that’s another show. OK. Brendan, thanks for coming by and confirming what we were fearing. Brendan Nyhan, professor of government at Dartmouth College.