Since the late 1990s, doctors and public health officials have been battling the belief, fueled by a discredited study, that vaccines can cause autism. They’ve tried to present the facts about vaccine safety and highlight the risks of the diseases the shots prevent. Once parents had accurate information, they would surely choose to vaccinate, they reasoned.
New research suggests they may need a new strategy. A study by Dartmouth political scientist Brendan Nyhan, published in the journal Pediatrics, found that educating parents about vaccines failed to persuade skeptics to vaccinate their children; among the parents most opposed to vaccination, education actually made them less likely to do so.
Researchers tested four messages encouraging parents to vaccinate kids against measles, mumps, and rubella—from explaining that there’s no evidence linking vaccines to autism to showing pictures of children suffering from the diseases that the vaccine prevents.
The interventions, presented online to 1,759 U.S. parents, initially looked promising. When the most anti-vaccine parents read the message debunking the autism link, it worked, somewhat: in follow-up questions, they became less likely to agree that "vaccines cause autism."
But when researchers asked these parents whether they’d give vaccines to another child, the number who said they would actually decreased. In other words, when vaccine skeptics heard the public health evidence, they understood that their fears about vaccines causing autism were unwarranted. And then, in greater numbers, they said they still wouldn’t vaccinate future kids.
Two of the other interventions Nyhan’s team tested also appeared to have unintended consequences. Looking at pictures of kids sick with measles, mumps, and rubella seemed to increase people’s belief that vaccines cause autism. And when the most anti-vaccine parents read the story of a mother whose infant, too young to be vaccinated, nearly died of measles, the intervention seemed to increase the belief that vaccines have bad side effects.
Why do these messages seem to backfire? Nyhan’s study doesn’t speculate, but NPR’s Shankar Vedantam floated a theory:
What Nyhan seems to be finding is that when you’re confronted by information that you don’t like … it damages your sense of self-esteem. It damages something about your identity. And so what you do is you fight back against the new information. You try and marshal other kinds of information that would counter the new information coming in.
The study echoes what Pennsylvania State University researchers found in an analysis of Twitter messages about the H1N1 (swine flu) vaccine during the 2009 pandemic: “People who were exposed to a lot of pro-vaccine messages—those whose social networks were basically telling them to get their flu shots—were more likely to come out with the opposite view,” I wrote last year.
The stakes for getting people vaccinated are high, as Britain’s resurgent measles epidemic shows. And though we shouldn’t extrapolate too much from vaccine attitudes, it’s not hard to see how the dynamic Nyhan describes might play out in other charged public conversations. Think about genetically modified crops, which scientists find no riskier than conventionally grown foods but which still face fervent opposition. Or climate change, where one-third of Americans remain unconvinced that global warming is real, despite the scientific consensus.
The thought that pro-vaccine messages are not only ineffective but may actually backfire should make doctors and scientists (and those of us who write about their work) stop and think about how we communicate. There’s a great distance between the laboratory and the kitchen tables of parents who genuinely care about their kids. And it doesn’t seem like we’re bridging it.