Insert controversy of choice in the blank:
For people who mistrust ___, learning the facts may make the problem worse.
That’s the tagline from “Vaccine Myth-Busting Can Backfire,” which highlights these findings:
A new study published earlier this week in the journal Vaccine found that when it comes to vaccination, a subject rife with myths and misperceptions, knowing the facts may not really be all that effective in winning over anti-vaxxers—and, in some cases, may even do more harm than good.
The study found that when people concerned about side effects of the flu shot learned that it couldn’t cause the flu, they actually became less willing to get it.
It’s a variant on the classic phenomenon of the “backfire effect”: the idea that when presented with information that contradicts their closely held beliefs, people become more convinced, not less, that they’re in the right.
What’s both interesting and more troubling about this finding is that people did update their knowledge, yet that still didn’t translate into the corresponding action:
Though the vaccine studies have yielded results subtly different from the “backfire effect”—people were willing to accept new information as true, even when it had no effect on what they did in the end—Nyhan believes that the same sort of mental gymnastics is likely at work across both areas: reactance, the psychological phenomenon in which persuading people to accept a certain idea can push them in the opposite direction.
It raises the ethical question of how educators can “first, do no harm” when teaching, if their efforts may backfire. It also underscores how crucial it is for instruction to account for learners’ identities, values, and motivations if it is to be meaningfully effective.