When education makes problems worse

Insert controversy of choice in the blank:

For people who mistrust ___, learning the facts may make the problem worse.

That’s the tagline from “Vaccine Myth-Busting Can Backfire”, which highlights these findings:

A new study published earlier this week in the journal Vaccine found that when it comes to vaccination, a subject rife with myths and misperceptions, knowing the facts may not really be all that effective in winning over anti-vaxxers—and, in some cases, may even do more harm than good.

The study found that when people concerned about side effects of the flu shot learned that it couldn’t cause the flu, they actually became less willing to get it.

It’s a variant on the classic phenomenon of

the “backfire effect,” or the idea that when presented with information that contradicts their closely-held beliefs, people will become more convinced, not less, that they’re in the right.

What’s both interesting and even more troubling about this finding is that people did change their knowledge, but that change still didn’t translate into corresponding action:

Though the vaccine studies have yielded results subtly different from the “backfire effect”—people were willing to accept new information as true, even when it had no effect on what they did in the end—Nyhan believes that the same sort of mental gymnastics is likely at work across both areas: reactance, the psychological phenomenon in which persuading people to accept a certain idea can push them in the opposite direction.

It raises the ethical question of how educators should “first, do no harm” when teaching, if their efforts may backfire. It also highlights how crucial it is for instruction to account for learners’ identities, values, and motivations in order to be meaningfully effective.

Don’t Just Acknowledge Bias, Demand Its Reduction

From Adam Grant and Sheryl Sandberg’s “When Talking About Bias Backfires”:

The assumption is that when people realize that biases are widespread, they will be more likely to overcome them. But new research suggests that if we’re not careful, making people aware of bias can backfire, leading them to discriminate more rather than less.

The key is to conclude the acknowledgment of the problem with clear disapproval of it, such as:

“Please don’t remove the petrified wood from the park.”

“A vast majority of people try to overcome their stereotypic preconceptions.”

“I don’t ever want to see this happen again.”

Social cues can be powerful; use them well.

Less identity, more ideas

Once again, “we would all benefit from more meaningful interaction and less labeling… along any dimension by which we divide humanity.”

From Tom Jacobs’s “America’s Increasingly Tribal Electorate”, describing political scientist Lilliana Mason’s research:

“behavioral polarization”—anger at the other side, activism for one’s own side, and a tendency to look at political arguments through a biased lens—is driven much more strongly by that sense of team spirit, as opposed to one’s views on public policy.

According to her:

the only way to reduce the anger and bias would be “to reduce the strength or alignment of political identities.”

Yet I remain hopeful that, in spite of the dangers of the backfire effect, we can find ways to separate ideas from identities and to share knowledge dispassionately and compassionately at the same time. As before: “Most of all, we should put wrongness back in its place: linked to the idea, not the person,” or the identity.

Misplaced critical thinking

In Physics Today’s “Science controversies past and present”, Steven Sherwood compares the current public response to anthropogenic climate change to the historical responses to heliocentrism and relativity. Even though theories of climate change pale in comparison to the others on the scale of scientific revolutions, he notes many fundamental similarities in their effects on people’s conception of the world. Here are some choice quotes that capture important scientific principles which tend to escape lay understanding and which may make acceptance of scientific theories more difficult. On scientific elegance and parsimony in model comparison:

Surely, the need for a new tweak to the model each time more accurate observations came along should have been a tip-off that something fundamental was wrong.

On deduction vs. observation:

the worked-out consequences of evident physical principles rather than direct observation

A common refrain is the disparagement of new paradigms as mere theories with too little observational basis.

On the backfire effect:

Instead of quelling the debate, the confirmation of the theory and acclaim for its author had sparked an organized opposition dedicated to discrediting both theory and author.

As [confirmatory] evidence continues to accumulate… skepticism seem[s] to be growing rather than shrinking…

provocative ideas… have shattered notions that make us feel safe. That kind of change can turn people away from reason and toward emotion, especially when the ideas are pressed on them with great force.

why the backlash happens: the frailty of human reason and supremacy of emotional concerns that we humans all share but do not always acknowledge

On communicating scientific uncertainty:

“All our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.” (Einstein)

One of the most difficult yet fundamental principles of science is that we don’t and can’t know if we’re right. We can only get closer to what is probably right. Yet science is seldom conveyed or perceived that way. And what makes science so precious is its ability to show us, through inference and deduction, that which seems to contradict our casual observation and which most surprises us. This suggests caution both when employing discovery learning, as we cannot always trust ourselves to discover accurately, and when employing lecture-based instruction, as we are also unlikely to trust authoritarian telling that threatens our preferences for and sense of security about our world. Understanding the relationship between our flawed tools of reason—through cognitive science—and our imperfect tools of science—probabilistic inference, mathematical proof, model comparison—can help us learn better from both.

Sherwood, S. (2011). Science controversies past and present. Physics Today, 64(10), 39-44. http://dx.doi.org/10.1063/PT.3.1295

Distinguishing science from pseudoscience

Here’s another excellent reminder of the importance of responding gently to others’ different beliefs, in “The 10 Commandments of Helping Students Distinguish Science from Pseudoscience in Psychology”:

Gently challenge students’ beliefs with sympathy and compassion. Students who are emotionally committed to paranormal beliefs will find these beliefs difficult to question, let alone relinquish. Ridiculing these beliefs can produce reactance and reinforce students’ stereotypes of science teachers as closed-minded and dismissive.

Summary of commandments:

  1. Delineate the features that distinguish science from pseudoscience.
  2. Distinguish skepticism from cynicism.
  3. Distinguish methodological skepticism from philosophical skepticism.
  4. Distinguish pseudoscientific claims from claims that are merely false.
  5. Distinguish science from scientists.
  6. Explain the cognitive underpinnings of pseudoscientific beliefs.
  7. Remember that pseudoscientific beliefs serve important motivational functions.
  8. Expose students to examples of good science as well as to examples of pseudoscience.
  9. Be consistent in one’s intellectual standards.
  10. Distinguish pseudoscientific claims from purely metaphysical religious claims.

I think the implications of these guidelines extend well beyond psychology into the nature of science more generally, and into methods for helping the broader public evaluate the connection between belief and evidence more critically. Guidelines #6 and #7 are especially valuable for describing how to do this respectfully and kindly.

When discussing risk backfires

On “More Talk, Less Agreement: Risk Discussion Can Hurt Consensus-Building on Science/Technology”:

When it comes to issues pertaining to science and technology, “talking it out” doesn’t seem to work. A new study shows that the more people discuss the risks and benefits of scientific endeavors, the more entrenched they become in their viewpoint, and the less likely they are to see the merits of opposing views.

Still more evidence on how people become more entrenched in their views upon actively considering contradictory information and perspectives. We really need to learn more about how emotion and identity influence these discussions, and develop better techniques for listening and communicating.


Binder, A. R., Scheufele, D. A., Brossard, D., & Gunther, A. C. (2010). Interpersonal amplification of risk? Citizen discussions and their impact on perceptions of risks and benefits of a biological research facility. Risk Analysis. DOI: 10.1111/j.1539-6924.2010.01516.x

Dealing with the “scientific impotence” excuse

On “Five minutes with the discoverer of the Scientific Impotence Excuse”:

When people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings. They will often judge that the topic at hand is not amenable to scientific enquiry [and embrace] the general idea that some topics are beyond the reach of science.

Anyone who seeks to educate, inform, or influence should take note of these techniques to avoid backfire or unwarranted discounting:

  1. Affirm people’s values first.
  2. Frame findings to be consistent with their values.
  3. Present findings in non-threatening ways.
  4. Speak with humility.
  5. Say “discover” instead of “disagree”.
  6. Decrease in-group/out-group salience.
  7. Provide an alternate target for negative emotions.
  8. Teach critical thinking and metacognition in safe settings.

What I really appreciated was the research-based guidance for how to address this resistance to scientific evidence, in the second section of the interview (as summarized above). Misunderstanding the distinction between evidence and belief contributes to the problem, but it may not be so obvious how to highlight that distinction productively. As Munro pointed out, Cohen, Aronson, and Steele’s (2000) research demonstrates one way to resolve this tension, as does some of his own research (which unfortunately didn’t get cited directly in the interview). I think this is an extremely important set of findings because it’s so tempting for people to come down hard on those who “just don’t understand,” lecturing authoritatively and perhaps conveying frustration or even attacking their perspectives.  Unfortunately, that can backfire. Instead, this research shows that a gentler approach can actually be more effective. I take heart in that.

Difficulties of accommodating discrepant information

On “The Wrong Stuff – Reasonable Doubt: Innocence Project Co-Founder Peter Neufeld on Being Wrong”:

I think generally speaking it’s difficult for people to admit they’re wrong, and the higher the stakes, the more difficult it becomes. So what you really want to do is educate people that it’s OK to be wrong. It doesn’t mean you’re a fool. It’s not going to be the end of your life.

There are high social costs to being wrong, and creating a culture that values thoughtfulness and humility rather than tenacity may alleviate this phenomenon. (Ironically, one might expect this to be worse in a collectivist culture, where there could be more shame, surprise, or negative attention attached to retracting publicly stated beliefs. In contrast, individualistic cultures that celebrate different ideas might be more tolerant of changing one’s mind.)

But I think there are high cognitive and metacognitive costs to being wrong as well. Part of it could be a consequence of generating a hypothesis or belief, akin to the dangers of convincing oneself of the correctness of a guess (e.g., when taking a pretest). The more a person articulates or mentally rehearses an idea, the more s/he becomes committed to it (i.e., strengthens the memory trace, elaborates on potential explanations, draws connections to prior knowledge).

Further, someone whose self-concept is strongly linked to having the right answers might feel more threatened by realizing s/he made an error. And someone who thinks that intelligence is knowing facts rather than exercising good reasoning would probably be more disturbed by having to acknowledge getting the facts wrong.

So what does this suggest? Perhaps we should encourage more tentativeness and skepticism, an appreciation of the probabilistic nature of knowledge, comfort with staking cautious claims. Maybe we should ask people to propose multiple conditional hypotheses instead of single predictions. And most of all, we should put wrongness back in its place: linked to the idea, not the person.

How facts backfire

On “How facts backfire”:

Researchers discover a surprising threat to democracy: our brains

This has profound implications for educating the general populace. I’ve actually just been pondering the ethics of educating people up to (down to?) the trough of the U-shaped curve of learning and development.

Lately I’ve found myself coming back to Strike & Posner’s “intelligible, plausible, and fruitful” criteria for conceptual change. If our target audience doesn’t perceive these new ideas to be fruitful, they’ll have no motivation to change.

I’ve also been thinking of all these ways in which a little (or a lot) of knowledge can make learning harder: backfire, U-shaped development, expert blind spot, information overload. I’ll probably think of more to add to the list later. Given the considerable risks of this happening through so many different mechanisms, how can we equip learners against them? It seems that some of the answers may lie in influencing the learner’s affective, motivational, and metacognitive states: making errors and belief change nonthreatening, incentivizing accurate information and valid reasoning, and developing an understanding of these cognitive errors. But I’m still concerned about learners for whom this doesn’t succeed and who are then left worse off than when they began.