Connecting knowledge and action: On explaining, informing, educating, and mobilizing

Matthew Nisbet reviews the factors that influence how people understand and act upon science and policy issues in Public Opinion and Political Participation in the Climate Change Debate, a preprint of the following chapter:

Nisbet, M.C. (2011). Public Opinion and Political Participation. In D. Schlosberg, J. Dryzek, & R. Norgaard (Eds.), Oxford Handbook of Climate Change and Society. Oxford, UK: Oxford University Press.

Although he focuses on climate change, the principles he describes are more broadly relevant to communication and engagement, or to understanding and acting on new knowledge in general.

Knowledge isn’t action:

only a small proportion possess the type of opinion intensity that motivates direct participation

Information access isn’t knowledge:

the multi-tasking facilitated by hand-held devices is negatively related to learning and recall

Valuing information isn’t the same as evaluating information:

individuals are ‘cognitive misers,’ relying on personal experience, values, social influences such as friends or colleagues, personal identity, and the most readily available information

He then summarizes these influences:

  1. Schemas

    People have multiple schema[s]… which can be triggered by conversations, personal observation, and direct experience

    tailoring communication to these mental models can improve the ability of individuals and groups to reach decisions and to take actions, especially when statistical information is paired with affective, personally relevant images

  2. Values
    • Hierarchists [worry about] threats to those they respect in power, to established order in society, and to status quo practices

    • Individualists [worry about] unwise restrictions on markets, enterprise, and personal freedom.

    • [Those with] egalitarian and communitarian values [worry about] the need to manage markets and industry in favor of the collective good and to protect the most vulnerable

  3. Framing
    If the information doesn’t fit, it won’t stick.

    a specific media frame is only influential if it is relevant—or applicable—to the audience’s preexisting interpretations and schema

  4. Knowledge
    Knowing how to act matters more than knowing why.

    understanding how to take actions or to get involved on an issue [is] generally more important to decision making and behavior [than knowledge about the causes of a problem]

  5. Interpretative Communities
    Whom you know affects what you know.

    Different interpretative communities tend to prefer their own ideologically like-minded news and opinion media

There’s a slight irony here: the initiatives he describes for applying these principles to promote understanding and action seem somewhat less well developed than the principles themselves. But he does offer this guideline:

the ideal approach… [establishes] an iterative dialogue between stakeholders and experts, where the experts can explain uncertainty and the ways it is likely to be misinterpreted [and] the stakeholders in turn can explain their decision-making criteria as well as their own local knowledge

More recommendations along these lines are critical, especially considering the backfire effect. Knowing that risk discussion can undermine consensus-building should remind us to tread gently when confronting uncertainty, feelings of lack of control, and conflicting beliefs.

Misplaced critical thinking

In Physics Today’s Science controversies past and present, Steven Sherwood compares the current public response to anthropogenic climate change with the historical responses to heliocentrism and relativity. Even though theories of climate change pale in comparison to the others on the scale of scientific revolutions, he notes many fundamental similarities in their effects on people’s conception of the world. Here are some choice quotes that capture important scientific principles that tend to escape lay understanding and that may make acceptance of scientific theories more difficult.

On scientific elegance and parsimony in model comparison:

Surely, the need for a new tweak to the model each time more accurate observations came along should have been a tip-off that something fundamental was wrong.

On deduction vs. observation:

the worked-out consequences of evident physical principles rather than direct observation

A common refrain is the disparagement of new paradigms as mere theories with too little observational basis.

On the backfire effect:

Instead of quelling the debate, the confirmation of the theory and acclaim for its author had sparked an organized opposition dedicated to discrediting both theory and author.

As [confirmatory] evidence continues to accumulate… skepticism seem[s] to be growing rather than shrinking…

provocative ideas… have shattered notions that make us feel safe. That kind of change can turn people away from reason and toward emotion, especially when the ideas are pressed on them with great force.

why the backlash happens: the frailty of human reason and supremacy of emotional concerns that we humans all share but do not always acknowledge

On communicating scientific uncertainty:

“All our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.” (Einstein)

One of the most difficult yet fundamental principles of science is that we don’t and can’t know if we’re right. We can only get closer to what is probably right. Yet science is seldom conveyed or perceived that way. And what makes science so precious is its ability to show us, through inference and deduction, that which seems to contradict our casual observation and which most surprises us. This suggests caution both when employing discovery learning, as we cannot always trust ourselves to discover accurately, and when employing lecture-based instruction, as we are also unlikely to trust authoritarian telling that threatens our preferences for and sense of security about our world. Understanding the relationship between our flawed tools of reason (through cognitive science) and our imperfect tools of science (probabilistic inference, mathematical proof, model comparison) can help us learn better from both.

Sherwood, S. (2011). Science controversies past and present. Physics Today, 64(10), 39-44. http://dx.doi.org/10.1063/PT.3.1295

Science answers the question of “how,” not “what”

In “Trust Me, I’m a Scientist,” cognitive psychologist Daniel Willingham argues that the belief that improving science education would increase students’ appreciation for scientific opinion is a misconception, since “Those who know more science have only a slightly greater propensity to trust scientists.” Instead, he suggests, “A more direct approach would be to educate people about why they are prone to accept inaccurate beliefs in the first place.”

I agree with Willingham that educating people in some basic cognitive science (specifically, common fallacies of thinking) would go a long way, but I think he mischaracterizes what good science education should be. It’s not simply about the amount of content, but about an understanding of the nature of science. Science is not a collection of facts, but a way of knowing. Learning more about the history of science (whether in a history class or a science class, or both) certainly is one valuable component in providing a richer view of science. Still, it’s only part of the picture. Science education itself should incorporate a strong focus on building an understanding of how scientific knowledge is developed over time. That demands an appreciation for evaluating and quantifying how well evidence supports an explanation, and for comparing the explanatory power of competing theories.
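
To make that concrete (my own minimal sketch of one standard formalization, not something Willingham proposes), Bayesian model comparison quantifies how strongly the same evidence favors one candidate explanation over another: the evidence shifts the odds between two hypotheses by the ratio of how well each one predicted that evidence.

\[
\underbrace{\frac{P(H_1 \mid D)}{P(H_2 \mid D)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(D \mid H_1)}{P(D \mid H_2)}}_{\text{Bayes factor: what the evidence says}}
\times
\underbrace{\frac{P(H_1)}{P(H_2)}}_{\text{prior odds: what we believed beforehand}}
\]

Here H_1 and H_2 stand for two competing explanations and D for the observed evidence. A Bayes factor far from 1 indicates which explanation the evidence itself supports, separately from whatever we believed going in, which is the kind of quantitative comparison of explanatory power that science education could make explicit.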

We do still need to provide better science education—a better understanding of “how,” not “what.” It’s crucial for creating a responsible citizenry.

How science supplements cognition

Chris Mooney provides some choice excerpts from his interview with astrophysicist Neil deGrasse Tyson on this week’s Point of Inquiry:

Science exists… because the data-taking faculties of the human body are faulty. And what science does as an enterprise is provide ways to get data, acquire data from the natural world that don’t have to filter through your senses. And this ensures, or at least minimizes as far as possible, the capacity of your brain to fool itself.

If it were natural to think scientifically, science as we currently practice it would have been going on for thousands of years. But it hasn’t…. Science as we now practice it [has] been going on for no more than 400 years.

The operations of the universe can be understood through your fluency in math and science, and it’s math and science that give people the greatest challenges in the school system.

It is precisely because they are not “natural” to our thinking that math and science are such powerful tools: They enable us to overcome our natural cognitive biases.

Math and science are perhaps the greatest cultural artifacts that we have, because our appreciation of them is not innate (unlike language, music, and visual perception). Rather, our understanding of them derives from the wisdom discovered, constructed, and passed down by others.

Distinguishing science from pseudoscience

Here’s another excellent reminder of the importance of responding gently to others’ differing beliefs, in “The 10 Commandments of Helping Students Distinguish Science from Pseudoscience in Psychology”:

Gently challenge students’ beliefs with sympathy and compassion. Students who are emotionally committed to paranormal beliefs will find these beliefs difficult to question, let alone relinquish. Ridiculing these beliefs can produce reactance and reinforce students’ stereotypes of science teachers as closed-minded and dismissive.

Summary of commandments:

  1. Delineate the features that distinguish science from pseudoscience.
  2. Distinguish skepticism from cynicism.
  3. Distinguish methodological skepticism from philosophical skepticism.
  4. Distinguish pseudoscientific claims from claims that are merely false.
  5. Distinguish science from scientists.
  6. Explain the cognitive underpinnings of pseudoscientific beliefs.
  7. Remember that pseudoscientific beliefs serve important motivational functions.
  8. Expose students to examples of good science as well as to examples of pseudoscience.
  9. Be consistent in one’s intellectual standards.
  10. Distinguish pseudoscientific claims from purely metaphysical religious claims.

I think the implications of these guidelines extend well beyond psychology into the nature of science more generally, and into methods for helping the broader public evaluate the connection between belief and evidence more critically. Commandments 6 and 7 are especially valuable for describing how to do this respectfully and kindly.

When discussing risk backfires

On “More Talk, Less Agreement: Risk Discussion Can Hurt Consensus-Building on Science/Technology”:

When it comes to issues pertaining to science and technology, “talking it out” doesn’t seem to work. A new study shows that the more people discuss the risks and benefits of scientific endeavors, the more entrenched they become in their viewpoint, and the less likely they are to see the merits of opposing views.

Still more evidence on how people become more entrenched in their views upon actively considering contradictory information and perspectives. We really need to learn more about how emotion and identity influence these discussions, and develop better techniques for listening and communicating.


Binder, A.R., Scheufele, D.A., Brossard, D., & Gunther, A.C. (2010). Interpersonal amplification of risk? Citizen discussions and their impact on perceptions of risks and benefits of a biological research facility. Risk Analysis. http://dx.doi.org/10.1111/j.1539-6924.2010.01516.x

Dealing with the “scientific impotence” excuse

On “Five minutes with the discoverer of the Scientific Impotence Excuse”:

When people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings. They will often judge that the topic at hand is not amenable to scientific enquiry [and embrace] the general idea that some topics are beyond the reach of science.

Anyone who seeks to educate, inform, or influence should take note of these techniques for avoiding backfire or unwarranted discounting:

  1. Affirm people’s values first.
  2. Frame findings to be consistent with their values.
  3. Present findings in non-threatening ways.
  4. Speak with humility.
  5. Say “discover” instead of “disagree”.
  6. Decrease in-group/out-group salience.
  7. Provide an alternate target for negative emotions.
  8. Teach critical thinking and metacognition in safe settings.

What I really appreciated was the research-based guidance on how to address this resistance to scientific evidence, in the second section of the interview (as summarized above). Misunderstanding the distinction between evidence and belief contributes to the problem, but it may not be obvious how to highlight that distinction productively. As Munro pointed out, Cohen, Aronson, and Steele’s (2000) research demonstrates one way to resolve this tension, as does some of his own research (which unfortunately didn’t get cited directly in the interview). I think this is an extremely important set of findings, because it is so tempting to come down hard on those who “just don’t understand,” lecturing authoritatively and perhaps conveying frustration or even attacking their perspectives. Unfortunately, that can backfire. Instead, this research shows that a gentler approach can actually be more effective. I take heart in that.