Connecting knowledge and action: On explaining, informing, educating, and mobilizing

In Public Opinion and Political Participation in the Climate Change Debate, a preprint of his chapter, Matthew Nisbet reviews the factors that influence how people understand and act upon science and policy issues:

Nisbet, M.C. (2011). Public Opinion and Political Participation. In D. Schlosberg, J. Dryzek, & R. Norgaard (Eds.), Oxford Handbook of Climate Change and Society. Oxford, UK: Oxford University Press.

Although he focuses on climate change, the principles he describes are more broadly relevant to communication and engagement, or to understanding and acting on new knowledge in general.

Knowledge isn’t action:

only a small proportion possess the type of opinion intensity that motivates direct participation

Information access isn’t knowledge:

the multi-tasking facilitated by hand-held devices is negatively related to learning and recall

Valuing information isn’t the same as evaluating information:

individuals are ‘cognitive misers,’ relying on personal experience, values, social influences such as friends or colleagues, personal identity, and the most readily available information

He then summarizes these influences:

  1. Schemas

    People have multiple schema[s]… which can be triggered by conversations, personal observation, and direct experience

    tailoring communication to these mental models can improve the ability of individuals and groups to reach decisions and to take actions, especially when statistical information is paired with affective, personally relevant images

  2. Values
    • Hierarchists [worry about] threats to those they respect in power, to established order in society, and to status quo practices

    • Individualists [worry about] unwise restrictions on markets, enterprise, and personal freedom.

    • [Those with] egalitarian and communitarian values [worry about] the need to manage markets and industry in favor of the collective good and to protect the most vulnerable

  3. Framing
    If the information doesn’t fit, it won’t stick.

    a specific media frame is only influential if it is relevant—or applicable—to the audience’s preexisting interpretations and schema

  4. Knowledge
Knowing how to act matters more than knowing why to act.

    understanding how to take actions or to get involved on an issue [is] generally more important to decision making and behavior [than knowledge about the causes of a problem]

  5. Interpretative Communities
    Whom you know affects what you know.

    Different interpretative communities tend to prefer their own ideologically like-minded news and opinion media

There’s a slight irony in the fact that the initiatives he describes for applying these principles to promote understanding and action seem less well developed than the principles themselves. But he does offer this guideline:

the ideal approach… [establishes] an iterative dialogue between stakeholders and experts, where the experts can explain uncertainty and the ways it is likely to be misinterpreted [and] the stakeholders in turn can explain their decision-making criteria as well as their own local knowledge

More recommendations along these lines are critical, especially in light of the backfire effect. Knowing that discussing risk can backfire and undermine consensus-building should remind us to tread gently when confronting uncertainty, feelings of lost control, and conflicting beliefs.

Misplaced critical thinking

In Physics Today’s Science controversies past and present, Steven Sherwood compares the current public response to anthropogenic climate change to the historical responses to heliocentrism and relativity. Even though theories of climate change pale in comparison to the others on the scale of scientific revolutions, he notes many fundamental similarities in their effects on people’s conception of the world. Here are some choice quotes that capture important scientific principles which tend to escape lay understanding and which may make acceptance of scientific theories more difficult.

On scientific elegance and parsimony in model comparison:

Surely, the need for a new tweak to the model each time more accurate observations came along should have been a tip-off that something fundamental was wrong.

On deduction vs. observation:

the worked-out consequences of evident physical principles rather than direct observation

A common refrain is the disparagement of new paradigms as mere theories with too little observational basis.

On the backfire effect:

Instead of quelling the debate, the confirmation of the theory and acclaim for its author had sparked an organized opposition dedicated to discrediting both theory and author.

As [confirmatory] evidence continues to accumulate… skepticism seem[s] to be growing rather than shrinking…

provocative ideas… have shattered notions that make us feel safe. That kind of change can turn people away from reason and toward emotion, especially when the ideas are pressed on them with great force.

why the backlash happens: the frailty of human reason and supremacy of emotional concerns that we humans all share but do not always acknowledge

On communicating scientific uncertainty:

“All our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.” (Einstein)

One of the most difficult yet fundamental principles of science is that we don’t and can’t know whether we’re right; we can only get closer to what is probably right. Yet science is seldom conveyed or perceived that way. And what makes science so precious is its ability to show us, through inference and deduction, that which seems to contradict our casual observation and which most surprises us. This suggests caution both when employing discovery learning, since we cannot always trust ourselves to discover accurately, and when employing lecture-based instruction, since we are also unlikely to trust authoritarian telling that threatens our preferences for, and sense of security about, our world. Understanding the relationship between our flawed tools of reason (as revealed by cognitive science) and our imperfect tools of science (probabilistic inference, mathematical proof, model comparison) can help us learn better from both.

Sherwood, S. (2011). Science controversies past and present. Physics Today, 64(10), 39-44. http://dx.doi.org/10.1063/PT.3.1295