Letting children choose promotes prosocial behavior

As described in “Giving Preschoolers Choice Increases Sharing Behavior”:

[S]haring when given a difficult choice leads children to see themselves in a new, more beneficent light. Perceiving themselves as people who like to share makes them more likely to act in a prosocial manner in the future.

Previous research on the overjustification effect has shown how this same idea explains why rewarding children for sharing can backfire: children come to perceive themselves as people who don’t like to share, since they had to be rewarded for doing so. Because they don’t view themselves as “sharers,” they are less likely to share in the future.

Developmental psychologists Nadia Chernyak and Tamar Kushnir found that compared to children who were given a non-costly choice or who were required to share, preschoolers given a costly choice were more likely to share again at a subsequent opportunity.

My thoughts:

  1. I would be interested in an analysis comparing the effect of the conditions on the children who did not share; that is, collecting baseline data on children’s initial propensity to share, and then comparing how the interventions affected them across the range of initial tendencies (see the sketch after this list).
  2. I wonder how well this would apply to long-term planning and diligence (e.g., completing homework, practicing a difficult skill, doing chores).
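
For concreteness, here is a minimal sketch of the kind of moderation analysis described in the first point: regress later sharing on condition, baseline sharing propensity, and their interaction, and ask whether the benefit of the costly-choice condition varies with children’s initial tendency to share. The data, column names, and effect sizes below are hypothetical illustrations, not values from the study.

    # Minimal sketch of a condition-by-baseline moderation analysis.
    # All data here are simulated; nothing comes from Chernyak & Kushnir (2013).
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 90
    df = pd.DataFrame({
        # hypothetical conditions mirroring the study's design
        "condition": rng.choice(
            ["costly_choice", "noncostly_choice", "required"], size=n
        ),
        # hypothetical baseline measure: initial propensity to share (0 to 1)
        "baseline_share": rng.uniform(0, 1, size=n),
    })
    # simulated outcome: sharing at the later opportunity (illustration only)
    df["later_sharing"] = (
        0.3 * df["baseline_share"]
        + 0.2 * (df["condition"] == "costly_choice")
        + rng.normal(0, 0.1, size=n)
    )

    # Does the effect of condition depend on initial propensity to share?
    model = smf.ols("later_sharing ~ C(condition) * baseline_share", data=df).fit()
    print(model.summary())

The interaction terms in the fitted model are what would answer the question: a reliable condition-by-baseline interaction would indicate that the choice manipulation works differently for children who start out more or less inclined to share.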

The results do suggest that choice can be a powerful mechanism for promoting positive habits and attitudes, something which I think parents and schools could harness more productively. That choice can potentially foster empathy and perspective-taking is very encouraging.


Full reference: Chernyak, N., & Kushnir, T. (2013). Giving Preschoolers Choice Increases Sharing Behavior. Psychological Science. DOI: 10.1177/0956797613482335

Less identity, more ideas

Once again, “we would all benefit from more meaningful interaction and less labeling… along any dimension by which we divide humanity.”

From Tom Jacobs’s “America’s Increasingly Tribal Electorate”, describing political scientist Lilliana Mason’s research:

“behavioral polarization”—anger at the other side, activism for one’s own side, and a tendency to look at political arguments through a biased lens—is driven much more strongly by that sense of team spirit, as opposed to one’s views on public policy.

According to her:

the only way to reduce the anger and bias would be “to reduce the strength or alignment of political identities.”

Yet I remain hopeful that, in spite of the dangers of the backfire effect, we can find ways to separate ideas from identities, and share knowledge both dispassionately and compassionately at the same time. As before: “Most of all, we should put wrongness back in its place: linked to the idea, not the person,” or the identity.

Connecting knowledge and action: On explaining, informing, educating, and mobilizing

In Public Opinion and Political Participation in the Climate Change Debate, a preprint of his chapter, Matthew Nisbet reviews the factors influencing how people understand and act upon science and policy issues:

Nisbet, M.C. (2011). Public Opinion and Political Participation. In D. Schlosberg, J. Dryzek, & R. Norgaard (Eds.), Oxford Handbook of Climate Change and Society. London, UK: Oxford University Press.

Although he focuses on climate change, the principles he describes are more broadly relevant to communication and engagement, or to understanding and acting on new knowledge in general.

Knowledge isn’t action:

only a small proportion possess the type of opinion intensity that motivates direct participation

Information access isn’t knowledge:

the multi-tasking facilitated by hand-held devices is negatively related to learning and recall

Valuing information isn’t the same as evaluating information:

individuals are ‘cognitive misers,’ relying on personal experience, values, social influences such as friends or colleagues, personal identity, and the most readily available information

He then summarizes these influences:

  1. Schemas

    People have multiple schema[s]… which can be triggered by conversations, personal observation, and direct experience

    tailoring communication to these mental models can improve the ability of individuals and groups to reach decisions and to take actions, especially when statistical information is paired with affective, personally relevant images

  2. Values
    • Hierarchists [worry about] threats to those they respect in power, to established order in society, and to status quo practices

    • Individualists [worry about] unwise restrictions on markets, enterprise, and personal freedom.

    • [Those with] egalitarian and communitarian values [worry about] the need to manage markets and industry in favor of the collective good and to protect the most vulnerable

  3. Framing
    If the information doesn’t fit, it won’t stick.

    a specific media frame is only influential if it is relevant—or applicable—to the audience’s preexisting interpretations and schema

  4. Knowledge
    Knowing how to act matters more than knowing why action is needed.

    understanding how to take actions or to get involved on an issue [is] generally more important to decision making and behavior [than knowledge about the causes of a problem]

  5. Interpretative Communities
    Whom you know affects what you know.

    Different interpretative communities tend to prefer their own ideologically like-minded news and opinion media

There’s a slight irony here: the initiatives he describes for applying these principles to promote understanding and action seem less well developed than the principles themselves. But he does offer this guideline:

the ideal approach… [establishes] an iterative dialogue between stakeholders and experts, where the experts can explain uncertainty and the ways it is likely to be misinterpreted [and] the stakeholders in turn can explain their decision-making criteria as well as their own local knowledge

More recommendations along these lines are critical, especially considering the backfire effect. Knowing that risk discussion can backfire on building consensus should remind us to tread gently when confronting uncertainty, feelings of lack of control, and conflicting beliefs.

Misplaced critical thinking

In Physics Today’s “Science controversies past and present,” Steven Sherwood compares the current public response to anthropogenic climate change with the historical responses to heliocentrism and relativity. Even though theories of climate change pale in comparison to the others on the scale of scientific revolutions, he notes many fundamental similarities in their effects on people’s conception of the world. Here are some choice quotes that capture important scientific principles which tend to escape lay understanding and which may make acceptance of scientific theories more difficult.

On scientific elegance and parsimony in model comparison:

Surely, the need for a new tweak to the model each time more accurate observations came along should have been a tip-off that something fundamental was wrong.

On deduction vs. observation:

the worked-out consequences of evident physical principles rather than direct observation

A common refrain is the disparagement of new paradigms as mere theories with too little observational basis.

On the backfire effect:

Instead of quelling the debate, the confirmation of the theory and acclaim for its author had sparked an organized opposition dedicated to discrediting both theory and author.

As [confirmatory] evidence continues to accumulate… skepticism seem[s] to be growing rather than shrinking…

provocative ideas… have shattered notions that make us feel safe. That kind of change can turn people away from reason and toward emotion, especially when the ideas are pressed on them with great force.

why the backlash happens: the frailty of human reason and supremacy of emotional concerns that we humans all share but do not always acknowledge

On communicating scientific uncertainty:

“All our science, measured against reality, is primitive and childlike—and yet it is the most precious thing we have.” (Einstein)

One of the most difficult yet fundamental principles of science is that we don’t and can’t know if we’re right. We can only get closer to what is probably right. Yet science is seldom conveyed or perceived that way. And what makes science so precious is its ability to show us, through inference and deduction, that which seems to contradict our casual observation and which most surprises us.

This suggests caution both when employing discovery learning, as we cannot always trust ourselves to discover accurately, and when employing lecture-based instruction, as we are also unlikely to trust authoritarian telling that threatens our preferences for and sense of security about our world. Understanding the relationship between our flawed tools of reason—through cognitive science—and our imperfect tools of science—probabilistic inference, mathematical proof, model comparison—can help us learn better from both.

Sherwood, S. (2011). Science controversies past and present. Physics Today, 64(10), 39-44. DOI: http://dx.doi.org/10.1063/PT.3.1295

From positive self-esteem to positive other-esteem and learning

Dealing with differences needs to be encouraged gently, whether with ideas or with people.

As described in “People with Low Self-Esteem Show More Signs of Prejudice”[1]:

When people are feeling bad about themselves, they’re more likely to show bias against people who are different. …People who feel bad about themselves show enhanced prejudice because negative associations are activated to a greater degree, but not because they are less likely to suppress those feelings.

The connection between low self-esteem and negative expectations reminds me of related research on the impact of a value-affirming writing exercise in improving the academic performance of minority students:

From “Simple writing exercise helps break vicious cycle that holds back black students”[2]:

In 2007, [Geoffrey Cohen from the University of Colorado] showed that a simple 15-minute writing exercise at the start of a school year could boost the grades of black students by the end of the semester. The assignment was designed to boost the student’s sense of self-worth, and in doing so, it helped to narrow the typical performance gap that would normally separate them from white students.

After two years, the black students earned higher GPAs if they wrote self-affirming pieces on themselves rather than irrelevant essays about other people or their daily routines. On average, the exercises raised their GPA by a quarter of a point.

And from “15-minute writing exercise closes the gender gap in university-level physics”[3]:

Think about the things that are important to you. Perhaps you care about creativity, family relationships, your career, or having a sense of humour. Pick two or three of these values and write a few sentences about why they are important to you. You have fifteen minutes. …

In a university physics class, Akira Miyake from the University of Colorado used [this writing exercise] to close the gap between male and female performance. … With nothing but his fifteen-minute exercise, performed twice at the beginning of the year, he virtually abolished the gender divide and allowed the female physicists to challenge their male peers.

Helping people feel better about themselves seems like an obvious, “everybody-wins” approach to improving education, social relations, and accepting different ideas.


[1] Allen, T.J., & Sherman, J.W. (2011). Ego Threat and Intergroup Bias: A Test of Motivated-Activation Versus Self-Regulatory Accounts. Psychological Science. DOI: http://dx.doi.org/10.1177/0956797611399291

[2] Cohen, G.L., Garcia, J., Purdie-Vaughns, V., Apfel, N., & Brzustoski, P. (2009). Recursive Processes in Self-Affirmation: Intervening to Close the Minority Achievement Gap. Science, 324(5925), 400-403. DOI: http://dx.doi.org/10.1126/science.1170769

[3] Miyake, A., Kost-Smith, L.E., Finkelstein, N.D., Pollock, S.J., Cohen, G.L., & Ito, T.A. (2010). Reducing the Gender Achievement Gap in College Science: A Classroom Study of Values Affirmation. Science, 330(6008), 1234-1237. DOI: http://dx.doi.org/10.1126/science.1195996

Distinguishing science from pseudoscience

Here’s another excellent reminder of the importance of responding to others’ different beliefs gently, in “The 10 Commandments of Helping Students Distinguish Science from Pseudoscience in Psychology”:

Gently challenge students’ beliefs with sympathy and compassion. Students who are emotionally committed to paranormal beliefs will find these beliefs difficult to question, let alone relinquish. Ridiculing these beliefs can produce reactance and reinforce students’ stereotypes of science teachers as closed-minded and dismissive.

Summary of commandments:

  1. Delineate the features that distinguish science from pseudoscience.
  2. Distinguish skepticism from cynicism.
  3. Distinguish methodological skepticism from philosophical skepticism.
  4. Distinguish pseudoscientific claims from claims that are merely false.
  5. Distinguish science from scientists.
  6. Explain the cognitive underpinnings of pseudoscientific beliefs.
  7. Remember that pseudoscientific beliefs serve important motivational functions.
  8. Expose students to examples of good science as well as to examples of pseudoscience.
  9. Be consistent in one’s intellectual standards.
  10. Distinguish pseudoscientific claims from purely metaphysical religious claims.

I think the implications of these guidelines extend well beyond psychology into the nature of science more generally, and into methods for helping the broader public evaluate the connection between belief and evidence more critically. Guidelines #6 and #7 are especially valuable for describing how to do this respectfully and kindly.

When discussing risk backfires

On “More Talk, Less Agreement: Risk Discussion Can Hurt Consensus-Building on Science/Technology”:

When it comes to issues pertaining to science and technology, “talking it out” doesn’t seem to work. A new study shows that the more people discuss the risks and benefits of scientific endeavors, the more entrenched they become in their viewpoint, and the less likely they are to see the merits of opposing views.

Still more evidence on how people become more entrenched in their views upon actively considering contradictory information and perspectives. We really need to learn more about how emotion and identity influence these discussions, and develop better techniques for listening and communicating.


Binder, A.R., Scheufele, D.A., Brossard, D., & Gunther, A.C. (2010). Interpersonal Amplification of Risk? Citizen Discussions and Their Impact on Perceptions of Risks and Benefits of a Biological Research Facility. Risk Analysis. DOI: 10.1111/j.1539-6924.2010.01516.x

Dealing with the “scientific impotence” excuse

On “Five minutes with the discoverer of the Scientific Impotence Excuse”:

When people are faced with scientific research that clashes with their personal view, they invoke a range of strategies to discount the findings. They will often judge that the topic at hand is not amenable to scientific enquiry [and embrace] the general idea that some topics are beyond the reach of science.

Anyone who seeks to educate, inform, or influence should take note of these techniques for avoiding backfire or unwarranted discounting:

  1. Affirm people’s values first.
  2. Frame findings to be consistent with their values.
  3. Present findings in non-threatening ways.
  4. Speak with humility.
  5. Say “discover” instead of “disagree”.
  6. Decrease in-group/out-group salience.
  7. Provide an alternate target for negative emotions.
  8. Teach critical thinking and metacognition in safe settings.

What I really appreciated was the research-based guidance for how to address this resistance to scientific evidence, in the second section of the interview (as summarized above). Misunderstanding the distinction between evidence and belief contributes to the problem, but it may not be so obvious how to highlight that distinction productively. As Munro pointed out, Cohen, Aronson, and Steele’s (2000) research demonstrates one way to resolve this tension, as does some of his own research (which unfortunately didn’t get cited directly in the interview). I think this is an extremely important set of findings because it’s so tempting for people to come down hard on those who “just don’t understand,” lecturing authoritatively and perhaps conveying frustration or even attacking their perspectives.  Unfortunately, that can backfire. Instead, this research shows that a gentler approach can actually be more effective. I take heart in that.

False-belief task

Description of original task by its developers, Wimmer & Perner (1983):

Understanding of another person’s wrong belief requires explicit representation of the wrongness of this person’s belief in relation to one’s own knowledge. Three- to nine-year-old children’s understanding of two sketches was tested. In each sketch subjects observed how a protagonist put an object into a location x and then witnessed that in the absence of the protagonist the object was transferred from x to location y. Since this transfer came as a surprise they had to assume that the protagonist still believed that the object was in x. Subjects had to indicate where the protagonist will look for the object at his return.

Even as adults, we still make similar errors (whether in belief or behavior). Lesson: Just because you know something doesn’t mean others do (as in flawed perspective-taking). Or: Once you know something, it’s hard to imagine (yourself or others) not knowing it (as in expert blindspot).

Original paper: Wimmer, H., & Perner, J. (1983). Beliefs about beliefs: Representation and constraining function of wrong beliefs in young children’s understanding of deception. Cognition, 13(1), 103-128.

Interesting critique of the false-belief task:

Difficulties of accommodating discrepant information

On “The Wrong Stuff – Reasonable Doubt: Innocence Project Co-Founder Peter Neufeld on Being Wrong”:

I think generally speaking it’s difficult for people to admit they’re wrong, and the higher the stakes, the more difficult it becomes. So what you really want to do is educate people that it’s OK to be wrong. It doesn’t mean you’re a fool. It’s not going to be the end of your life.

There are high social costs to being wrong, and creating a culture that values thoughtfulness and humility rather than tenacity may alleviate this phenomenon. (Ironically, one might expect this to be worse in a collectivist culture, where there could be more shame, surprise, or negative attention attached to retracting publicly stated beliefs. In contrast, individualistic cultures that celebrate different ideas might be more tolerant of changing one’s mind.)

But I think there are high cognitive and metacognitive costs to being wrong as well. Part of it could be a consequence of generating a hypothesis or belief, akin to the dangers of convincing oneself of the correctness of a guess (e.g., when taking a pretest). The more a person articulates or mentally rehearses an idea, the more s/he becomes committed to it (i.e., strengthens the memory trace, elaborates on potential explanations, draws connections to prior knowledge).

Further, someone whose self-concept is strongly linked to having the right answers might feel more threatened by realizing s/he made an error. And someone who thinks that intelligence is knowing facts rather than exercising good reasoning would probably be more disturbed by having to acknowledge getting the facts wrong.

So what does this suggest? Perhaps we should encourage more tentativeness and skepticism, an appreciation of the probabilistic nature of knowledge, and comfort with staking cautious claims. Maybe we should ask people to propose multiple conditional hypotheses instead of single predictions. And most of all, we should put wrongness back in its place: linked to the idea, not the person.