Why Scientific Perceptions Persist Even with Facts & Teaching

There was a very interesting study entitled When Corrections Fail: The Persistence of Political Misperceptions by researchers Brendan Nyhan (School of Public Health, University of Michigan) and Jason Reifler (Department of Political Science, Georgia State University). This study, although in the realm of political behavior, has strong implications for science education, especially in the teaching of science-related social issues.

In their abstract (which follows), Nyhan and Reifler point out that even when individuals are provided with corrective information about a particular issue, some respondents actually increase their misperceptions:

An extensive literature addresses citizen ignorance, but very little research focuses on misperceptions. Can these false or unsubstantiated beliefs about politics be corrected? Previous studies have not tested the efficacy of corrections in a realistic format. We conducted four experiments in which subjects read mock news articles that included either a misleading claim from a politician, or a misleading claim and a correction. Results indicate that corrections frequently fail to reduce misperceptions among the targeted ideological group. We also document several instances of a “backfire effect” in which corrections actually increase misperceptions among the group in question.

In their study, the researchers reported results from two rounds of experiments investigating whether "corrective information embedded in realistic news reports succeeds in reducing misperceptions about contemporary politics." We, as science educators, know that students arrive in our classes with misconceptions, or, in this context, misperceptions. We also know that these initial ideas are deeply held by students, and that changing these misconceptions is very difficult to do.

In this study, the authors suggest that beliefs about controversial factual questions are closely linked to one’s ideological preferences or partisan beliefs.  One of the controversial factual questions that the authors studied was whether or not Iraq had Weapons of Mass Destruction (WMD) prior to the Iraq War.  In science education, a similar and controversial factual question is whether or not human activity has contributed to global warming.  I think the results reported by these researchers have implications for our understanding of why people do or do not accept a particular scientific theory.  Let’s take a look.

One of the key aspects of this study for me is the authors' discussion of why pre-existing beliefs are preserved even in the face of contrary information. The first mechanism that they shine a light on is that individuals may "engage in a biased search process, seeking out information that supports their preconceptions and avoiding evidence that undercuts their beliefs." A second mechanism is called the "backfire effect." In this case, individuals who receive unwelcome information may not simply resist challenges to their views; they may come to support their original opinion even more strongly.

In their study, the researchers investigated three hypotheses about how the effectiveness of corrections varies with participant ideology (liberal, centrist, conservative):

Hypothesis 1: An ideological interaction
The effect of corrections on misperceptions will be moderated by ideology.
Hypothesis 2a: Resistance to corrections
Corrections will fail to reduce misperceptions among the ideological subgroup that is likely to hold the misperception.
Hypothesis 2b: Correction backfire
In some cases, the interaction between corrections and ideology will be so strong that misperceptions will increase for the ideological subgroup in question.

The researchers conducted four experiments in which subjects read mock newspaper articles containing a statement from a political figure that reinforced a widespread misperception. Individuals were randomly assigned to read articles that either included or did not include corrective information immediately after a false or misleading statement. They then answered a series of factual and opinion questions.

Three areas from contemporary politics were investigated: the war in Iraq, tax cuts, and stem cell research. This brought more realism to the study than using hypothetical situations and questions. The Iraq war experiment focused on the risk of Saddam Hussein passing weapons, materials, or information to terrorist networks. Subjects read a news article that included remarks by President George W. Bush defending the Iraq war and asserting that there was a real risk that Saddam would pass on weapons or information. Some respondents were given a correction discussing the Duelfer Report, which documented the lack of Iraqi WMD or an active production program prior to the U.S. invasion. After reading the article, respondents were asked to state whether they agreed (on a five-point Likert scale ranging from "strongly disagree" (1) to "strongly agree" (5)) with the following statement:

Immediately before the U.S. invasion, Iraq had an active weapons of mass destruction program, the ability to produce these weapons, and large stockpiles of WMD, but Saddam Hussein was able to hide or destroy these weapons right before U.S. forces arrived.

In this 2005 experiment, the results supported the "backfire" hypothesis. For very liberal subjects, the corrective information made them more likely to disagree that Iraq had WMD. For liberal and centrist individuals, the corrective information had little effect. But for those who were to the right of center (ideologically conservative), the correction backfired: conservatives who received the corrective information that Iraq did not have WMD became more likely to believe that Iraq had WMD. One explanation was that conservatives tended to believe Bush rather than the media, thus producing the backfire effect.

However, in a similar 2006 experiment, the results did not support the backfire effect, and the researchers suggested that this may have been due to the shifting rationale for the war and a growing dissociation among Republicans from the claim that Iraq had WMD.

Nevertheless, the researchers conclude that their study supports the idea that citizens engage in motivated reasoning. Their experiments show that responses to corrections of factual beliefs about controversial issues vary systematically by ideology.

As the researchers point out, their study could be read as supporting the hypothesis that conservatives are especially dogmatic, but they also noted that liberals and Democrats likewise interpret factual information in ways that are consistent with their political worldviews.

There are many controversial questions in the realm of science education on which the study reviewed here bears. The most obvious is the question of human influence on global warming. The scientific consensus overwhelmingly supports the conclusion that global temperatures have increased alongside the rise of CO2 in the atmosphere, a rise associated with human activity. Yet, despite the convincing arguments of scientists, much of the public remains skeptical about global warming.

Helping students construct their ideas about the world around them will contribute to their ability to evaluate arguments about controversial questions. The constructivist science teacher assumes that students enter the class with pre-existing ideas and opinions, and is challenged to provide a learning environment in which students have an opportunity to identify their existing ideas and to explore new ideas in the context of their prior beliefs. Through dialog, challenge, projects, discussions, and research, teachers help students construct new ideas. This is the core of teaching, and it is the most challenging part. It seems easier to simply tell students, but we know that this approach has not worked.

Finally, we see the effect of partisan politics in how science fails to be used to solve many of our problems. All around us there is evidence that the climate is changing, most likely as a result of the rise in the Earth's temperature. Yet there are politicians who ignore the results of science, treat personal opinion ("global warming is a hoax") as fact, and dismiss any evidence to the contrary.


Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330. DOI: 10.1007/s11109-010-9112-2

About Jack Hassard

Jack Hassard is a writer, a former high school teacher, and Professor Emeritus of Science Education, Georgia State University.