The Debunking Handbook (5/6): Filling the gap with an alternative explanation

[Cook, J., Lewandowsky, S. (2011), The Debunking Handbook. St. Lucia, Australia: University of Queensland. November 5. ISBN 978-0-646-56812-6.]

Filling the gap with an alternative explanation

Assuming you successfully negotiate the various backfire effects, what is the most effective way to debunk a myth? The challenge is that once misinformation gets into a person’s mind, it’s very difficult to remove. This is the case even when people remember and accept a correction.


This was demonstrated in an experiment in which people read a fictitious account of a warehouse fire.[15],[16],[3] Mention was made of paint and gas cans along with explosions. Later in the story, it was clarified that paint and cans were not present at the fire. Even when people remembered and accepted this correction, they still cited the paint or cans when asked questions about the fire. When asked, “Why do you think there was so much smoke?”, people routinely invoked the oil paint despite having just acknowledged it as not being present.


When people hear misinformation, they build a mental model, with the myth providing an explanation. When the myth is debunked, a gap is left in their mental model. To deal with this dilemma, people prefer an incorrect model over an incomplete model. In the absence of a better explanation, they opt for the wrong explanation.[17]


In the warehouse fire experiment, when an alternative explanation involving lighter fluid and accelerant was provided, people were less likely to cite the paint and gas cans when queried about the fire. The most effective way to reduce the effect of misinformation is to provide an alternative explanation for the events covered by the misinformation.



This strategy is illustrated particularly clearly in fictional murder trials. Accusing an alternative suspect greatly reduced the number of guilty verdicts from participants who acted as jurors, compared to defences that merely explained why the defendant wasn’t guilty.[18]


For the alternative to be accepted, it must be plausible and explain all observed features of the event.[19],[15] When you debunk a myth, you create a gap in the person’s mind. To be effective, your debunking must fill that gap.


One gap that may require filling is explaining why the myth is wrong. This can be achieved by exposing the rhetorical techniques used to misinform. A handy reference of techniques common to many movements that deny a scientific consensus is found in Denialism: what is it and how should scientists respond?[20] The techniques include cherry picking, conspiracy theories and fake experts. Another alternative narrative might be to explain why the misinformer promoted the myth. Arousing suspicion of the source of misinformation has been shown to further reduce the influence of misinformation.[21],[22]


Another key element to effective rebuttal is using an explicit warning (“watch out, you might be misled”) before mentioning the myth. Experimentation with different rebuttal structures found the most effective combination included an alternative explanation and an explicit warning.[17]


Graphics are also an important part of the debunker’s toolbox and are significantly more effective than text in reducing misconceptions. When people read a refutation that conflicts with their beliefs, they seize on ambiguities to construct an alternative interpretation. Graphics provide more clarity and less opportunity for misinterpretation. When self-identified Republicans were surveyed about their global warming beliefs, a significantly greater number accepted global warming when shown a graph of temperature trends compared to those who were given a written description.[13]


Another survey found that when shown data points representing surface temperature, people correctly judged a warming trend irrespective of their views towards global warming.[23] If your content can be expressed visually, always opt for a graphic in your debunking.




13. Nyhan, B., & Reifler, J. (2011). Opening the Political Mind? The effects of self-affirmation and graphical information on factual misperceptions. In press.
14. Hardisty, D. J., Johnson, E. J., & Weber, E. U. (2010). A Dirty Word or a Dirty World? Attribute Framing, Political Affiliation, and Query Theory. Psychological Science, 21, 86-92.
15. Seifert, C. M. (2002). The continued influence of misinformation in memory: What makes a correction effective? The Psychology of Learning and Motivation, 41, 265-292.
16. Wilkes, A. L.; Leatherbarrow, M. (1988). Editing episodic memory following the identification of error, The Quarterly Journal of Experimental Psychology A: Human Experimental Psychology, 40A, 361-387.
17. Ecker, U. K., Lewandowsky, S., & Tang, D. T. (2010). Explicit warnings reduce but do not eliminate the continued influence of misinformation. Memory & Cognition, 38, 1087-1100.
18. Tenney, E. R., Cleary, H. M., & Spellman, B. A. (2009). Unpacking the doubt in “Beyond a reasonable doubt:” Plausible alternative stories increase not guilty verdicts. Basic and Applied Social Psychology, 31, 1-8.
19. Rapp, D. N., & Kendeou, P. (2007). Revising what readers know: Updating text representations during narrative comprehension. Memory & Cognition, 35, 2019-2032.
20. Diethelm, P., & McKee, M. (2009). Denialism: what is it and how should scientists respond? European Journal of Public Health, 19, 2-4.
21. Lewandowsky, S., Stritzke, W. G., Oberauer, K., & Morales, M. (2005). Memory for fact, fiction and misinformation: The Iraq War 2003. Psychological Science, 16, 190-195.
22. Lewandowsky, S., Stritzke, W. G. K., Oberauer, K., & Morales, M. (2009). Misinformation and the ‘War on Terror’: When memory turns fiction into fact. In W. G. K. Stritzke, S. Lewandowsky, D. Denemark, J. Clare, & F. Morgan (Eds.), Terrorism and torture: An interdisciplinary perspective (pp. 179-203). Cambridge, UK: Cambridge University Press.
23. Lewandowsky, S. (2011). Popular consensus: Climate change set to continue. Psychological Science, 22, 460-463.


posted by Kumicit at 2011/12/04 10:26
