
Displacing Misinformation about Events: An Experimental Test of Causal Corrections

Published online by Cambridge University Press: 01 April 2015

Brendan Nyhan
Affiliation:
Department of Government, Dartmouth College, Hanover, NH, USA; e-mail: nyhan@dartmouth.edu
Jason Reifler
Affiliation:
Department of Politics, University of Exeter, Exeter, Devon, United Kingdom; e-mail: J.Reifler@exeter.ac.uk

Abstract

Misinformation can be very difficult to correct and may have lasting effects even after it is discredited. One reason for this persistence is the manner in which people make causal inferences based on available information about a given event or outcome. As a result, false information may continue to influence beliefs and attitudes even after being debunked if it is not replaced by an alternate causal explanation. We test this hypothesis using an experimental paradigm adapted from the psychology literature on the continued influence effect and find that a causal explanation for an unexplained event is significantly more effective than a denial even when the denial is backed by unusually strong evidence. This result has significant implications for how to most effectively counter misinformation about controversial political events and outcomes.

Type
Research Article
Copyright
Copyright © The Experimental Research Section of the American Political Science Association 2015 

INTRODUCTION

Misinformation can be very difficult to correct (e.g., Lewandowsky et al. 2012; Nyhan and Reifler 2010, 2012). In particular, even after these false or unsupported claims are discredited, they may continue to affect the beliefs and attitudes of those who were exposed to them – a phenomenon often referred to as belief perseverance (e.g., Bullock 2007; Cobb et al. 2013; Ross et al. 1975) or the continued influence effect (e.g., Johnson and Seifert 1994; Wilkes and Leatherbarrow 1988). One reason for this persistence is the manner in which people automatically make causal inferences from the information they have at hand. As a result, false information may persist and continue to influence judgments even after it has been debunked if it is not displaced by an alternate causal account.

In the canonical paradigm of the continued influence effect literature (Wilkes and Leatherbarrow 1988), subjects are told one piece of information at a time about an unfolding fictional event (typically, a fire in a warehouse). In stylized form, participants in a study within this paradigm are told that there was a fire in a warehouse and that there were flammable chemicals in the warehouse that were improperly stored. When hearing these pieces of information in succession, people typically make a causal link between the two facts and infer that the fire was caused in some way by the flammable chemicals. Some subjects are then told that there were no flammable chemicals in the warehouse. Subjects who received this corrective information may correctly answer that there were no flammable chemicals in the warehouse while still believing that flammable chemicals caused the fire. This seeming contradiction can be explained by the fact that people update the factual information about the presence of flammable chemicals without also updating the causal inferences that followed from the incorrect information they initially received. Johnson and Seifert (1994) build on the findings of Wilkes and Leatherbarrow (1988) by showing that the incorrect link between flammable chemicals and the fire can be displaced when an alternative cause (arson) is provided for the fire – a finding that was replicated by Ecker et al. (2011) in a plane crash scenario.

In this study, we adapt the design and theoretical approach of the continued influence paradigm to examine the role of causal inferences in political misperceptions. Our approach is novel within the literature on political misinformation in that it examines new misperceptions rather than those that are already widely held (e.g., Nyhan and Reifler 2010). Our design thus represents both change (examining political misperceptions as they are formed) and continuity (the use of a well-developed experimental paradigm from psychology).

Specifically, we examine how a particular type of information (a causal correction) can help prevent new misperceptions from taking root in the context of a realistic fast-breaking political news story that includes speculative and unconfirmed reports. These sorts of early reports – which frequently contain false information and speculation – can create long-standing misperceptions about the causes of an event. For instance, Democratic Rep. Gary Condit was initially blamed for the death of Chandra Levy, an intern with whom he had a relationship, though another man was later convicted of her murder (Associated Press 2002; Barakat 2010). In other cases, widely publicized accusations can damage the reputation of political figures who are accused of misconduct based on evidence that is later discredited. For instance, former Senator Ted Stevens was defeated in his 2008 re-election campaign after being convicted on bribery charges that were later overturned due to prosecutorial misconduct. Similarly, former U.S. Department of Agriculture official Shirley Sherrod was asked to resign due to a misleading video clip posted online, forcing the White House to apologize and prompting a defamation lawsuit against the blogger who posted the video (Oliphant 2011; Tapper and Kahn 2010).

The breaking news event in our experiment is the sudden and unexpected resignation of a politician. Not surprisingly, we find that respondents exposed to initial innuendo linking the resignation to an ongoing scandal investigation view the resigning politician less favorably than a control group. More importantly, we show that even providing evidence against the innuendo is not enough to undo its damage to respondents’ view of the politician. By contrast, adding an alternate causal explanation for why the politician resigned is significantly more effective in limiting acceptance of misperceptions – a result that has significant implications for how to most effectively counter misinformation about controversial events and outcomes.

METHODS

YouGov collected data for this study from October 13, 2012 to October 16, 2012 as a part of the 2012 Cooperative Campaign Analysis Project. Respondents were matched and weighted to population targets using the firm's sample matching procedure, which is designed to approximate a nationally representative sample (Rivers N.d.). The demographic characteristics of the 1,000 respondents in our final sample are thus approximately representative of the U.S. population (details available upon request). In particular, 35% identified as Democrats and 27% as Republicans.[1]

Participants in our study read a series of short news items, which we call “dispatches,” about a fictional state senator in Alaska named Don Swensen. They were randomly assigned to one of four conditions that differed in the number and content of dispatches read. Only one dispatch was displayed at a time. To maximize the precision of our treatment effect estimates, respondents were block-randomized to conditions based on their response to an American National Election Studies question about the extent to which “the people running the government” are crooked (e.g., Duflo et al. 2007; Moore 2012).[2] Given that our study concerned innuendo about political misconduct, we used this procedure to ensure that our sample was balanced on prior beliefs about the prevalence of unethical behavior among public officials rather than relying on random assignment alone, which still risks imbalance on a potentially important confounding variable.[3]
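To make the blocked assignment procedure concrete, the sketch below illustrates one way such a design can be implemented. It is a minimal illustration under our own assumptions (a column named "crooked" for the ANES-style item, four condition labels, and near-equal allocation within blocks); it is not the authors' code.

import numpy as np
import pandas as pd

CONDITIONS = ["control", "innuendo", "denial", "causal_correction"]

def block_randomize(df: pd.DataFrame, block_col: str, seed: int = 42) -> pd.DataFrame:
    # Shuffle respondents within each block (each level of the pre-treatment item)
    # and deal out conditions in a repeating sequence so that every block contains
    # a near-equal number of respondents per condition.
    rng = np.random.default_rng(seed)
    out = df.copy()
    out["condition"] = ""
    for _, idx in out.groupby(block_col).groups.items():
        shuffled = rng.permutation(np.asarray(idx))
        reps = -(-len(shuffled) // len(CONDITIONS))  # ceiling division
        labels = np.tile(CONDITIONS, reps)[: len(shuffled)]
        out.loc[shuffled, "condition"] = labels
    return out

# Hypothetical usage: 'crooked' holds responses to the blocking question.
respondents = pd.DataFrame({"crooked": np.random.default_rng(1).integers(1, 4, size=1000)})
assigned = block_randomize(respondents, block_col="crooked")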

Table 1 presents the exact wording of the dispatches that subjects received in each condition. Each dispatch included the date and time of its ostensible publication and was presented on a separate page for respondents to read. After reading a dispatch, respondents proceeded to the next one (moving downward in the table), continuing in chronological order until they had completed all of the dispatches available in their condition.

Table 1 Experimental Stimuli

Note: Stimulus materials were shown sequentially to experimental participants. Respondents read one dispatch per page, proceeding in chronological order until they had completed all of the dispatches available in their condition. Each dispatch included the ostensible time and date of publication listed above as well as the text of the dispatch.

In the control condition, subjects received information that Swensen resigned from office without any indication why he resigned. In the innuendo condition, subjects received news of the resignation as well as information suggesting that Swensen resigned because of a bribery investigation. Exposure to suggestive questions or claims has previously been shown to have damaging effects (Wegner et al. 1981). The denial condition is the same as the innuendo condition except that it also includes a dispatch in which Swensen denies the bribery allegation and provides a letter from prosecutors stating that he is not under investigation (a credible form of evidence that is often used to defend political figures facing allegations of misconduct – see, e.g., Associated Press 2013; Baquet and Gerth 1992; Lizza 2014; Maddux 2004). Finally, the causal correction condition is the same as the denial except that the dispatch includes additional information providing an explanation for his resignation[4] – he had been named president of a local university but could not disclose his appointment until his predecessor stepped down.[5]

After respondents completed unscramble and word search tasks intended to clear working memory, we asked them to report their beliefs and attitudes toward Swensen on two outcome measures. Respondents in each condition were asked their opinion of Swensen on a six-point scale from “very unfavorable” (1) to “very favorable” (6) and whether they believe it is likely that he “accepted bribes or engaged in other illegal practices” on a five-point scale from “not at all likely” (1) to “extremely likely” (5). In addition, respondents who were exposed to the innuendo (i.e., those not assigned to the control condition) were also asked how likely they think it is that he “is resigning from office because he is under investigation for bribery” on a four-point scale from “not likely at all” (1) to “very likely” (4). (Question wording is provided in the online appendix.)

RESULTS

Table 2 presents OLS models estimating the effect of the innuendo, denial, and causal correction treatments on respondents’ opinion of Swensen and their beliefs about whether he took bribes or broke the law.[6] We also estimate the effect of the denial and causal correction treatments on respondents’ belief that Swensen resigned due to the investigation. Because this question was only asked of respondents who were exposed to the rumor about his resignation, respondents in the innuendo condition serve as controls in these models. As the table indicates, each model was estimated twice: once using inverse probability weights that account for varying probabilities of treatment due to block randomization (Gerber and Green 2012: 117), and once using YouGov survey weights intended to help ensure that the data approximate a national probability sample.[7]

Table 2 OLS Models of Experimental Results

Note: OLS models estimated with survey weights provided by YouGov or inverse probability weights accounting for block randomization; standard errors in parentheses (** p < 0.01, * p < 0.05). “Favorable” measures favorability toward the fictional politician in the experiment on a six-point scale from “very unfavorable” (1) to “very favorable” (6). “Bribes” measures belief that the politician “accepted bribes or engaged in other illegal practices” on a five-point scale from “not at all likely” (1) to “extremely likely” (5). “Investigation” measures belief that the politician “is resigning from office because he is under investigation for bribery” on a four-point scale from “not likely at all” (1) to “very likely” (4). The reference category (excluded condition) for Models 1–4 is the control condition, whereas it is the innuendo condition for Models 5 and 6. The experimental design, question wording, and pairwise comparisons of the statistical significance of differences in means between conditions are provided in the online appendix.
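As a rough illustration of how models like these can be fit, the following sketch estimates a weighted least squares regression of an outcome on condition indicators with the control group as the reference category. The variable names ('favorable', 'condition', 'ipw') and the robust-standard-error choice are our assumptions for illustration, not the authors' exact specification.

import statsmodels.formula.api as smf

def fit_treatment_model(df, outcome: str, weight_col: str):
    # Regress the outcome on condition dummies (control as the omitted category),
    # weighting each respondent by the supplied inverse probability or survey weight.
    formula = f"{outcome} ~ C(condition, Treatment(reference='control'))"
    return smf.wls(formula, data=df, weights=df[weight_col]).fit(cov_type="HC2")

# Hypothetical analogue of Model 1 (favorability, inverse probability weights):
# results = fit_treatment_model(survey_data, outcome="favorable", weight_col="ipw")
# print(results.summary())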

As Models 1 and 2 in Table 2 show, exposure to innuendo significantly reduced how favorably respondents viewed Swensen (−0.58 and −0.52, respectively; p < 0.01). The denial failed to completely repair the damage. Despite being told of credible evidence against the innuendo (a letter from prosecutors stating he is not under investigation), respondents still viewed Swensen significantly less favorably than controls (−0.39 and −0.41, p < 0.01).[8] By contrast, the causal correction significantly increased favorability relative to respondents in the innuendo condition (0.66, p < 0.01 using results from Model 1; 0.32, p < 0.05 using Model 2).[9] As a result, we cannot reject the null hypothesis of no difference in how favorably respondents view Swensen between the control and causal correction conditions. Figure 1 illustrates these effects using results from Model 1.[10]

Mean favorability toward the fictional politician and 95% confidence intervals by experimental condition (estimated using inverse probability weights). Favorability was measured on a six-point scale from “very unfavorable” (1) to “very favorable” (6). The experimental design, question wording, and pairwise comparisons of the statistical significance of differences in means between conditions are provided in the online appendix.

Figure 1 Politician Favorability

Given our concerns about factual misperceptions, we are especially interested in the effects of our treatments on whether respondents believed Swensen took bribes or committed other crimes. These treatment effect estimates are provided in Models 3 and 4 in Table 2. We again find significant damage incurred by the innuendo, which in this case makes respondents much more likely to believe Swensen engaged in illegal activity (0.78 and 0.67, respectively; p < 0.01). The denial again did not prevent the innuendo from damaging his reputation. Respondents who received Swensen's denial were still more likely than controls to believe that he had broken the law (0.55 and 0.52, p < 0.01).[11] Most importantly, the causal correction was successful at offsetting the damaging effects of the innuendo, significantly reducing belief that Swensen took bribes or committed other crimes relative to the innuendo (−0.72 and −0.51 using results from Models 3 and 4, respectively; p < 0.01) and denial (−0.48, p < 0.01 using results from Model 3; −0.36, p < 0.05 using Model 4). We thus cannot reject the null hypothesis of no difference in belief in Swensen breaking the law between the control and causal correction conditions. These effects are illustrated in Figure 2 using the results in Model 3.

Mean belief that the fictional politician “accepted bribes or engaged in other illegal practices” by experimental condition and 95% confidence intervals (estimated using inverse probability weights). Belief in this claim was measured on a five-point scale from “not at all likely” (1) to “extremely likely” (5). The experimental design, question wording, and pairwise comparisons of the statistical significance of differences in means between conditions are provided in the online appendix.

Figure 2 Likelihood of Bribery or Other Crimes

Finally, Models 5 and 6 in Table 2 directly test the effectiveness of the denial and causal correction in reducing belief in the investigation rumor, the specific content of the innuendo. Because respondents in the control condition were not asked about their belief in the rumor, treatment effects are estimated relative to respondents in the innuendo condition. Models 5 and 6 both indicate that the denial and causal correction treatments significantly reduced belief in the bribery rumor (denial: −0.25 and −0.26, respectively, p < 0.01; causal correction: −0.58 and −0.42, p < 0.01). The point estimates suggest that the causal correction was more effective, though the difference did not quite reach significance in the survey weights model (−0.34, p < 0.01 using results from Model 5; −0.16, p < 0.12 using Model 6). Figure 3 illustrates the results from Model 5.

Mean belief that the fictional politician “is resigning from office because he is under investigation for bribery” and 95% confidence intervals by experimental condition (estimated using inverse probability weights). Belief in this claim was measured on a four-point scale from “not likely at all” (1) to “very likely” (4). The experimental design, question wording, and pairwise comparisons of the statistical significance of differences in means between conditions are provided in the online appendix.

Figure 3 Likelihood that the Innuendo is True

CONCLUSION

Our results provide further evidence that corrections of misinformation are frequently ineffective (e.g., Nyhan and Reifler 2010, 2012; Nyhan et al. 2013). In particular, a denial failed to fully undo the damage to the fictional politician's reputation caused by exposure to innuendo despite being backed by evidence (the letter from prosecutors). However, there is reason for optimism. By providing another explanation for the event in question (the resignation), the causal correction was able to reverse the damage from the innuendo, suggesting that it was necessary to displace the original attribution of the event to the investigation with an alternate account.

This study makes several important contributions. First, our results demonstrate that findings from the psychology literature on the continued influence effect apply to politics. Simply telling participants that an initial account about the cause of a political event is false does not undo the effects of that misinformation; it is necessary to provide an alternate causal explanation that displaces inferences made from the false information in order to prevent it from continuing to affect respondents’ beliefs and attitudes. Second, we successfully adapt the paradigm of a sequence of short dispatches from the psychology literature on the continued influence effect to the political domain, mimicking the flow of information about breaking news, which frequently contains incorrect claims that are later corrected. Third, we show how citizens are easily influenced by innuendo, making false inferences that are difficult to later correct.

Based on these findings, we suggest several directions for future research. As noted above, our study is novel in examining the formation of political misperceptions and in suggesting a new approach to reducing them. Of course, it is important to be clear that we have not solved the problem of misperceptions. Even respondents in the causal correction condition believe, on average, that it is “somewhat likely” that the politician in our experiment resigned due to a bribery investigation. However, given the difficulty of correcting existing misperceptions, understanding how to reduce the likelihood that misperceptions will initially take hold seems especially important. In addition, while previous research has emphasized the role of motivated reasoning in the formation and maintenance of misperceptions (e.g., Nyhan and Reifler 2010; Nyhan et al. 2013), this design demonstrates how limitations of human memory and inference can also contribute to false beliefs. Future research should seek to determine if these results hold when partisanship or ideological factors are incorporated into the experiment. Finally, the study employs a novel breaking news paradigm in which the innuendo is never definitively disproved. While both the design and the lack of definitive proof against the claim are realistic representations of many situations, the extent to which the findings of the study generalize is unknown. Both features should make correcting misperceptions more difficult, suggesting that the findings would hold in a more conventional design or in a case in which the misinformation were disproved, but these expectations should be evaluated empirically.

Ultimately, we believe that our results suggest the potential value of causal corrections for countering misperceptions about the causes of events. Journalists should seek to use them whenever possible. When they cannot be used, it is especially important for the media to avoid reporting rumors or unverified information, which can have lasting – and damaging – effects.

SUPPLEMENTARY MATERIAL

For supplementary material for this article, please visit Cambridge Journals Online http://dx.doi.org/10.1017/XPS.2014.22.

Footnotes

1 A Pew poll conducted by telephone at nearly the same time (October 12–14, 2012) reported a sample that was 33% Democrat and 21% Republican (Pew 2012).

2 This question was asked before the experiment so that it could be used as a pre-treatment moderator in the subsequent analysis. It is possible that the question could have primed respondents to think of politicians as crooked and thus increased baseline perceptions of misconduct. However, any such effect is orthogonal to the block-randomized treatment assignment, which ensures that respondents are distributed evenly across conditions at each level of expressed belief that politicians are crooked. Moreover, we would expect any such effect to make it more difficult for the causal correction to reduce such beliefs, which would work against our hypothesis.

3 The distributions of several key demographic characteristics across experimental conditions are presented in the online appendix. None of the distributions are significantly different from what we would expect due to chance, suggesting that the randomization was successful.

4 The causal correction was provided in addition to the denial rather than in place of it in order to test whether it was more effective than a denial alone. While this design choice means that the dispatch participants read in the causal correction condition was two sentences longer than the one in the denial condition, almost any imaginable causal correction would include a denial of the false claim. This design facilitates a clean test of the added corrective value provided by the alternate causal explanation while ruling out unintended interpretations (e.g., that Swensen was switching jobs because of the investigation).

5 In the context of the study, the claim that Swensen resigned due to an investigation is never definitively disproved, but it is a misperception according to the definition in Nyhan and Reifler (2010), which defines misperceptions as “cases in which people's beliefs about factual matters are not supported by clear evidence and expert opinion.” We believe this is a realistic representation of many real-world situations in which unsubstantiated accusations are made that lack adequate supporting evidence but cannot be ruled out with perfect certainty. As we discuss in the conclusion, however, future research should investigate cases in which the claim in question is disproved. Future research should also consider whether these effects vary depending on the valence of the causal explanation. (In this case, being named university president might affect how favorably respondents view Swensen, though it is unclear why it would affect their beliefs in the likelihood of him engaging in illegal activities or being under investigation.)

6 The results are substantively identical if we estimate the models using ordered probit (see online appendix).

7 The assignment probabilities were identical a priori for all respondents, but it is necessary to account for fluctuations in the proportions assigned to each condition across blocks due to random variation in order to obtain an unbiased treatment effect estimate (Gerber and Green 2012: 117). Estimated treatment effects are substantively identical when the models are estimated without weights of any kind (results are provided in the online appendix).
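A minimal sketch of how such inverse probability weights can be constructed from the realized assignments: each respondent is weighted by the inverse of the share of their block assigned to the condition they actually received. Column names follow the earlier sketches and are our assumptions, not the authors' code.

import pandas as pd

def add_ipw(df: pd.DataFrame, block_col: str = "crooked", cond_col: str = "condition") -> pd.DataFrame:
    # weight_i = 1 / Pr(condition_i | block_i), estimated from realized proportions:
    # (block size) / (respondents in the block assigned to that condition)
    out = df.copy()
    block_n = out.groupby(block_col)[cond_col].transform("size")
    cell_n = out.groupby([block_col, cond_col])[cond_col].transform("size")
    out["ipw"] = block_n / cell_n
    return out

# Hypothetical usage, continuing from the assignment sketch above:
# survey_data = add_ipw(assigned)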

8 The marginal effect of the denial relative to the innuendo was significant at the p < 0.10 level using inverse probability weights (Model 1) but not using survey weights (Model 2).

9 When compared directly, respondents in the causal correction condition viewed Swensen significantly more favorably than those in the denial condition using inverse probability weights (p < 0.01), but the difference was not quite significant using survey weights (p < 0.12). The full set of pairwise comparisons between the experimental conditions for each dependent variable using the inverse probability weight models in Table 2 is provided in the online appendix.

10 The estimated treatment effects do not appear to be the result of differing levels of recall about the event. Specifically, we cannot reject the null hypothesis of no difference in response accuracy on a three-question battery between the innuendo, denial, and causal correction conditions (results are provided in the online appendix).

11 The marginal effect of the denial relative to the innuendo on beliefs about illegal activity was significant at the p < 0.05 level using inverse probability weights (Model 3) but not survey weights (Model 4).

REFERENCES

Associated Press. 2002. “Gary Condit Loses Primary to Former Protege Cardoza.” Grand Rapids Press (March 6): A2.
Associated Press. 2013. “Lawmaker Won't Face Contribution Probe.” Monterey County Herald (December 30). (http://www.montereyherald.com/general-news/20131230/lawmaker-wont-face-contribution-probe), accessed December 5, 2014.
Baquet, D., and Gerth, J. 1992. “Lawmaker's Defense of B.C.C.I. Went Beyond Speech in Senate.” New York Times (August 26, 1992).
Barakat, M. 2010. “Chandra Levy Verdict: Suspect Found Guilty in 2001 Death of DC Intern.” Associated Press (November 22, 2010).
Bullock, J. 2007. “Experiments on Partisanship and Public Opinion: Party Cues, False Beliefs, and Bayesian Updating.” Ph.D. Dissertation, Stanford University.
Cobb, M. D., Nyhan, B., and Reifler, J. 2013. “Beliefs Don't Always Persevere: How Political Figures Are Punished When Positive Information About Them Is Discredited.” Political Psychology 34 (3): 307–326.
Duflo, E., Glennerster, R., and Kremer, M. 2007. “Using Randomization in Development Economics Research: A Toolkit.” Handbook of Development Economics 4: 3895–3962.
Ecker, U. K. H., Lewandowsky, S., and Apai, J. 2011. “Terrorists Brought Down the Plane!—No, Actually it Was a Technical Fault: Processing Corrections of Emotive Information.” Quarterly Journal of Experimental Psychology 64 (2): 283–310.
Gerber, A. S., and Green, D. P. 2012. Field Experiments: Design, Analysis, and Interpretation. New York: W. W. Norton & Company.
Johnson, H., and Seifert, C. 1994. “Sources of the Continued Influence Effect: When Misinformation in Memory Affects Later Inferences.” Journal of Experimental Psychology: Learning, Memory, and Cognition 20 (6): 1420–1436.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., and Cook, J. 2012. “Misinformation and Its Correction: Continued Influence and Successful Debiasing.” Psychological Science in the Public Interest 13 (3): 106–131.
Lizza, R. 2014. “Crossing Christie.” The New Yorker (April 14): 40–51.
Maddux, M. 2004. “McGreevey Aide Cleared in Parole; Had No Role in Mobster's Release.” The Record (March 2): A03.
Moore, R. T. 2012. “Multivariate Continuous Blocking to Improve Political Science Experiments.” Political Analysis 20 (4): 460–479.
Nyhan, B., and Reifler, J. 2010. “When Corrections Fail: The Persistence of Political Misperceptions.” Political Behavior 32 (2): 303–330.
Nyhan, B., and Reifler, J. 2012. “Misinformation and Fact-Checking: Research Findings from Social Science.” New America Foundation Media Policy Initiative Research Paper.
Nyhan, B., Reifler, J., and Ubel, P. 2013. “The Hazards of Correcting Myths About Health Care Reform.” Medical Care 51 (2): 127–132.
Oliphant, J. 2011. “Shirley Sherrod Suing Breitbart.” Los Angeles Times (February 15): A13.
Pew Research Center for the People & the Press. 2012. “On Eve of Foreign Debate, Growing Pessimism about Arab Spring Aftermath.” Poll conducted October 12–14, 2012. (http://www.people-press.org/files/legacy-questionnaires/10-18-12%20Foreign%20topline%20for%20release.pdf), accessed March 24, 2014.
Rivers, D. N.d. “Sample Matching: Representative Sampling from Internet Panels.” Unpublished manuscript. (http://psfaculty.ucdavis.edu/bsjjones/rivers.pdf), accessed March 21, 2014.
Ross, L., Lepper, M., and Hubbard, M. 1975. “Perseverance in Self-Perception and Social Perception: Biased Attributional Processes in the Debriefing Paradigm.” Journal of Personality and Social Psychology 32: 880–892.
Tapper, J., and Kahn, H. 2010. “White House Apologizes to Shirley Sherrod, Ag Secretary Offers Her New Job.” ABCNews.com (July 21, 2010).
Wegner, D. M., Wenzlaff, R., Kerker, R. M., and Beattie, A. E. 1981. “Incrimination Through Innuendo: Can Media Questions Become Public Answers?” Journal of Personality and Social Psychology 40 (5): 822–832.
Wilkes, A. L., and Leatherbarrow, M. 1988. “Editing Episodic Memory Following the Identification of Error.” The Quarterly Journal of Experimental Psychology 40A (2): 361–387.