
Experiments make a good breakfast, but a poor supper

Published online by Cambridge University Press:  13 May 2022

Jolanda Jetten
Affiliation:
School of Psychology, The University of Queensland, St. Lucia, 4072 QLD, Australia. j.jetten@psy.uq.edu.au; https://psychology.uq.edu.au/profile/2317/jolanda-jetten
Hema Preya Selvanathan
Affiliation:
School of Psychology, The University of Queensland, St. Lucia, 4072 QLD, Australia. h.selvanathan@uq.edu.au; https://psychology.uq.edu.au/profile/7410/hema-preya-selvanathan
Charlie R. Crimston
Affiliation:
School of Psychology, The University of Queensland, St. Lucia, 4072 QLD, Australia. c.crimston@uq.edu.au; https://psychology.uq.edu.au/profile/2698/charlie-crimston
Sarah V. Bentley
Affiliation:
School of Psychology, The University of Queensland, St. Lucia, 4072 QLD, Australia. s.bentley@uq.edu.au; https://psychology.uq.edu.au/profile/2536/sarah-bentley
S. Alexander Haslam
Affiliation:
School of Psychology, The University of Queensland, St. Lucia, 4072 QLD, Australia. a.haslam@uq.edu.au; https://psychology.uq.edu.au/profile/3181/alex-haslam

Abstract

Cesario's analysis has three key flaws. First, the focus on whether an effect is “real” (an “effects flaw”) overlooks the importance of theory testing. Second, obsession with effects (a “fetishization flaw”) sidelines theoretically informed questions about when and why an effect may arise. Third, failure to take stock of cultural and historical context (a “decontextualization flaw”) strips findings of meaning.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2022. Published by Cambridge University Press

Cesario provides a number of good reasons why we should be cautious about relying solely on experimental findings to understand the social world around us. While we welcome his focus on experimental validity (after years in which the field has attended more or less exclusively to problems of replication and reliability), his own analysis unfortunately falls foul of some of the problems it seeks to rectify. There are three specific flaws in his reasoning, and all three are commonly observed in researchers' understanding of what experiments are meant to do and how they should be used.

First, Cesario's analysis misunderstands the purpose of experiments. Their function is not to try as hard as possible to mimic aspects of the world outside the laboratory so that researchers can establish whether a given effect is observable in the world and hence “real” (e.g., whether or not police officers are racially biased). To imagine that they are is to fall prey to an “effects flaw” in which experimental outcomes are privileged over the processes that produce them.

Instead, experiments and the evidence they produce are better suited to the task of testing theories of human psychology and behaviour. They do this principally by helping us to understand under what conditions a given effect is observed, and what mechanisms underlie that effect. Indeed, by focusing on effects rather than processes, Cesario's analysis fails to capitalise on the key value of experiments – namely their capacity to support theory development (Haslam & McGarty, 2001; Swann & Jetten, 2017).

This “effects flaw” is not just present in Cesario's analysis, but is a pervasive problem in the social psychological literature. It is perhaps most apparent in reports of the classic studies in social psychology (e.g., Milgram's obedience studies and Zimbardo's Stanford Prison Experiment; see Smith & Haslam, 2017). For instance, because of the “effects flaw,” the contribution of Milgram's obedience studies is routinely misunderstood. For the real theoretical value of the work can be seen to lie less in the 65% obedience rate observed in the so-called “baseline condition” (the classic effect reported in most textbooks) than in the many variants that Milgram conducted to explore the conditions under which obedience is either far greater or far weaker (see Jetten & Mols, 2014; Reicher, Haslam, & Smith, 2012). To be sure, experimental effects can capture our attention and make the case for much-needed theory development, but without a theoretical focus and grounding, their contribution is unproductively circumscribed.

Second, while we agree that, on its own, experimental evidence is of limited use, we argue that what is needed is a proper analysis of how experimental evidence should be complemented with other forms of evidence. In our view, experimental evidence should never be considered in isolation, but always in conjunction with data sourced using complementary methods (e.g., field surveys, longitudinal research, and qualitative work). What is more, theory-derived hypotheses need to be examined in a range of different contexts. Unfortunately, though, experimental evidence is too often seen as the “gold (and only) standard” for our field, with evidence gleaned via other means relegated to the margins.

This prioritization of experimental effects contributes to a “fetishization flaw” associated with what Reicher (2000) refers to as methodolatry. As a result, there is little incentive for researchers to move out of the lab, and once an “effect” is established within a controlled laboratory setting, it hardly ever leaves it. The experimental paradigm therefore becomes equated with the phenomenon itself. This exacerbates the consequences of the first flaw by cultivating an obsession with (the replication of) experimental effects and an attendant neglect of broader questions of process. In short, questions of “when” and “why” are crowded out by questions of “whether” and “how much” in ways that stymie and suppress theory development and the deep understanding that accompanies it. As the replication crisis of recent years attests, this narrowing of the field has not served social psychology well.

Third, alongside these issues, a “decontextualization flaw” means that researchers typically use experiments for hypothetico-deductive purposes in a quest to discover “objective truth.” This epistemology generally assumes value neutrality and context independence, and it tends to catalogue psychological effects with scant regard to the broader historical and societal contexts in which they arise (Adams, Estrada-Villalta, Sullivan, & Markus, 2019).

In crucial ways, this has led to the disappearance of the “social” in social psychology (see Greenwood, 2003). For it is important to remember that the underlying causes and nature of systemic issues such as discrimination and inequality cannot be reduced to (or sufficiently captured within) experiments alone. Rather, these realities – and the questions they raise – need to be explored within the worlds that give rise to them (Oishi & Graham, 2010; Trawalter, Bart-Plange, & Hoffman, 2020). Here, qualitative methods are often particularly valuable by virtue of their inductive, reflexive, and phenomenological potential. Critically too, these alternative (and complementary) methodologies are better able to capture the meaning of data in situ and prioritize community participation in the co-creation of knowledge – something which is all too often missing in experimental research (Burman, 1997).

In sum, as with a good breakfast, experiments are an excellent point of departure. But on their own, they can never be enough to satisfy our scientific appetites. For their scientific potential to be fulfilled, their contributions need to be consolidated with meaningful theory development and complementary methodologies. Lacking this, not only will our diet be unbalanced, but it will also be profoundly unsatisfying – and potentially harmful.

Financial support

This research was supported by an Australian Research Council Laureate Fellowship (FL180100094) awarded to Jolanda Jetten.

Conflict of interest

The authors have no conflicts of interest to report in relation to this commentary.

References

Adams, G., Estrada-Villalta, S., Sullivan, D., & Markus, H. R. (2019). The psychology of neoliberalism and the neoliberalism of psychology. Journal of Social Issues, 75, 189–216.
Burman, E. (1997). Minding the gap: Positivism, psychology, and the politics of qualitative methods. Journal of Social Issues, 53(4), 785–801.
Greenwood, J. D. (2003). The disappearance of the social in American social psychology. Cambridge University Press.
Haslam, S. A., & McGarty, C. (2001). A hundred years of certitude? Social psychology, the experimental method and the management of scientific uncertainty. British Journal of Social Psychology, 40, 1–21. doi: 10.1348/014466601164669
Jetten, J., & Mols, F. (2014). 50–50 Hindsight: Appreciating anew the contributions of Milgram's obedience experiments. Journal of Social Issues, 70, 587–602.
Oishi, S., & Graham, J. (2010). Social ecology: Lost and found in psychological science. Perspectives on Psychological Science, 5(4), 356–377.
Reicher, S. D. (2000). Against methodolatry. British Journal of Clinical Psychology, 39(1), 1–6.
Reicher, S. D., Haslam, S. A., & Smith, J. R. (2012). Working towards the experimenter: Reconceptualizing obedience within the Milgram paradigm as identification-based followership. Perspectives on Psychological Science, 7, 315–324. doi: 10.1177/1745691612448482
Smith, J. R., & Haslam, S. A. (Eds.) (2017). Social psychology: Revisiting the classic studies (2nd ed.). Sage.
Swann, W. B. Jr., & Jetten, J. (2017). Restoring agency to the human actor. Perspectives on Psychological Science, 12, 382–399.
Trawalter, S., Bart-Plange, D. J., & Hoffman, K. M. (2020). A socioecological psychology of racism: Making structures and history more visible. Current Opinion in Psychology, 32, 47–51.