Yarkoni stresses that a mismatch between intended verbal theories and hypotheses and current quantitative methods is a primary driver of the generalizability crisis in psychology. Yarkoni suggests a number of ways forward, and we, as early career researchers, would like to present an additional option that avoids what often appears to be a choice between doing either qualitative or quantitative work. In fact, recent calls for causal analysis within statistical practice require a strong integration of qualitative and quantitative work (Gelman & Vehtari, 2020; Pearl & Mackenzie, 2018). Here, we argue that the increasing need for clear qualitative understanding to conduct good quantitative modeling can act as a bridge between methods traditionally, and falsely, portrayed as separate.
Using statistical models to make causal inferences presents many difficulties. However, recent theoretical and analytical developments have made it possible to make causal assumptions explicit, testable, and interpretable (Pearl & Mackenzie, 2018; Textor, van der Zander, Gilthorpe, Liśkiewicz, & Ellison, 2017). Adopting causal analysis forces researchers to make their theoretical assumptions clear, including which variables are relevant to the system under study, how those variables influence one another, and how they interact. As such, researchers need sufficient qualitative understanding of their study system to develop appropriate quantitative models and avoid statistical confounds (McElreath, 2020).
Thus, the growing use of causal frameworks within statistical practice offers a promising means to reduce the mismatch between intended verbal and statistical hypotheses, as well as to facilitate communication between competing schools of thought. For example, personality research in the 1930s illustrates the emergence of the conflict between quantitative and qualitative methods, the sources of which become apparent through causal analysis. Certain assumptions, such as the assumption that only the individual and the test instrument are relevant to personality research, rendered the environment unimportant in determining personality and cemented the American quantitative approach to psychological research (Vernon, 1933). For example, tests of submissive behavior were assumed to reveal stable aspects of behavior in all contexts, regardless of the order in which items were presented, past experience, comprehension skills, and so on.
In contrast, Kurt Lewin's Berlin group held different philosophical assumptions, most notably that the experimental environment – including communication between the researcher and the participant – is a necessary part of psychological investigation. Lewin's group therefore rejected the idea that psychological phenomena could be meaningfully studied in terms of test scores and took a more qualitative approach (Van der Veer, 2000). For example, Tamara Dembo, a student of Lewin's, considered the role of the environment, as well as of the investigator, in shaping behavior. Such considerations were either ignored or actively rejected by American psychologists at the time, so much so that when Lewin emigrated to America, his ideas were marginalized. Today, most of Lewin's early papers remain untranslated, and his qualitative methods are “excluded from the experimental mainstream of American psychology” (Danziger, 1994, p. 178).
The important takeaway from this dispute is that it resulted from unchecked assumptions, many of which, as noted in Yarkoni's argument, remain in use today. It is our view that such conflicts have a hope of resolution through causal analysis, which provides a method, and in many ways a language, for researchers to collaboratively compare predictions and findings.
For example, below we use a form of causal modeling, directed acyclic graphs, to illustrate the different assumptions of two hypothetical theories of behavior (Textor et al., 2017). Every directed acyclic graph generates a set of testable implied conditional independencies that describe the model. Conditional independencies allow researchers to check that their statistical modeling matches their theoretical understanding of relationships between the variables under consideration, and to identify statistical confounds.
The first theory is represented by the directed acyclic graph in Figure 1A and resembles the assumptions of early German personality researchers. The second, Figure 1B, resembles early (and current) American assumptions about personality. The central difference between these two hypothetical approaches is the role of the environment: in A, environment is a common cause of personality and behavior and would need to be included in any statistical model estimating the direct effect of personality on behavior; in B, environment depends on behavior alone, and statistical models using this framework could include personality, environment, or both.
Figure 1. Directed acyclic graphs representing two hypothetical theories of behavior. Arrows indicate the direction of the assumed causal relationships.
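To make this concrete, below is a minimal sketch in Python (using networkx) of how the two hypothetical graphs can be encoded and one implied conditional independence checked via d-separation. The variable names, edge lists, and the small d-separation helper are our own illustration based on our reading of Figure 1; they are not code from any of the cited packages.

```python
# A minimal sketch (our own illustration): encode the two hypothetical DAGs
# from Figure 1 and check an implied conditional independence via
# d-separation, using the standard moralized-ancestral-graph criterion.
import itertools
import networkx as nx

# Theory A: environment is a common cause of personality and behavior.
dag_a = nx.DiGraph([("Environment", "Personality"),
                    ("Environment", "Behavior"),
                    ("Personality", "Behavior")])

# Theory B: environment depends on behavior alone.
dag_b = nx.DiGraph([("Personality", "Behavior"),
                    ("Behavior", "Environment")])

def d_separated(dag, x, y, given):
    """Return True if x and y are d-separated given the set `given`."""
    relevant = {x, y} | set(given)
    # Keep only the relevant nodes and their ancestors.
    nodes = set(relevant)
    for node in relevant:
        nodes |= nx.ancestors(dag, node)
    sub = dag.subgraph(nodes).copy()
    # Moralize: connect parents that share a child, then drop edge directions.
    moral = sub.to_undirected()
    for child in sub.nodes:
        for p1, p2 in itertools.combinations(sub.predecessors(child), 2):
            moral.add_edge(p1, p2)
    # Remove the conditioning set; d-separated iff no path remains.
    moral.remove_nodes_from(given)
    return not nx.has_path(moral, x, y)

# Theory A implies no conditional independencies (all three variables are
# directly linked), whereas Theory B implies Personality is independent of
# Environment once Behavior is conditioned on.
print(d_separated(dag_a, "Personality", "Environment", {"Behavior"}))  # False
print(d_separated(dag_b, "Personality", "Environment", {"Behavior"}))  # True
```

In practice, these implied independencies are exactly the statements a researcher can take to data: if they fail, either the measurements or the theory encoded in the graph needs revisiting.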
However, implementing a causal approach within psychology's current framework requires considerable changes in both research practices and pedagogy. As early career researchers, we are intimately familiar with how statistics is currently taught at universities and how the current system emphasizes quantitative methods, often with little more than a cursory glance at what qualitative methods may offer. A shift in pedagogy is essential. Quantitative and qualitative methods need to be placed on equal footing if we hope to build on and improve current research. We need both to emphasize the value of constructing strong theoretical frameworks for our systems of interest and to teach qualitative methods in a more focused way, starting at the undergraduate level. In doing so, psychological research can adopt a more structured mixed-methods approach already shown to be successful, and preferred, in other fields.
Mixed-methods approaches are increasingly popular (Bryman, 2008; Johnson & Onwuegbuzie, 2004), and we propose that directed acyclic graphs can serve as a bridge between qualitative and quantitative methods, overcoming this unnecessary methodological dichotomy while also facilitating well-informed quantitative analyses and communication between disciplines. That is, qualitative methods can help us construct directed acyclic graphs by identifying variables of interest, and the resulting graphs can then be tested with subsequent quantitative analyses. Put more explicitly, we suggest that quantitative researchers begin their studies with a directed acyclic graph and that qualitative researchers end with one. By summarizing the study system through causal analysis, researchers can easily identify what should be included (or might be missing) in subsequent research on the same topic or system.
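As a worked illustration of why the graph matters for the statistics that follow, the short simulation below generates data consistent with Theory A (Figure 1A) and regresses behavior on personality with and without adjusting for environment. All coefficients and variable names are hypothetical; the point is only that the model omitting the environment overstates the direct effect that the graph says requires adjustment.

```python
# A hypothetical simulation sketch (our own illustration, not from any cited
# work): under Theory A, environment confounds the personality-behavior
# relationship, so omitting it biases the estimated direct effect.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Simulate from Theory A: Environment -> Personality, Environment -> Behavior,
# Personality -> Behavior, with a true direct effect of personality of 0.5.
environment = rng.normal(size=n)
personality = 0.8 * environment + rng.normal(size=n)
behavior = 0.5 * personality + 0.7 * environment + rng.normal(size=n)

def ols(y, *predictors):
    """Return least-squares coefficients (intercept first)."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Omitting the environment inflates the personality coefficient (roughly 0.84
# here), while adjusting for it recovers the true direct effect (roughly 0.5).
print(ols(behavior, personality)[1])
print(ols(behavior, personality, environment)[1])
```

Under Theory B, by contrast, the unadjusted regression would already recover the direct effect, which is precisely the kind of disagreement a shared graph makes explicit before any data are collected.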
Acknowledgments
Many thanks to the members of the Banzi Lab reading and writing group for inspiration.
Financial support
This research received no specific grant from any funding agency, commercial or not-for-profit sectors.
Conflict of interest
None.