
For the public, it might be an evidence-based practice not to listen to I-O psychologists

Published online by Cambridge University Press:  26 May 2022

Konrad Kulikowski*
University of Social Sciences, Łódź
*Corresponding author. Email: kkulikowski@san.edu.pl

Type: Commentaries
Copyright: © The Author(s), 2022. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology

“We must face the fact that most of our work is not appreciated by those outside of academia, and for good reason” (Weiss & Shanteau, 2021, p. 11)

Rogelberg et al. (2022) suggest that to bring industrial-organizational (I-O) science closer to the public, we need to concentrate on communication efforts to better translate it. I believe this argument should be taken further: concentrating on communication as the main causal factor in the research–public gap might lead us astray from some serious problems with our field. The low influence of I-O science on the public stems not only from insufficient knowledge communication but also from shortcomings in the knowledge-generation process (Shapiro et al., 2007). Because of these flaws, I put forward the suggestion that, for members of the public dealing with real-life problems, a good evidence-based decision might be to ignore I-O scholars’ communication, which is often based on futile and misdirected research. This leads to the reflection that when thinking about how to bring I-O science to the public, we should consider not only how we are communicating but also what we are communicating in the first place. In this commentary, I invite the I-O community to reconsider the real potential of I-O knowledge to bring about real changes in organizations and people’s lives, and I suggest that this potential is lower than we like to believe. I discuss my point of view through an analysis of two dimensions of the I-O knowledge-generation process: (a) the purpose of knowledge generation and (b) the trustworthiness of the generated knowledge. Unfortunately, it is my judgment that on both of these dimensions we are not doing as well as we could.
I would like to repeat and extend previous opinions (Banks et al., 2016; Cummings, 2007; Guest, 2007; Shapiro et al., 2007) that the chance to communicate our knowledge to the public might be lost before we even start communicating, if what we intend to communicate is produced without practitioners’ and the public’s needs in the minds of I-O researchers.

For whom is I-O psychology knowledge generated—not for the public and practitioners

The first challenge with knowledge-production processes in I-O psychology lies in the questions of for whom knowledge is generated and with what goal. The answers to these questions, unfortunately, are not to generate usable evidence for practitioners and not to solve real-life problems. The majority of I-O knowledge is generated to impress other I-O researchers and reviewers, with aspirations to publish in top journals to advance our careers, but usually not to affect the world outside academia. Thus, before we start educating the public about evidence-based practices, we have to rely on the evidence ourselves, and that evidence seems to suggest that I-O science in its current form is often not relevant to the public (Davis, 2015; Guest, 2007). Instead of thinking about how to help people in their daily work, we constantly chase “counterintuitive” findings and “theoretical contributions,” putting novelty over the necessity of our research (Grote & Cortina, 2018). Instead of being obsessed with solving real-life challenges faced by organizations (Banks et al., 2016), we are obsessed with publications in our “top-tier journals” (Aguinis et al., 2020). Instead of providing the public with robust and elegant analytical insights, we conduct overly complex statistical analyses, often without a profound understanding of what we are doing (Cortina et al., 2017). All of this leads to a situation in which, as Tourish (2020) suggests, we might be described as “genuine imposters,” pretending to do work that is more practically useful than it is in reality. This is a problem of relevance. In my view, a large part of I-O psychology knowledge is irrelevant to the public, even before we make any attempt to communicate it.
Thus, we should not be surprised that the public ignores our communication.

What is the trustworthiness of I-O psychology knowledge—we can do better

When we think about the relevance of our research, we might be tempted to say that relevance for practice must be sacrificed in the name of scientific rigor. However, in recent years psychological research practices have come under scrutiny (Open Science Collaboration, 2015), showing that there is less rigor in our field than we like to think. There is a long list of burning issues with our discipline (Nosek et al., 2012), such as low statistical power and small samples, publication bias, questionable research practices, and a lack of research transparency (Efendic & Van Zyl, 2019; Grand et al., 2018). Some even suggest (Tourish, 2020) that our field is saturated with nonsense, and others (Castille, 2020) ask the serious question of whether I-O psychology is a science or a pseudoscience. Although we might not be doing quite as badly as some of the cited authors think, it seems that a culture of robust science in I-O psychology has yet to be achieved (Grand et al., 2018). When the public looks at this mess, should we be surprised that they might decide to ignore the communication that flows from us? In some cases, they may have solid evidence for doing so.

Why is I-O science not reaching the public?

I suggest that I-O knowledge is not reaching the public because, despite years of calls for change, spanning from Hambrick (1994) to Kulik (2020), and even some progress (e.g., the evidence-based management movement; see https://cebma.org/), we are still not producing real value for the public. The answer to the question “Why isn’t I-O science reaching the public?” partially lies in knowledge-communication processes, but influence on the public takes more than communication: To reach the public, we must reform the I-O knowledge-production process. I suggest that I-O knowledge is neither relevant to the public nor robust and generalizable enough to be directly applied in practical settings, and thus the general public and practitioners, when weighing and aggregating evidence from different sources, might consciously decide not to use it—often for good reason (see Chambers, 2017; Tourish, 2019).

What are some keys to translation and public consumption?

We need changes not only in the knowledge-communication processes but also in knowledge production. If we want to increase our chances of providing evidence-based practices to the public, we must start generating more relevant and usable evidence.

Improving the relevance of I-O psychology science

In general, if we want to reach the public, we should concentrate on what people expect of I-O science. There is a need to address the real problems that interest both academics and practitioners and not let I-O research lose its momentum before translation to practice (Banks et al., 2016; Cummings, 2007). First, when choosing and evaluating research topics as authors and reviewers, we should consider their relevance to solving practically important challenges, not just their theoretical novelty or statistical complexity. Second, we might strive to adopt a more collaborative rather than directive approach in the knowledge-production process, to ensure that the goals of I-O knowledge production are the same for practitioners and academics (Lawler & Benson, 2020). This means not merely educating people from an ivory-tower standpoint but also actively learning from the public and acknowledging that nonacademics have insightful knowledge that we as I-O scholars do not have. The consumers and end users of research should not only be informed about our results but also be coengaged in the knowledge-creation process if the ultimate objective of this process is to address real and relevant problems (Kulik, 2020). We have met the enemy, and it is us. If we are really interested in relevance for the public, simple cosmetic fixes will not help. We must change how we plan and conduct our research, and even what we consider to be good research. For I-O researchers, it might be more “scientific” to humbly fail when trying to make a change in a real organization than to succeed in publishing another piece of obscure, abstract theoretical information in a top journal.

Improving trustworthiness in I-O psychology science

In recent years, much has been said about how changes in research methodology might improve the quality of psychology (Efendic & Van Zyl, 2019). However, methodological advancements alone are unlikely to solve the problem of the trustworthiness of I-O knowledge if we do not change our cultural norms and values. In improving the quality of our research, we must reformulate the way we define academic success, to promote truth over publishability (Nosek et al., 2012). The first step is a more nuanced and plural perspective on research evaluation (Aguinis et al., 2014), one that takes into account not only where we publish but also what we publish and how we arrive at our findings. Second, reducing the pressure to publish might help us concentrate on solving real organizational problems. Guidelines on how to challenge the status quo in research evaluation already exist, and it is time to implement them (see, e.g., https://sfdora.org/, http://www.leidenmanifesto.org/). Any methodological advancement might easily become a thoughtless ceremony without real effects if we do not change what we value as I-O researchers. Are we currently doing our research to make a difference in the world of work or to please journal reviewers? What do we value more, influencing the public or another top publication? Third, to improve the trustworthiness of I-O knowledge, we should all be more aware of the shortcomings of our science and build humility as a central value of our research culture (Resnick, 2019). We should be honest about the limitations of our findings and avoid false precision, as one of the greatest data analysts once advised: “Far better an approximate answer to the right question, which is often vague, than the exact answer to the wrong question, which can always be made precise” (Tukey, 1962, pp. 13–14).

To conclude, I believe it might be high time for the I-O psychology community to stop for a while and think: Is the public not listening to us because they are not hearing us, or because we have nothing interesting to say?

References

Aguinis, H., Cummings, C., Ramani, R. S., & Cummings, T. G. (2020). “An A is an A”: The new bottom line for valuing academic research. Academy of Management Perspectives, 34(1), 135–154. https://doi.org/10.5465/amp.2017.0193
Aguinis, H., Shapiro, D. L., Antonacopoulou, E. P., & Cummings, T. G. (2014). Scholarly impact: A pluralist conceptualization. Academy of Management Learning & Education, 13(4), 623–639. https://doi.org/10.5465/amle.2017.0488
Banks, G. C., Pollack, J. M., Bochantin, J. E., Kirkman, B. L., Whelpley, C. E., & O’Boyle, E. H. (2016). Management’s science–practice gap: A grand challenge for all stakeholders. Academy of Management Journal, 59(6), 2205–2231. https://doi.org/10.5465/amj.2015.0728
Castille, C. M. (2020). Opening up: A primer on open science for industrial-organizational psychologists. The Industrial-Organizational Psychologist, 57(3). https://www.siop.org/Research-Publications/Items-of-Interest/ArtMID/19366/ArticleID/3293
Chambers, C. (2017). The seven deadly sins of psychology: A manifesto for reforming the culture of scientific practice. Princeton University Press.
Cortina, J. M., Green, J. P., Keeler, K. R., & Vandenberg, R. J. (2017). Degrees of freedom in SEM: Are we testing the models that we claim to test? Organizational Research Methods, 20(3), 350–378. https://doi.org/10.1177/1094428116676345
Cummings, T. G. (2007). Quest for an engaged academy. Academy of Management Review, 32(2), 355–360. https://doi.org/10.5465/amr.2007.24349184
Davis, G. F. (2015). Editorial essay: What is organizational research for? Administrative Science Quarterly, 60(2), 179–188. https://doi.org/10.1177/0001839215585725
Efendic, E., & Van Zyl, L. E. (2019). On reproducibility and replicability: Arguing for open science practices and methodological improvements at the South African Journal of Industrial Psychology. South African Journal of Industrial Psychology, 45(1), 1–10. http://dx.doi.org/10.4102/sajip.v45i0.1607
Grand, J. A., Rogelberg, S. G., Allen, T. D., Landis, R. S., Reynolds, D. H., Scott, J. C., Tonidandel, S., & Truxillo, D. M. (2018). A systems-based approach to fostering robust science in industrial-organizational psychology. Industrial and Organizational Psychology: Perspectives on Science and Practice, 11(1), 4–42. https://doi.org/10.1017/iop.2017.55
Grote, G., & Cortina, J. M. (2018). Necessity (not just novelty) is the mother of invention: Using creativity research to improve research in work and organizational psychology. European Journal of Work and Organizational Psychology, 27(3), 335–341. https://doi.org/10.1080/1359432X.2018.1444606
Guest, D. E. (2007). Don’t shoot the messenger: A wake-up call for academics. Academy of Management Journal, 50(5), 1020–1026. https://doi.org/10.5465/amj.2007.27152111
Hambrick, D. C. (1994). What if the academy actually mattered? Academy of Management Review, 19(1), 11–16. https://doi.org/10.5465/amr.1994.9410122006
Kulik, C. T. (2020). 2019 presidential address—Management scholars, end users, and the power of thinking small. Academy of Management Review, 45(2), 273–279. https://doi.org/10.5465/amr.2020.0070
Lawler, E. E., III, & Benson, G. S. (2020). The practitioner–academic gap: A view from the middle. Human Resource Management Review, 32(1), Article 100748. https://doi.org/10.1016/j.hrmr.2020.100748
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), Article aac4716. https://doi.org/10.1126/science.aac4716
Resnick, B. (2019, January 4). Intellectual humility: The importance of knowing you might be wrong. Vox. https://www.vox.com/science-and-health/2019/1/4/17989224/intellectual-humility-explained-psychology-replication
Rogelberg, S. G., King, E. B., & Alonso, A. (2022). How we can bring I-O psychology science and evidence-based practices to the public. Industrial and Organizational Psychology: Perspectives on Science and Practice, 15(2), 259–272.
Shapiro, D. L., Kirkman, B. L., & Courtney, H. G. (2007). Perceived causes and solutions of the translation problem in management research. Academy of Management Journal, 50, 249–266. https://doi.org/10.2307/20159853
Tourish, D. (2019). Management studies in crisis: Fraud, deception and meaningless research. Cambridge University Press.
Tourish, D. (2020). The triumph of nonsense in management studies. Academy of Management Learning & Education, 19(1), 99–109. https://doi.org/10.5465/amle.2019.0255
Tukey, J. W. (1962). The future of data analysis. Annals of Mathematical Statistics, 33(1), 1–67. https://www.jstor.org/stable/2237638
Weiss, D. J., & Shanteau, J. (2021). The futility of decision making research. Studies in History and Philosophy of Science Part A, 90, 10–14. https://doi.org/10.1016/j.shpsa.2021.08.018