
Practitioner-oriented recommendations for advancing I-O technological research

Published online by Cambridge University Press:  09 September 2022

Matthew J. Borneman*
Affiliation:
Wonderlic Inc.
Amie Mansfield
Affiliation:
Wonderlic Inc.

Type
Commentaries
Copyright
© The Author(s), 2022. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology

As industrial-organizational (I-O) practitioners, we are commonly asked by both internal and external stakeholders about the latest research on workplace behavior. Unfortunately, when these questions broach topics related to technology, there is often frustratingly little substantive research we can share. Thus, we couldn’t be more thrilled with this urgent call for timelier, more productive research on the intersection of technology and the workplace. To this end, we extend the recommendations in White et al.’s (2022) focal article by providing a practitioner-oriented perspective on the dissemination of research findings, and we provide three additional recommendations to help facilitate the use of this research by practitioners in organizations of all types.

Dissemination of Research Findings

As practitioners seeking to leverage research findings and provide actionable recommendations to business leaders, it’s imperative that we are able to actually find relevant research. Conference presentations, especially those outside of the Society for Industrial and Organizational Psychology (SIOP) annual conference, are laudable outlets for reaching broader audiences as well as specific areas of industry. However, unless research presented at one of those conferences is published in a format that is widely accessible and searchable, its influence will be limited. Even research presented at the SIOP annual conference, which brings together thousands of academic and practitioner attendees annually, can suffer limited reach and therefore reduced influence on our field’s knowledge bases and professional practices. Indeed, there is plenty of research presented at our conference that escapes the kind of scaled dissemination that would provide near-effortless accessibility.

To this end, we recommend that the ideal state of research dissemination target open-access journals that prioritize digital publication. This not only promotes timelier publication by circumventing the lags associated with traditional print copies of a journal; it also makes the research findable via common search engines, with full-text accessibility for all, without a paywall. Although we recognize this may not be an ideal state from the standpoint of traditional publication metrics, open-access journals supply an expedited route to the broadest possible reach and potential influence by eliminating the need to jump through metaphorical hoops.

As noted, although open-access journal publication is not viable for all research, there are plenty of alternatives that facilitate accessibility for practitioners. ResearchGate is an exemplar: Formed in 2008, ResearchGate’s purpose is to address problems in the way science is created and shared. Today, there are 20 million researchers in the ResearchGate community, spanning diverse sectors across the globe (https://www.researchgate.net/about). From our perspective as I-O practitioners, it’s one of the first places we look for new research.

Another route to consider is to host research publications and presentations on any platform (e.g., personal, academic, work-related) and promote them via social media. This method can be especially effective if the researchers are connected with practitioners who use the technological tool on which the research is focused; it gets the research directly into the hands of people using and making decisions about the technology in question. Blog posts indexed by search engines can also be a useful outlet—relevant research can easily surface from a good search-engine query.

Annual reviews and chapters

We specifically want to highlight annual reviews and book/handbook chapters as additional publication media for research summaries on technology’s influence on the workplace. The regular publication cycle of these outlets provides a fantastic opportunity for disseminating literature reviews in a timely manner as well as updating them over time with the newest research conducted since the prior annual installment. Too often, the practitioner’s problem is not a lack of research but rather the lack of a convenient synthesis of the research. This challenge, coupled with typical time constraints under which practitioners may have to respond to inquiries about research on a topic, contributes to reduced research usage despite dissemination.

Consider a relevant example that occurred near the beginning of the current pandemic. We received an inquiry from our customer success team: an important client was concerned about the shift to remote work and wanted to talk it through with an expert—and the call was later that day. Ideally, we wanted to cite and discuss the latest research rather than relying on general best practice recommendations, so we searched for the most recent research synthesis on the subject of remote work. Given the tight time constraint, we tried to find the most comprehensive literature review or meta-analysis. This example is all too common a situation for practitioners. Though we each have areas of expertise, oftentimes we’re responsible for answering questions outside that area on a truncated timeline. Having a wide swath of reviews and meta-analyses to draw from would facilitate practitioners’ daily work. Imagine the effective support we could have provided that client—and their appreciation—if we had been able to cite and discuss the influence of virtuality on team performance (e.g., Purvanova & Kenda, 2021) on that call rather than waiting a year and a half for the research to be accepted for publication.

Use modern and relevant terminology

Along with the timely dissemination of research findings, a primary component of maximizing research usage is ensuring that the terminology used in titles, abstracts, and keywords is kept up to date and contains the language I-O practitioners and other professionals use in the workplace. Repurposing the example from the prior section illustrates this point well. The client was concerned about the switch to remote work. Popular media had long been talking about remote work. I-O psychologists regularly communicate about remote work. Consequently, it was a remarkably frustrating experience in the early part of 2020 to seek out research on remote work and come up with surprisingly little. Was there a dearth of research on remote work, despite the concept of remote work being around for several decades? As it turned out, the antiquated terms “telework” and “telecommuting” yielded much more fruitful results, tapping into the decades of research.

Obviously, it’s impossible to go back in time and retroactively modernize the terminology used in prior published research, though this further highlights the need for ongoing reviews of and updates to the literature. As terminology and the world of work change, it’s imperative that researchers’ and publishers’ terminologies keep pace. Ongoing maintenance of research reviews and book chapters moves us closer to bridging the gap between legacy and modern terminologies as they relate to technology.

The focal authors’ fifth recommendation, a technological solution to bridging the gap between legacy and modern terminology, is conceivable. Systems like metaBUS (Bosco et al., 2015) are already in place—a feature enhancement akin to a map of technological terms that automatically includes alternative terms in a search query is almost certainly achievable.
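To make the idea concrete, such a term map amounts to simple query expansion: a lookup from a modern term to its legacy synonyms, ORed together into one boolean search string. The sketch below is purely illustrative—the term pairs and function names are our own assumptions, not an existing metaBUS feature:

```python
# Illustrative sketch of a legacy-term map for literature search.
# The mapping entries are hypothetical examples, not a real metaBUS feature.
TERM_MAP = {
    "remote work": ["telework", "telecommuting", "distributed work"],
    "video interview": ["technology-mediated interview",
                        "videoconference interview"],
}

def expand_query(query: str) -> str:
    """Return a boolean search string that ORs the modern term
    with any known legacy synonyms."""
    terms = [query] + TERM_MAP.get(query.lower(), [])
    return " OR ".join(f'"{t}"' for t in terms)

print(expand_query("remote work"))
# "remote work" OR "telework" OR "telecommuting" OR "distributed work"
```

A practitioner searching for “remote work” in 2020 would then automatically tap the decades of “telework” and “telecommuting” research described above, without needing to know the antiquated terms in advance.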

Update prior meta-analyses

If there is one thing we learned from a read of Sackett et al. (2021; and there were a GREAT many things), it’s that we, as a field, need to stop accepting published meta-analyses uncritically, particularly published meta-analytic work that predates modern reporting and reviewing standards. The world of work changes, society changes, research changes, and, most germane to this commentary, technology changes. If the cliché holds that “the only constant is change,” then relying on decades-old meta-analytic findings and assuming their relevance endures could be a grave failing for our entire field.

Consider as an illustrative example the seminal meta-analysis on the predictive validity of the employment interview (McDaniel et al., 1994). As of this writing, that research is nearly 30 years old, and a substantial new body of work on the validity of the employment interview has emerged. Unsurprisingly, there’s not a technological moderator to be found in that work, as much of the relevant technology that intersects with the employment interview process was developed after the work was initially published. This raises the rhetorical question: Where’s the update to this work that includes additional moderators, especially as they relate to technology?

Certainly, there has been some work to this end since that initial publication. For example, Thorsteinson’s (2018) recent meta-analysis does include a single technological moderator (i.e., interview medium) yet makes no attempt to be as comprehensive as McDaniel et al. (1994) because the analysis was scoped differently and focused on interview length; only studies reporting interview length were included. Consequently, this introduces some second-order sampling error, as small k and N values are reported for some of those relationships. Additionally, Blacksmith et al. (2016) conducted a meta-analysis on technology-mediated employment interviews yet focused their research on applicant reactions and ratings rather than predictive validity. Given the seismic cultural shift brought about by the COVID-19 pandemic, which pushed technology-mediated interviews into ubiquity, we may already question the relevance of those findings for present-day applications.

All said, the point here is not to identify a new research agenda for the employment interview. Rather, the goal is to make the broader point related to moderating effects that emerge over time and alter the relationships we study. By failing to keep our field’s meta-analyses up to date with newer research—especially with respect to newer technologies—we leave ourselves with little choice but to rely on findings that may not be serving our modern practices well. In fact, without updated systematic reviews that incorporate technological moderators, practitioners are (more often than we care to admit) likely to rely on gut instinct.

Include practitioners in the review process

A final recommendation on how to improve I-O research as it intersects with technology is to include practitioners in the review process. Obviously, we are not the first to make this recommendation (e.g., Geimer et al., 2020), but the very nature of technology research as it applies to the workplace places paramount importance on practitioners’ role in advancing this science. The focal article highlights the fact that technology research lacks an integrated theoretical framework and is often practically focused. Therefore, the inclusion of practitioners who actually use the technology would add valuable perspective to the editorial process.

With experience in the practical applications of technology, practitioners can provide unique insights to the interpretation of findings, analytic and design choices, and implications of research results. As the focal article notes, practice is often a long way ahead of the research in the realm of technology. But practitioner I-O psychologists working with technology likely have experience working through issues (especially less-frequent ones) that may arise in its practical use. For example, practitioners working in unproctored internet testing frameworks likely have deep real-world expertise in resolving customer objections and overcoming connectivity challenges that may provide an additional lens of interpretation through which research may be shaped during the review process.

Additionally, with deep experience in dealing with real-world applications of technology, practitioners are well positioned to shape and support the practical recommendations brought out by the findings of a research study. One does not have to look far in the literature to find an “implications for practice” section that suggests it may not have been written by researchers wholly familiar with actual practice. Including practitioners in the review process would strengthen the practical implications and insights that are disseminated, advancing the utility of I-O psychology as a field.

Conclusion

We support and extend the recommendations of the focal article by suggesting four actionable recommendations the field of I-O psychology can take to help bridge the science–practice gap and make technological research more relevant to the community of practitioners: making research findings on technology more broadly accessible, using modern and relevant terminology on technology in publications, updating and extending meta-analyses with emerging technological moderators, and including practitioners in the review process. We believe that these recommendations will advance the practical applications of I-O psychology by improving and making more relevant the research we rely upon.

References

Blacksmith, N., Willford, J. C., & Behrend, T. S. (2016). Technology in the employment interview: A meta-analysis and future research agenda. Personnel Assessment and Decisions, 2(1), 12–20. https://doi.org/10.25035/pad.2016.002
Bosco, F. A., Steel, P., Oswald, F. L., Uggerslev, K. L., & Field, J. G. (2015). Cloud-based meta-analysis to bridge science and practice: Welcome to metaBUS. Personnel Assessment and Decisions, 1(1), 3–17. https://doi.org/10.25035/pad.2015.002
Geimer, J. L., Landers, R. N., & Solberg, E. G. (2020). Enabling practical research for the benefit of organizations and society. Industrial and Organizational Psychology: Perspectives on Science and Practice, 13(3), 334–338. https://doi.org/10.1017/iop.2020.55
McDaniel, M. A., Whetzel, D. L., Schmidt, F. L., & Maurer, S. D. (1994). The validity of employment interviews: A comprehensive meta-analysis. Journal of Applied Psychology, 79(4), 599–616. https://doi.org/10.1037/0021-9010.79.4.599
Purvanova, R. K., & Kenda, R. (2021). The impact of virtuality on team effectiveness in organizational and non-organizational teams: A meta-analysis. Applied Psychology: An International Review. Advance online publication. https://doi.org/10.1111/apps.12348
Sackett, P. R., Zhang, C., Berry, C. M., & Lievens, F. (2021). Revisiting meta-analytic estimates of validity in personnel selection: Addressing systematic overcorrection for restriction of range. Journal of Applied Psychology. Advance online publication. https://doi.org/10.1037/apl0000994
Thorsteinson, T. J. (2018). A meta-analysis of interview length on reliability and validity. Journal of Occupational and Organizational Psychology, 91(1), 1–32. https://doi.org/10.1111/joop.12186
White, J., Ravid, D., Siderits, I., & Behrend, T. S. (2022). An urgent call for I-O psychologists to produce timelier technology research. Industrial and Organizational Psychology: Perspectives on Science and Practice, 15(3), 441–459.