
The baby and the bathwater: On the need for substantive–methodological synergy in organizational research

Published online by Cambridge University Press:  14 December 2021

Joeri Hofmans*
Affiliation:
Vrije Universiteit Brussel, Belgium
Alexandre J. S. Morin
Affiliation:
Concordia University, Canada
Heiko Breitsohl
Affiliation:
University of Klagenfurt, Austria
Eva Ceulemans
Affiliation:
KU Leuven, Belgium
Léandre Alexis Chénard-Poirier
Affiliation:
Université du Québec à Montréal, Canada
Charles C. Driver
Affiliation:
University of Zurich, Switzerland
Claude Fernet
Affiliation:
Université du Québec à Trois-Rivières, Canada
Marylène Gagné
Affiliation:
Curtin University, Australia
Nicolas Gillet
Affiliation:
Université de Tours, France, & Institut Universitaire de France (IUF), France
Vicente González-Romá
Affiliation:
University of Valencia, Spain
Kevin J. Grimm
Affiliation:
Arizona State University, USA
Ellen L. Hamaker
Affiliation:
Utrecht University, The Netherlands
Kit-Tai Hau
Affiliation:
The Chinese University of Hong Kong, Hong Kong SAR
Simon A. Houle
Affiliation:
Concordia University, Canada
Joshua L. Howard
Affiliation:
Monash University, Australia
Rex B. Kline
Affiliation:
Concordia University, Canada
Evy Kuijpers
Affiliation:
Vrije Universiteit Brussel, Belgium
Theresa Leyens
Affiliation:
Vrije Universiteit Brussel, Belgium
David Litalien
Affiliation:
Université Laval, Canada
Anne Mäkikangas
Affiliation:
Tampere University, Finland
Herbert W. Marsh
Affiliation:
Australian Catholic University, Australia, & The University of Oxford, UK
Matthew J. W. McLarnon
Affiliation:
Mount Royal University, Canada
John P. Meyer
Affiliation:
The University of Western Ontario, Canada & Curtin University, Australia
Jose Navarro
Affiliation:
University of Barcelona, Spain
Elizabeth Olivier
Affiliation:
Université de Montréal, Canada
Thomas A. O’Neill
Affiliation:
University of Calgary, Canada
Reinhard Pekrun
Affiliation:
University of Essex, United Kingdom, and Australian Catholic University, Australia
Katariina Salmela-Aro
Affiliation:
University of Helsinki, Finland
Omar N. Solinger
Affiliation:
Vrije Universiteit Amsterdam, The Netherlands
Sabine Sonnentag
Affiliation:
University of Mannheim, Germany
Louis Tay
Affiliation:
Purdue University, USA
István Tóth-Király
Affiliation:
Concordia University, Canada
Robert J. Vallerand
Affiliation:
Université du Québec à Montréal, Canada
Christian Vandenberghe
Affiliation:
HEC Montréal, Canada
Yvonne G. T. van Rossenberg
Affiliation:
Institute for Management Research, Radboud University, The Netherlands
Tim Vantilborgh
Affiliation:
Vrije Universiteit Brussel, Belgium
Jasmine Vergauwe
Affiliation:
Ghent University, Belgium
Jesse T. Vullinghs
Affiliation:
Vrije Universiteit Brussel, Belgium
Mo Wang
Affiliation:
University of Florida, USA
Zhonglin Wen
Affiliation:
South China Normal University, China
Bart Wille
Affiliation:
Ghent University, Belgium
*Corresponding author. Email: joeri.hofmans@vub.be

© The Author(s), 2021. Published by Cambridge University Press on behalf of the Society for Industrial and Organizational Psychology

Murphy (2021) argues that the field of industrial-organizational (I-O) psychology needs to pay more attention to descriptive statistics (“Table 1”; e.g., M, SD, reliability, correlations) when reporting and interpreting results. We agree that authors need to present a clear and transparent description of their data and that descriptive statistics and plots can be helpful in making sense of one’s data and analyses (Tay et al., 2016). Many journals already require this. Although this information can be presented in the manuscript, more details can be placed in online supplements where there are fewer space limitations (e.g., detailed presentation and discussion of descriptive statistics, missing data and outliers, plots and diagrams, conceptual issues, and computer syntax). However, we strongly disagree with Murphy’s claim that “increasing complexity and diversity of data-analytic methods in organizational research has created several problems in our field” (p. X). This claim suffers from two important oversights: (a) It neglects the crucial role of methodological fit, or the notion that theory, methods, and analyses need to be aligned, and (b) it neglects the fact that in I-O research, most constructs are not directly observable but need to be inferred indirectly through latent variable models. We expand on both issues, using examples to illustrate that the complexity and diversity of data-analytic methods are not a threat but a blessing for I-O research (and beyond). Finally, we conclude by highlighting the need for substantive–methodological synergies to solve some of the issues raised by Murphy.

The importance of methodological fit

Methodological fit refers to the “internal consistency among elements of a research project” (Edmondson & McManus, 2007, p. 1155). It concerns the alignment among research questions, prior work on the topic, research design, analyses, and contribution (Hamaker et al., 2020). Methodological fit implies that data-analytic methods need to be attuned to theoretical questions, an idea that is reflected in the statement “extraordinary claims require extraordinary evidence.” Human behavior is dynamic, multifaceted, and complex, and it occurs in interaction with equally complex social systems. We thus need sophisticated designs and methods to capture this complexity if we want to understand human behavior. In I-O research, this complexity takes several forms (discussed below), thus placing requirements on our methods, all of which go beyond the consideration of descriptive statistics.

Many phenomena in I-O psychology are multidetermined

Because many of the phenomena in which I-O psychologists are interested are multidetermined (involving dispositional, situational, organizational, and societal sources of influence) and embedded in complex causal chains (including mediation and moderation), methodological fit implies that multivariate data-analytic methods are required to address these issues. Although several examples could be given, one that recently gained attention is balanced need satisfaction. More precisely, individuals with balanced satisfaction of the needs for autonomy, competence, and relatedness should experience higher levels of well-being than people with the same aggregate level of, yet less balanced, need satisfaction (Sheldon & Niemiec, 2006). Importantly, the role of need (im)balance cannot be tested using descriptive statistics because it necessitates an inherently multivariate approach (Gillet et al., 2020).
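
To make this concrete, here is a minimal sketch of such a multivariate test on simulated data. The operationalization of imbalance (the within-person variance across the three need scores), the variable names, and the effect sizes are illustrative assumptions, not the cited authors’ exact model:

```python
# Does need *balance* predict well-being beyond the aggregate *level* of
# need satisfaction? Simulated data; an inherently multivariate question.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
needs = rng.normal(size=(n, 3))            # autonomy, competence, relatedness
level = needs.mean(axis=1)                 # aggregate need satisfaction
imbalance = needs.var(axis=1)              # dispersion across the three needs
wellbeing = 0.5 * level - 0.3 * imbalance + rng.normal(scale=0.5, size=n)

X = sm.add_constant(np.column_stack([level, imbalance]))
fit = sm.OLS(wellbeing, X).fit()
print(fit.summary(xname=["const", "level", "imbalance"]))
```

No univariate descriptive statistic of the three need scores separately can recover the imbalance effect; it only exists as a joint function of the indicators.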

Much of our data have a nested data structure

Because people typically work in teams, and teams in organizations, I-O data often have a nested data structure. The same is true when repeated measurements are nested within individuals. Such data structures create dependencies in the data (e.g., the commitment of employees working within the same group is likely to be more similar than that of employees from different groups; Schreurs et al., 2021). These dependencies need to be considered to properly analyze such data, necessitating multilevel methods (Morin et al., 2021). Two examples show how crucial methodological fit is for multilevel data. In educational psychology, the big-fish-little-pond effect (Marsh et al., 2014) shows that the effect of classroom levels of achievement on academic self-concept (negative) differs from that of individual levels of achievement (positive) due to social comparison processes. Likewise, McCormick et al.’s (2020) meta-analysis shows that, when comparing between-person and within-person associations (repeated measurements), associations differ across levels of analysis 24.1% of the time. Thus, when dealing with nested data, one needs to take these dependencies into account, which necessitates multilevel or time-structured models. This means that raw scores and descriptive statistics alone might not be sufficiently informative.
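
As an illustration of why the decomposition matters, the following sketch (simulated data, hypothetical names) group-mean centers a predictor and fits a random-intercept model in which the within- and between-team slopes are deliberately constructed to have opposite signs, in the spirit of the big-fish-little-pond effect:

```python
# Separating within- and between-group effects in nested data via
# group-mean centering and a random-intercept mixed model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
teams = np.repeat(np.arange(50), 10)          # 50 teams x 10 employees
team_mean_x = rng.normal(size=50)[teams]
x = team_mean_x + rng.normal(size=500)
# Opposite-sign effects across levels, by construction:
y = 0.4 * (x - team_mean_x) - 0.4 * team_mean_x + rng.normal(size=500)

d = pd.DataFrame({"y": y, "x": x, "team": teams})
d["x_between"] = d.groupby("team")["x"].transform("mean")
d["x_within"] = d["x"] - d["x_between"]

m = smf.mixedlm("y ~ x_within + x_between", d, groups=d["team"]).fit()
print(m.summary())   # the two slopes diverge; a pooled slope would mislead
```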

Psychological phenomena are usually dynamic

I-O research is often interested in phenomena that develop, evolve, and change over time. These phenomena can have a beginning, a development, an evolution, and a completion, and many of them (e.g., self-esteem) are known to present trait and state components (Perinelli & Alessandri, 2020). Job attitudes change over time, performance is dynamic, and affect at work fluctuates considerably, to mention just a few examples. Expecting these changes and fluctuations to be accurately represented by simple descriptive statistics is, at best, unrealistic, and becomes impossible when facing nonlinearity and interindividual variation in shape (e.g., Navarro et al., 2020). Even when considering the relatively simple case of mediation, Murphy (2021) implicitly assumes that we can test mediation from a static perspective, which ignores the fact that most of our theories assume the presence of dynamic psychological processes that unfold over time. For this reason, accurate tests of mediation require longitudinal designs (Cole & Maxwell, 2003) and the ability to disaggregate sources of within-person versus between-person variation.¹
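
A simple way to see what sample-level descriptives average away is a random-slope growth model: in the simulated sketch below (hypothetical names and values), each person has their own rate of change, and the random-slope variance quantifies the interindividual differences in trajectories that a single overall trend line would hide:

```python
# A latent growth-style model via a mixed model with random intercepts
# and random slopes over time. Simulated data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n_persons, n_waves = 100, 6
pid = np.repeat(np.arange(n_persons), n_waves)
t = np.tile(np.arange(n_waves), n_persons)
slope_i = rng.normal(0.3, 0.4, n_persons)[pid]   # person-specific growth rates
y = 2.0 + slope_i * t + rng.normal(scale=0.5, size=pid.size)

d = pd.DataFrame({"y": y, "time": t, "pid": pid})
m = smf.mixedlm("y ~ time", d, groups=d["pid"], re_formula="~time").fit()
print(m.summary())   # random-slope variance = real heterogeneity in change
```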

The questionable assumption of population homogeneity

Most I-O studies implicitly assume that a single set of “averaged” parameters can be used to describe the population. Yet, awareness is growing that this assumption is often too simplistic (Meyer & Morin, 2016). Hofmans et al. (2020) argue that several of our theories imply population heterogeneity rather than homogeneity: People can hold distinct configurations on a series of indicators reflecting their career orientation (McLarnon et al., 2015), their motivation (Tóth-Király et al., 2021), or their commitment (Meyer & Morin, 2016), and these can follow distinct longitudinal trajectories (Fernet et al., 2020). To detect heterogeneity, person-centered analyses are needed (Meyer & Morin, 2016). For instance, Solinger et al. (2013) revealed that the bond between newcomers and their organizations can develop in different ways, whereas Morin et al. (2013) demonstrated the indissociable nature of self-concept levels and stability. Identifying profiles of employees is also more naturally aligned with managers’ tendency to think in terms of categories (Mäkikangas & Kinnunen, 2016). The person-centered approach is thus particularly helpful for guiding intervention strategies tailored to the needs of distinct types of employees, an approach that has yielded benefits for burnout interventions (Hätinen et al., 2009). If we want to relax the often unrealistic assumption of population homogeneity, we need to embrace complex data analyses. Moreover, heterogeneity cannot be inferred from the inspection of descriptive statistics and correlations, which suggests that the interpretation of such sample-level statistics can be misleading.
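
As a minimal person-centered illustration, the sketch below fits Gaussian mixture models to simulated data containing two hypothetical profiles with distinct indicator configurations and uses the BIC to select the number of profiles; real latent profile analyses involve many additional decisions, but the logic is the same:

```python
# Recovering latent profiles with a Gaussian mixture model; the number of
# profiles is chosen by BIC. Simulated data with two built-in profiles.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(3)
# Two hypothetical profiles with different indicator configurations:
profile_a = rng.normal([3.0, 1.0, 2.0], 0.5, size=(150, 3))
profile_b = rng.normal([1.0, 3.0, 1.0], 0.5, size=(150, 3))
X = np.vstack([profile_a, profile_b])

for k in range(1, 5):
    gmm = GaussianMixture(n_components=k, random_state=0).fit(X)
    print(k, round(gmm.bic(X), 1))   # BIC should bottom out near k = 2
```

Note that the sample-level means and correlations of these indicators would look unremarkable: the profile structure only becomes visible through the mixture model.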

Many theories, even simple in appearance, cannot be empirically tested without complex data analyses

Many good theoretical models seek to capture the complex nature of human reality,² which requires complex data analyses. For instance, Lawler’s (1992) theory of empowerment—highlighting the role of complementariness and coherence in leaders’ empowerment practices—has long been used in textbooks and as a guide for intervention. However, a proper test of this theory was lacking until Chénard-Poirier et al. (2017) found partial support for its propositions using a hybrid mixture regression approach. Similarly, properly testing the dynamic model of the psychological contract requires dual regime models. Indeed, dual regime models “mimic the theoretical processes underlying the elicitation of violation feelings via two model components: A binary distribution that models whether an event in one’s work environment leads to a crossing of the acceptance limits of the psychological contract and a count distribution that models how severe the negative effect of this crossing is” (Hofmans, 2017, p. 8). Finally, combining many of the previous issues (person-centered, multilevel, theoretical complexity), O’Neill et al. (2018) identified distinctive patterns of task, process, and relationship conflict at the team level that “variable-centered” studies had failed to uncover.
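
To convey the dual regime logic, here is a rough two-part sketch on simulated data: a logistic regression for whether the acceptance limits are crossed, and a count regression for severity among those who crossed them. Fitting the two parts separately approximates a hurdle model (a full implementation would use a truncated count distribution); Hofmans (2017) describes the actual approach, and all names and coefficients below are illustrative assumptions:

```python
# Dual regime idea: a binary component (is the acceptance limit crossed?)
# paired with a count component (how severe is the violation?).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 1000
pred = rng.normal(size=n)                                  # hypothetical predictor
crossed = rng.binomial(1, 1 / (1 + np.exp(-pred)))         # regime 1: crossing
severity = crossed * rng.poisson(np.exp(0.2 + 0.5 * pred)) # regime 2: severity

d = pd.DataFrame({"pred": pred, "crossed": crossed, "severity": severity})
part1 = smf.logit("crossed ~ pred", d).fit(disp=0)               # binary regime
part2 = smf.poisson("severity ~ pred", d[d.crossed == 1]).fit(disp=0)  # count regime
print(part1.params, part2.params, sep="\n")
```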

As illustrated, merely examining descriptive statistics is not sufficiently informative for many of our research questions, unless we are ready, as a field, to dramatically simplify our theories. In that sense, Murphy (2021) seems to argue in favor of theoretical abstraction and simplicity (assuming a limited set of grand universal laws; Healy, 2017), whereas we believe that there is also value in understanding complexity (Tsoukas, 2017). Life is inherently complex—it is interconnected, multifaceted, paradoxical, ever-changing—and advanced statistics help us come to grips with it.

The need for latent variable models

In I-O psychology, many constructs are not directly observable. We rely on questionnaire data, where responses to multiple items are assumed to reflect an underlying psychological construct. Unlike directly observable variables (such as sex or tenure), these constructs do not possess readily established measurement units; rather, their units emerge from our data-analytic models and can be a function of the response scale or of the distribution of scores obtained in the sample (standardization) or population (norms; Meyer & Morin, 2016). Our measures are thus, by definition, imperfect, which makes descriptive statistics equally imperfect. Indirect measurement poses several challenges, which can be resolved using latent variable models. In what follows, we list some of these challenges to illustrate why latent variable models are critical for advancing our understanding of I-O phenomena.

As a starting point, studies that use multiple indicators to measure any construct should routinely test the measurement model relating these indicators to the latent factors. Without support for the measurement of the constructs, subsequent analyses are dubious. The full measurement model should be presented in sufficient detail to be evaluated, perhaps in online supplements. The latent correlation matrix among constructs based on this measurement model is more useful and accurate than the manifest correlation matrix suggested by Murphy (2021). Thus, the measurement model is central both for Murphy’s descriptive goals and as a bridge to more complex models.
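
A minimal sketch of such a measurement model check follows, here using the Python package semopy with lavaan-style syntax (an assumption for illustration; any SEM software would do). The two-factor structure, loadings, and indicator names are simulated; the latent covariance the model returns is the disattenuated counterpart of the manifest correlation between scale scores:

```python
# A two-factor confirmatory measurement model on simulated indicators.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(11)
n = 400
f1, f2 = rng.multivariate_normal([0, 0], [[1, .4], [.4, 1]], size=n).T
cols = {f"x{i+1}": f + rng.normal(scale=0.8, size=n)
        for i, f in enumerate([f1, f1, f1, f2, f2, f2])}
data = pd.DataFrame(cols)

desc = """F1 =~ x1 + x2 + x3
F2 =~ x4 + x5 + x6
F1 ~~ F2"""
model = semopy.Model(desc)
model.fit(data)
print(model.inspect())   # loadings, uniquenesses, latent covariance
```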

Random measurement error

Due to the imperfect nature of our measures, manifest scores contain random measurement error, which attenuates our estimates of associations between variables. In most research settings, reliable measurement (i.e., true score variance: the total variance minus the variance due to random measurement error) is reflected in the covariance among ratings obtained across various indicators of the same construct, whereas the unique part of each indicator incorporates random measurement error. In nonlatent analytic models, however, these two sources of variance are conflated. Whereas latent variable models naturally separate them, allowing for tests of associations corrected for random measurement error, properly accounting for measurement error is far more complex than simply relying on latent variable models. For example, Marsh and Hau (1996) demonstrated that test–retest correlations based on manifest variables tend to be upwardly biased by the failure to account for longitudinal correlated uniquenesses among the matching indicators used repeatedly over time. Marsh et al. (2010a) similarly demonstrated the need to account for wording effects (e.g., negative wording, parallel wording) to achieve an accurate representation of the structure of our constructs. With multilevel data, measurement error due to “interitem agreement” occurs separately across levels of analysis, and “interrater agreement” between members of the higher-level reality (e.g., a workgroup) is also likely to bias measurement (Morin et al., in press). To make matters worse, Marsh et al. (2010b) demonstrated that rather than attenuating associations, multilevel sources of measurement error, in combination, can create artificial associations between constructs (referred to as phantom effects). Finally, despite their well-controlled nature, laboratory experiments are subject to the same challenges, including biased estimates of intervention effects (Breitsohl, 2019). These examples all demonstrate that descriptive results based on manifest variables can be misleading when working with unobservable constructs.
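
The basic attenuation point can be verified in a few lines: with reliabilities rel_x and rel_y, classical test theory implies an expected manifest correlation of r_true × sqrt(rel_x × rel_y), which the simulation below (arbitrary illustrative values) reproduces:

```python
# Attenuation of a correlation by random measurement error.
import numpy as np

rng = np.random.default_rng(13)
n, r_true, rel_x, rel_y = 100_000, 0.50, 0.70, 0.80
tx, ty = rng.multivariate_normal([0, 0], [[1, r_true], [r_true, 1]], size=n).T
# Error variance chosen so observed reliability equals rel_x / rel_y:
x = tx + rng.normal(scale=np.sqrt(1 / rel_x - 1), size=n)
y = ty + rng.normal(scale=np.sqrt(1 / rel_y - 1), size=n)

print(np.corrcoef(x, y)[0, 1])           # ~0.37, not the true 0.50
print(r_true * np.sqrt(rel_x * rel_y))   # classical attenuation formula
```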

Disentangling distinct sources of variance

Recent developments have shown that measurement issues tend to be far more complex than previously believed (Morin et al., 2020). For example, statistical research has highlighted the need to account for distinct forms of true score variance in conceptually related and hierarchically ordered constructs (Morin et al., 2017). Exploratory structural equation modeling has been recommended as a way to account for the presence of conceptually related constructs by incorporating cross-loadings (Asparouhov et al., 2015), whereas bifactor modeling has been recommended for hierarchically ordered constructs (Morin et al., 2017). In research involving ratings from different sources (e.g., teams, supervisors, assessment centers), more complex latent variable models and mixed-effects models for cross-classified data are needed to isolate the multiple sources of variability present in these ratings (O’Neill et al., 2015). Such latent variable techniques have advanced our knowledge of measurement in a way that would have been impossible using descriptive statistics, in addition to helping model the complexity of real-world phenomena.

Improved techniques for testing moderation

Moderator effects are central to many models and theories. Murphy (2021) rightfully argues that tests of moderation suffer from several issues, including low reliability (and associated low power) of the interaction term. However, rather than taking a step back and reverting to descriptive statistics, it is equally valuable to take a step forward and work with latent variable models, which make it possible to tackle moderation in a way that accounts for unreliability (Marsh et al., 2013). Once again, why not use the strengths of our complex data-analytic methods, which offer clear solutions to many of the issues that we face in our field?
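
The reliability problem Murphy highlights is easy to demonstrate by simulation: in the sketch below (arbitrary values), adding measurement error to the two predictors shrinks the estimated interaction to a fraction of its true size, which is precisely the bias that latent moderation approaches are designed to correct:

```python
# Measurement error attenuates manifest interaction (moderation) estimates.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(17)
n = 5000
tx, tz = rng.normal(size=(2, n))                       # true predictor scores
y = 0.3 * tx + 0.3 * tz + 0.3 * tx * tz + rng.normal(size=n)
x = tx + rng.normal(scale=0.8, size=n)                 # unreliable measures
z = tz + rng.normal(scale=0.8, size=n)

X = sm.add_constant(np.column_stack([x, z, x * z]))
print(sm.OLS(y, X).fit().params)   # interaction estimate well below 0.3
```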

The baby and the bathwater: The need for substantive–methodological synergy

Having argued that complex data-analytic models are critical to addressing the complex questions that we ask in I-O research, we share Murphy’s (2021) concern about the incorrect application and interpretation of those methods and the growing science–practice gap. This phenomenon has been known for decades (see Borsboom, 2006; Marsh & Hau, 2007) and can be tied to multiple issues, including (a) the lack of proper statistical training in graduate school; (b) the fact that, after graduate school, applied researchers often struggle to keep up with the fast pace of methodological innovations and the equally fast-paced evolution of their theoretical fields; (c) the fact that methodological experts sometimes lose sight of the true needs of applied researchers; and (d) the fact that statistical innovations are often presented in a formal (equation-based) manner that falls beyond the understanding of applied researchers. However, Murphy seems to fail to recognize that “simple” techniques are only seemingly simpler because they make more simplifying assumptions, many of which may be false (oversimplification). In other words, the “burden of assumptions” may be heavier with simple techniques. Moreover, when pushed to the extreme, resorting to descriptive statistics to capture reality may come to imply that inferential statistics and modeling should be replaced by storytelling. Despite our sympathy with Murphy’s claim that more attention should be afforded to descriptive statistics, we thus believe in an alternative and more creative route: substantive–methodological synergies.

The term substantive–methodological synergy was proposed by Marsh and Hau (2007) to describe joint ventures in which new methods provide novel insights into important substantive issues. Such joint ventures involve collaboration between substantive and methodological experts to ensure that (a) the methods are applied correctly to match the needs of the research area and the findings are translated in a meaningful way to applied researchers and practitioners, and (b) new methodological developments are connected to the needs of applied researchers and translated in a way that makes sense to them. For example, the growing field of big data and data science in I-O psychology requires both theoretical and methodological sophistication (Woo et al., 2020). Thus, rather than reverting to the simplest tools, I-O scholars need to acknowledge that human behavior is complex and that advanced methods are needed to capture that complexity. Unfortunately, true substantive–methodological synergies (i.e., articles in which theory and methods are positioned as dual objectives) remain unwelcome in many I-O journals, due to the false idea that articles should tackle one main objective and that methodologically oriented articles should be sent to methodological journals. Moreover, providing complete and accurate coverage and interpretation of both theoretical and methodological components typically requires more space than is often available in I-O journals.

We hope that this article might reduce these obstacles and pave the way for substantive–methodological synergies in I-O research. Rather than throwing the baby (i.e., proper statistical modeling) out with the soiled bathwater (the challenges posed by the correct application and interpretation of these methods), substantive–methodological synergies make it possible for I-O psychologists to solve the true problems raised by Murphy (2021) without sacrificing the theoretical richness of our field. Researchers should not shy away from complex methods, which were born out of a genuine need to model human complexity. We do not see these synergies as the only way forward but as one among many (including, e.g., better statistical training) that may help to reduce the ever-increasing gap between statistical developments and applied research. We are also not claiming that “simple” research has no value or that fundamental statistical developments should stop. Rather, we are simply claiming that more work is needed to bridge the two.

Finally, sharing Murphy’s (2021) concern about the science–practice gap, we agree that we need to educate ourselves and our readers more thoroughly regarding the functionality and interpretability of our models and make a concerted effort to explain how our findings can inform practice. If anything, the need for clarity increases as questions and analyses become more complex. However, clarity is not synonymous with simplicity, and, although we agree that “Table 1” is important, we have provided several examples showing that “simple” statistics can be misleading and/or inappropriate for the research question at hand. It is thus incumbent on researchers seeking substantive–methodological synergy to keep in mind the need to communicate their findings clearly in order to achieve that synergy.

Footnotes

The first and second author contributed equally to this article and their order was determined at random. Both should be considered first authors. Beyond the first and second authors, the order of appearance of all other coauthors was determined alphabetically (as a function of their family names).

The second author was supported by a grant from the Social Sciences and Humanities Research Council of Canada (435-2018-0368).

1 In addition, contrary to Murphy’s (2021) claim, it is not necessary for x and y to be correlated for partial mediation to exist in the form of x → m → y, given that the indirect x–y relationship can have a different sign than the direct x–y relationship, so that the two may cancel each other out (Zhao et al., 2010).
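
A small simulation (arbitrary coefficients) makes this footnote concrete: the direct and indirect effects below are constructed to cancel, so cor(x, y) is essentially zero even though a genuine x → m → y indirect effect exists:

```python
# Indirect effect (0.5 * 0.5 = +0.25) offsets the direct effect (-0.25),
# so x and y are uncorrelated despite real mediation.
import numpy as np

rng = np.random.default_rng(19)
n = 100_000
x = rng.normal(size=n)
m = 0.5 * x + rng.normal(size=n)
y = 0.5 * m - 0.25 * x + rng.normal(size=n)

print(np.corrcoef(x, y)[0, 1])   # ~0, yet the indirect path is real
```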

2 Although some theories themselves might be too complex to be truly useful (Saylors & Trafimow, 2021).

References

Asparouhov, T., Muthén, B. O., & Morin, A. J. S. (2015). Bayesian structural equation modeling with cross-loadings and residual covariances: Comments on Stromeyer et al. Journal of Management, 41(6), 1561–1577. https://doi.org/10.1177/0149206315591075
Borsboom, D. (2006). The attack of the psychometricians. Psychometrika, 71(3), 425–440. https://doi.org/10.1007/s11336-006-1447-6
Breitsohl, H. (2019). Beyond ANOVA: An introduction to structural equation models for experimental designs. Organizational Research Methods, 22(3), 649–677. https://doi.org/10.1177/1094428118754988
Chénard-Poirier, L. A., Morin, A. J. S., & Boudrias, J. S. (2017). On the merits of coherent leadership empowerment behaviors: A mixture regression approach. Journal of Vocational Behavior, 103, 66–75. https://doi.org/10.1016/j.jvb.2017.08.003
Cole, D. A., & Maxwell, S. E. (2003). Testing mediational models with longitudinal data. Journal of Abnormal Psychology, 112(4), 558–577. https://doi.org/10.1037/0021-843X.112.4.558
Edmondson, A. C., & McManus, S. E. (2007). Methodological fit in management field research. Academy of Management Review, 32(4), 1155–1179. https://doi.org/10.5465/amr.2007.26586086
Fernet, C., Morin, A. J. S., Austin, S., Gagné, M., Litalien, D., Lavoie-Tremblay, M., & Forest, J. (2020). Self-determination trajectories at work: A growth mixture analysis. Journal of Vocational Behavior, 121, Article 103473. https://doi.org/10.1016/j.jvb.2020.103473
Gillet, N., Morin, A. J. S., Huart, I., Colombat, P., & Fouquereau, E. (2020). The forest and the trees: Investigating the globality and specificity of employees’ basic need satisfaction at work. Journal of Personality Assessment, 102(5), 702–713. https://doi.org/10.1080/00223891.2019.1591426
Hamaker, E. L., Mulder, J. D., & van Ijzendoorn, M. H. (2020). Description, prediction and causation: Methodological challenges of studying child and adolescent development. Developmental Cognitive Neuroscience, 46, Article 100867. https://doi.org/10.1016/j.dcn.2020.100867
Hätinen, M., Kinnunen, U., Mäkikangas, A., Kalimo, R., Tolvanen, A., & Pekkonen, M. (2009). Burnout during a long-term rehabilitation: Comparing low burnout, high burnout-benefited, and high burnout-not benefited trajectories. Anxiety, Stress & Coping, 22(3), 341–360. https://doi.org/10.1080/10615800802567023
Healy, K. (2017). Fuck nuance. Sociological Theory, 35(2), 118–127. https://doi.org/10.1177/0735275117709046
Hofmans, J. (2017). Modeling psychological contract violation using dual regime models: An event-based approach. Frontiers in Psychology, 8, Article 1948. https://doi.org/10.3389/fpsyg.2017.01948
Hofmans, J., Wille, B., & Schreurs, B. (2020). Person-centered methods in vocational research. Journal of Vocational Behavior, 118, Article 103398. https://doi.org/10.1016/j.jvb.2020.103398
Lawler, E. E. (1992). The ultimate advantage: Creating the high-involvement organization. Jossey-Bass.
Mäkikangas, A., & Kinnunen, U. (2016). The person-oriented approach to burnout: A systematic review. Burnout Research, 3(1), 11–23. https://doi.org/10.1016/j.burn.2015.12.002
Marsh, H. W., & Hau, K.-T. (1996). Assessing goodness of fit: Is parsimony always desirable? Journal of Experimental Education, 64(4), 364–390. https://doi.org/10.1080/00220973.1996.10806604
Marsh, H. W., & Hau, K.-T. (2007). Applications of latent-variable models in educational psychology: The need for methodological-substantive synergies. Contemporary Educational Psychology, 32(1), 151–171. https://doi.org/10.1016/j.cedpsych.2006.10.008
Marsh, H. W., Hau, K.-T., Wen, Z., Nagengast, B., & Morin, A. J. S. (2013). Moderation. In T. D. Little (Ed.), Oxford handbook of quantitative methods (Vol. 2, pp. 361–386). Oxford University Press.
Marsh, H. W., Kuyper, H., Morin, A. J. S., Parker, P. D., & Seaton, M. (2014). Big-fish-little-pond social comparison and local dominance effects: Integrating new statistical models, methodology, design, theory and substantive implications. Learning & Instruction, 33, 50–66. https://doi.org/10.1016/j.learninstruc.2014.04.002
Marsh, H. W., Scalas, L. F., & Nagengast, B. (2010a). Longitudinal tests of competing factor structures for the Rosenberg self-esteem scale. Psychological Assessment, 22(2), 366–381. https://doi.org/10.1037/a0019225
Marsh, H. W., Seaton, M., Kuyper, H., Dumas, F., Huguet, P., Regner, I., Buunk, A. P., Monteil, J. M., Blanton, H., & Gibbons, F. X. (2010b). Phantom behavioral assimilation effects: Systematic biases in social comparison choice studies. Journal of Personality, 78(2), 671–710. https://doi.org/10.1111/j.1467-6494.2010.00630.x
McCormick, B. W., Reeves, C. J., Downes, P. E., Li, N., & Ilies, R. (2020). Scientific contributions of within-person research in management. Journal of Management, 46(2), 321–350. https://doi.org/10.1177/0149206318788435
McLarnon, M. J. W., Carswell, J. J., & Schneider, T. J. (2015). A case of mistaken identity? Latent profiles in vocational interests. Journal of Career Assessment, 23(1), 166–185. https://doi.org/10.1177/1069072714523251
Meyer, J. P., & Morin, A. J. S. (2016). A person-centered approach to commitment research: Theory, research, and methodology. Journal of Organizational Behavior, 37(4), 584–612. https://doi.org/10.1002/job.2085
Morin, A. J. S., Blais, A.-R., & Chénard-Poirier, L.-A. (2021). Doubly latent multilevel procedures for organizational assessment and prediction. Journal of Business and Psychology. https://doi.org/10.1007/s10869-021-09736-5
Morin, A. J. S., Boudrias, J.-S., Marsh, H. W., McInerney, D. M., Dagenais-Desmarais, V., Madore, I., & Litalien, D. (2017). Complementary variable- and person-centered approaches to exploring the dimensionality of psychometric constructs: Application to psychological wellbeing at work. Journal of Business and Psychology, 32, 395–419. https://doi.org/10.1007/s10869-016-9448-7
Morin, A. J. S., Maïano, C., Marsh, H. W., Nagengast, B., & Janosz, M. (2013). School life and adolescents’ self-esteem trajectories. Child Development, 84(6), 1967–1988. https://doi.org/10.1111/cdev.12089
Morin, A. J. S., Myers, N. D., & Lee, S. (2020). Modern factor analytic techniques: Bifactor models, exploratory structural equation modeling (ESEM) and bifactor-ESEM. In G. Tenenbaum & R. C. Eklund (Eds.), Handbook of sport psychology (4th ed., pp. 1044–1073). Wiley. https://doi.org/10.1002/9781119568124.ch51
Murphy, K. R. (2021). In praise of Table 1: The importance of making better use of descriptive statistics. Industrial and Organizational Psychology: Perspectives on Science and Practice, 14(4), 461–477.
Navarro, J., Rueff-Lopes, R., & Rico, R. (2020). New nonlinear and dynamic avenues for the study of work and organizational psychology: An introduction to the special issue. European Journal of Work and Organizational Psychology, 29(4), 477–482. https://doi.org/10.1080/1359432X.2020.1794952
O’Neill, T. A., McLarnon, M. J. W., & Carswell, J. J. (2015). Variance components of job performance ratings. Human Performance, 28(1), 66–91. https://doi.org/10.1080/08959285.2014.974756
O’Neill, T. A., McLarnon, M. J. W., Hoffart, G. C., Woodley, H. J., & Allen, N. J. (2018). The structure and function of team conflict profiles. Journal of Management, 44(1), 811–836. https://doi.org/10.1177/0149206315581662
Perinelli, E., & Alessandri, G. (2020). A latent state-trait analysis of global self-esteem: A reconsideration of its state-like component in an organizational setting. International Journal of Selection and Assessment, 28(4), 465–483. https://doi.org/10.1111/ijsa.12308
Saylors, R., & Trafimow, D. (2021). Why the increasing use of complex causal models is a problem: On the danger sophisticated theoretical narratives pose to truth. Organizational Research Methods, 24(3), 616–629. https://doi.org/10.1177/1094428119893452
Schreurs, B., Hofmans, J., & Wille, B. (2021). Multilevel modeling for careers research. In W. Murphy & J. Tosti-Kharas (Eds.), Handbook for research methods in careers (pp. 210–234). Edward Elgar. https://doi.org/10.4337/9781788976725.00018
Sheldon, K., & Niemiec, C. (2006). It’s not just the amount that counts: Balanced need satisfaction also affects well-being. Journal of Personality and Social Psychology, 91(2), 331–341. https://doi.org/10.1037/0022-3514.91.2.331
Solinger, O. N., van Olffen, W., Roe, R. A., & Hofmans, J. (2013). On becoming (un)committed: A taxonomy and test of newcomer onboarding scenarios. Organization Science, 24(6), 1640–1661. https://doi.org/10.1287/orsc.1120.0818
Tay, L., Parrigon, S., Huang, Q., & LeBreton, J. M. (2016). Graphical descriptives: A way to improve data transparency and methodological rigor in psychology. Perspectives on Psychological Science, 11(5), 692–701. https://doi.org/10.1177/1745691616663875
Tóth-Király, I., Morin, A. J. S., Bőthe, B., Rigó, A., & Orosz, G. (2021). Toward an improved understanding of work motivation profiles. Applied Psychology: An International Review, 70(3), 986–1017. https://doi.org/10.1111/apps.12256
Tsoukas, H. (2017). Don’t simplify, complexify: From disjunctive to conjunctive theorizing in organization and management studies. Journal of Management Studies, 54(2), 132–153. https://doi.org/10.1111/joms.12219
Woo, S. E., Tay, L., & Proctor, R. W. (Eds.). (2020). Big data in psychological research. American Psychological Association. https://doi.org/10.1037/0000193-000
Zhao, X., Lynch, J. G., & Chen, Q. (2010). Reconsidering Baron and Kenny: Myths and truths about mediation analysis. Journal of Consumer Research, 37(2), 197–206. https://doi.org/10.1086/651257