
Introducing a replication-first rule for Ph.D. projects

Published online by Cambridge University Press: 27 July 2018

Arnold R. Kochari
Affiliation: Institute for Logic, Language and Computation, University of Amsterdam, 1090 GE Amsterdam, The Netherlands. a.kochari@uva.nl http://akochari.com/
Markus Ostarek
Affiliation: Max Planck Institute for Psycholinguistics, 6500 AH Nijmegen, The Netherlands. markus.ostarek@mpi.nl http://www.mpi.nl/people/ostarek-markus

Abstract

Zwaan et al. mention that young researchers should conduct replications as a small part of their portfolio. We extend this proposal and suggest that conducting and reporting replications should become an integral part of Ph.D. projects and be taken into account in their assessment. We discuss how this would help not only scientific advancement, but also Ph.D. candidates' careers.

Type: Open Peer Commentary
Copyright © Cambridge University Press 2018

Commenting on the role that replications should play in a researcher's career, Zwaan et al. briefly suggest that early career researchers should conduct replications “with the goal of building on a finding or as only one small part of their portfolio” (sect. 5.5.1, para. 4). Extending this, we propose that conducting and reporting replications should become an integral part of Ph.D. projects and should be taken into account in their assessment. Specifically, we suggest adopting a replication-first rule, whereby Ph.D. candidates are expected to first conduct a replication when they are building on a previous finding, and only then collect data for their novel study.

One reason we consider it important to specifically address the role of replications for early career researchers is that they face enormous pressure to establish themselves in the scientific community and often fear that their careers could end before they really begin (Maher & Anfres 2016; “Many junior scientists” 2017). Currently, to secure a job in academia after obtaining a doctoral degree, one needs to build an impressive portfolio of publications (Lawrence 2003). In our observation of how research projects are carried out in practice, Ph.D. candidates often directly attempt innovative extensions of previous experimental work in the hope of answering a novel research question, because novelty strongly increases publishability (Nosek et al. 2012). When such extensions fail to produce the expected results, they tend to collect more data in several variations of their own experiments before turning to examine the replicability of the original effect. It may then turn out that they cannot reproduce the original finding, possibly because the original effect is, in fact, not robust. In such cases, replicating the original effect first would have prevented a substantial waste of time and resources on follow-up experiments. Moreover, the time saved by replicating first can be used to further examine the robustness of the original effect, for example, by conducting an additional high-powered replication. Such replications contribute to better estimates of effect sizes, which are currently often overestimated on account of publication bias, sampling error, or p-hacking (Fanelli 2011; Ferguson & Brannick 2012; Szucs & Ioannidis 2017a). As such, replications constitute an important scientific contribution and should be regarded as such by Ph.D. project advisors.
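To make the notion of a high-powered replication concrete, the following is a minimal sketch of how one might plan the sample size for a direct replication, assuming a two-group between-subjects design and using the power module of the statsmodels Python library. The effect size, the conservative discount applied to it, and the power target are illustrative assumptions on our part, not recommendations.

# Minimal sketch: planning the sample size of a high-powered
# direct replication. Assumes a two-group between-subjects design;
# all numbers below are illustrative placeholders.
from statsmodels.stats.power import TTestIndPower

# Published effect sizes are often inflated by publication bias and
# p-hacking, so one common heuristic is to plan for a smaller effect
# than the one originally reported (an assumption, not a rule).
reported_d = 0.60            # Cohen's d from the original paper (hypothetical)
planning_d = reported_d / 2  # conservative discount for planning

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=planning_d,  # effect size assumed for planning
    alpha=0.05,              # two-sided significance level
    power=0.90,              # desired probability of detecting the effect
)
print(f"Participants needed per group: {n_per_group:.0f}")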

The above arguments demonstrate the advantages of replicating first in the case of a failed replication. Successful replications, likewise, provide a great opportunity. Because the pressure to publish operates alongside publication bias, early career researchers currently need specifically positive findings in order to publish papers. As a result, in our experience, not knowing whether an experiment will yield positive results causes anxiety in Ph.D. candidates. Incorporating replications as the first step of any new research project can help alleviate this anxiety. If, after a successful replication of the original effect, an extension shows no effect or supports the null hypothesis, it should be easier to interpret the theoretical significance of this outcome. For example, suppose that one replicates a previously observed priming effect but does not obtain it when the primes are masked. In this case, one can directly compare the effect in the two conditions and make a convincing case about the role of prime visibility for the effect. These two experiments can likely be combined into a strong paper. Similarly, a successful replication plus an extension makes for a solid package that will convince Ph.D. candidates themselves and the fellow researchers who read their work. In this way, replicating first shifts the focus from the results to the underlying scientific process (how well the work is carried out). In combination with the registered reports format (Chambers 2013), we believe a replication-first rule would minimize the stress Ph.D. candidates experience in anticipation of negative results and would increase the quality of their work.
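As an illustration of the masked-priming example above, here is a minimal sketch of how the two experiments could be compared directly, assuming one priming score per participant (e.g., mean reaction-time difference between unrelated and related primes). The data, sample sizes, and variable names are hypothetical placeholders.

# Minimal sketch: directly comparing a priming effect across two
# experiments (visible vs. masked primes). Assumes one priming score
# per participant (RT difference in ms); data are simulated placeholders.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
visible_priming = rng.normal(loc=30, scale=40, size=40)  # replication of original effect
masked_priming = rng.normal(loc=0, scale=40, size=40)    # extension with masked primes

# Test the effect within each experiment against zero...
t_vis, p_vis = stats.ttest_1samp(visible_priming, 0)
t_msk, p_msk = stats.ttest_1samp(masked_priming, 0)
# ...and, crucially, test whether the two effects differ from each other.
t_diff, p_diff = stats.ttest_ind(visible_priming, masked_priming)

print(f"visible primes: t={t_vis:.2f}, p={p_vis:.3f}")
print(f"masked primes:  t={t_msk:.2f}, p={p_msk:.3f}")
print(f"difference between conditions: t={t_diff:.2f}, p={p_diff:.3f}")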

Finally, we hope that adopting the proposed replication-first rule would make it far more important for early career researchers to learn, and to demonstrate, the ability to conduct replications appropriately. Specifically, evaluating the outcome of replications often involves assessing the strength of accumulated evidence with state-of-the-art meta-analytic tools. We hope the demonstration of such skills will increasingly be taken into account in the quality assessment of theses and in hiring decisions. Widespread application of the replication-first rule would also put pressure on graduate schools to organize corresponding courses and seminars.
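As one example of the kind of meta-analytic skill we have in mind, the following is a minimal sketch of a fixed-effect (inverse-variance) pooling of an original effect and two replications. The effect sizes and standard errors are hypothetical placeholders, and a real analysis would also need to consider random-effects models and heterogeneity.

# Minimal sketch: fixed-effect (inverse-variance) meta-analysis of an
# original study and two replications. Effect sizes (Cohen's d) and
# standard errors are hypothetical placeholders.
import numpy as np

d = np.array([0.60, 0.25, 0.30])   # original study, replication 1, replication 2
se = np.array([0.25, 0.12, 0.10])  # corresponding standard errors

w = 1.0 / se**2                    # inverse-variance weights: precise studies count more
d_pooled = np.sum(w * d) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
ci_low, ci_high = d_pooled - 1.96 * se_pooled, d_pooled + 1.96 * se_pooled

print(f"pooled d = {d_pooled:.2f}, 95% CI [{ci_low:.2f}, {ci_high:.2f}]")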

Even though adopting the replication-first rule may be difficult in cases where data collection is costly relative to the budget or resources available for a Ph.D. project, this should not be seen as a sufficient reason to omit replications, as Zwaan et al. also point out. Because such studies often have smaller sample sizes and more room for arbitrary data-analysis choices, replicability is an even larger issue for them (see Poldrack et al. [2017] for a discussion of this issue for fMRI findings). Growing awareness of this state of affairs in the field will likely lead to greater appreciation of, and higher rewards for, replication in these cases. Ph.D. candidates are thus well advised to go the extra mile and replicate first. If two separate experiments are not feasible, incorporating a replication into the design of the novel study would be an option.

In sum, we believe that adopting the replication-first rule for Ph.D. projects would not only contribute to scientific progress in the way Zwaan et al. lay out, but would also benefit Ph.D. candidates themselves. We predict that it would result in a larger number of solid findings and publishable papers, and would incentivize Ph.D. candidates to master the meta-analytic statistical tools needed for assessing evidence in cumulative science. In this way, we believe conducting replications could be a great boost for early career researchers' careers rather than only a “service to the field.” That said, we of course do not suggest dismissing the value of creativity and original thinking in doctoral theses and their assessment. The replication-first rule is intended as a constant reminder that a balance between the two is needed to ensure solid science.

References

Chambers, C. D. (2013) Registered reports: A new publishing initiative at Cortex. Cortex 49(3):609–10. Available at: http://doi.org/10.1016/j.cortex.2012.12.016.
Fanelli, D. (2011) Negative results are disappearing from most disciplines and countries. Scientometrics 90(3):891–904. Available at: http://doi.org/10.1007/s11192-011-0494-7.
Ferguson, C. J. & Brannick, M. T. (2012) Publication bias in psychological science: Prevalence, methods for identifying and controlling, and implications for the use of meta-analyses. Psychological Methods 17(1):120–28. Available at: http://doi.org/10.1037/a0024445.
Lawrence, P. A. (2003) The politics of publication. Nature 422:259–61. Available at: http://doi.org/10.1038/422259a.
Maher, B. & Anfres, M. S. (2016) Young scientists under pressure: What the data show. Nature 538:444–45. Available at: http://doi.org/10.1038/538444a.
Many junior scientists need to take a hard look at their job prospects. Editorial. (2017) Nature 550(7677):429. Available at: http://doi.org/10.1038/550429a.
Nosek, B. A., Spies, J. R. & Motyl, M. (2012) Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science 7(6):615–31. Available at: http://doi.org/10.1177/1745691612459058.
Poldrack, R. A., Baker, C. I., Durnez, J., Gorgolewski, K. J., Matthews, P. M., Munafò, M. R., Nichols, T. E., Poline, J. B., Vul, E. & Yarkoni, T. (2017) Scanning the horizon: Towards transparent and reproducible neuroimaging research. Nature Reviews Neuroscience 18(2):115–26. Available at: http://doi.org/10.1038/nrn.2016.167.
Szucs, D. & Ioannidis, J. P. (2017a) Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature. PLoS Biology 15(3):e2000797. Available at: http://doi.org/10.1371/journal.pbio.2000797.