The perfectly replicable study

  1. Dipartimento di Psicologia Generale, Padova University


This essay describes the requirements for a perfectly replicable study, as well as current initiatives that promote open practices for data and materials availability.


We must first clarify the difference between the replicability of a study and the replicability of its results. To be replicable, a study must include all the information necessary for each step of its execution. This distinguishes it from a conceptual replication (see Schmidt, 2009; Stroebe and Strack, 2014), which aims to test the interpretation of the observed results and hence introduces variations in the procedure, materials, and so on.

To facilitate a direct or exact replication, all details regarding the procedure and materials, as well as the software used, should be made available unless they are copyright protected. In that case, the study must contain instructions for either reproducing or obtaining them.

How to facilitate replicability

The procedure that offers the best conditions for a replication is the "adversarial collaboration" (Kahneman, 2003), in that it allows direct cooperation between the original authors and the replicators in order to reproduce the study in the best possible way.

The role of editors and reviewers is particularly critical. The recent Peer Reviewers' Openness Initiative (Morey et al., 2016) is a big step forward. Those who take part in this initiative commit to "not offer comprehensive review for, nor recommend the publication of, any manuscript that does not meet the quality requirements for an open scientific manuscript". These requirements consist of making data, stimuli, and materials publicly available, together with documents detailing how to interpret any files or code and how to compile and run any software, or, where this is not possible, giving clear reasons why.

Another recent initiative is the Transparency and Openness Promotion (TOP) guidelines (Nosek et al., 2015), directed at the editors of scientific journals and intended to facilitate the adoption, at different levels of stringency, of guidelines that raise the standards of transparency, openness, and replicability of all published studies.

An example

An example of how editors can boost the use of these open practices is that set by Eric Eich, former Editor in Chief of Psychological Science, the flagship journal of the Association for Psychological Science. In 2014 he published an editorial entitled "Business Not as Usual" (Eich, 2014), wherein, together with other changes in the journal's publication standards and practices aimed at enhancing the reporting of research findings and methodology, he promoted the following open practices: Open Data, Open Materials, and Preregistration, certified by the Center for Open Science's (2013) badges, which authors can obtain before publishing their studies.

The journal's new publication standards and practices produced an increase in the use of Open Data from 4% (in 2013, prior to the implementation of the new standards) to 39% (in 2015, after the new standards took effect), and in the use of Open Materials from 7% to 31%. Over the same periods, in the Journal of Experimental Psychology: General, which did not promote similar practices, Open Data use changed from 1% to 3% and Open Materials use dropped from 11% to 5% (Tressoldi, Cumming and Giofré, under revision).

Concluding remarks

A true advance in fostering the replicability of studies seems possible, but only with the coordinated contribution of journal editors, reviewers, and authors.


Center for Open Science, Badges to acknowledge open practices, 2013. Retrieved from


Eric Eich, “Business not as usual”, Psychological Science, 25 (2014): 3-6.

Daniel Kahneman, “Experiences of collaborative research”, American Psychologist, 58 (2003): 723.


Richard D. Morey, et al., “The Peer Reviewers' Openness Initiative: incentivizing open research practices through peer review”, Royal Society Open Science, 3 (2016): 150547.

Brian A. Nosek, et al., “Promoting an open research culture: Author guidelines for journals could help to promote transparency, openness, and reproducibility”, Science, 348 (2015): 1422-1425.

Stefan Schmidt, “Shall we really do it again? The powerful concept of replication is neglected in the social sciences”, Review of General Psychology, 13 (2009): 90-100.


Wolfgang Stroebe and Fritz Strack, “The alleged crisis and the illusion of exact replication”, Perspectives on Psychological Science, 9 (2014): 59-71.

Patrizio Tressoldi, Geoff Cumming and David Giofré, “Changes in the reporting of statistics and the use of open practices in Psychological Science from 2013 to 2015” (under revision).



This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.