In June 2014, the NIH held a joint workshop with the Nature Publishing Group and Science on the issue of reproducibility and rigor of research findings. The workshop’s goal was to strengthen approaches to support biomedical research that is reproducible, robust, and transparent. An editorial appears in the November 5, 2014, online edition of Nature.
Workshop participants included journal editors representing more than 30 basic/preclinical science journals in which NIH-funded investigators most often publish. Attendees reached consensus on a set of principles and guidelines to facilitate the interpretation and repetition of experiments as they were conducted in published studies. The principles endorsed by the group cover five areas, each to be delineated in the journal’s Information for Authors section or another public location:
- Rigorous statistical analysis: Outline the journal’s policy for statistical analysis and have a method of checking the statistical accuracy of submissions
- Transparency in reporting: Provide a checklist of reporting standards (replicates, statistics, randomization, blinding, sample-size estimation, inclusion/exclusion criteria) and require authors to state where this information is located in the manuscript
- Data and material sharing: Stipulate that all datasets on which the conclusions of the paper rely must be made available upon request, where appropriate, during manuscript review and upon publication
- Consideration of refutations: Include the journal’s policy for considering refutations of the paper, subject to its usual standards of quality
- Best practices guidelines: Establish methods for dealing with image-based data and biological material (antibodies, cell lines, animals)
These guidelines do not eliminate the need for replication or independent verification of research results, but they should make such replication easier to perform. Journals endorsing the proposed principles and guidelines are listed here.