Replication is key to the credibility of, and confidence in, research findings. As falsification checks of past evidence, replication efforts contribute in essential ways to the production of scientific knowledge. They allow us to assess which findings are robust, making science a self-correcting system, with major downstream effects on policy-making. Despite these benefits, reproducibility and replicability rates are surprisingly low, and direct replications are rarely published.
The Journal of Development Economics (JDE) is piloting a new approach in which authors have the opportunity to submit empirical research designs for review and approval before the results of the study are known. While the JDE is the first journal in economics to implement this approach—referred to as “pre-results review”—it joins over 100 other journals from across the sciences.
Academia has been abuzz in recent years with new initiatives focusing on research transparency, replication, and reproducibility of research. Notable in this regard are the Berkeley Initiative for Transparency in the Social Sciences and the Reproducibility Initiative, in which PLOS and Science Exchange are involved, but there are many others. Psychology and political science have seen a number of new initiatives that are shaking up the scientific research and publication process.
[THIS BLOG ORIGINALLY APPEARED ON THE BITSS WEBSITE] As advocates for open data, my colleagues and I often point to re-use of data for further research as a major benefit of data-sharing. In fact, there are many cases in which shared data has clearly been very useful for further research. Take the Sloan Digital Sky Survey (SDSS) data, which researchers have used in nearly 6,000 papers.
The Berkeley Initiative for Transparency in the Social Sciences (BITSS) was formed in late 2012 after a meeting in Berkeley that led to the publication of an article in Science on ways to increase transparency and improve reproducibility in research across the social sciences. BITSS is part of Berkeley’s Center for Effective Global Action (CEGA), and is led by development economist Edward Miguel and advised by a group of leaders in transparent research from economics, psychology, political science, and public health.
(THIS BLOG IS REPOSTED FROM THE BITSS WEBSITE) I became interested in methodological issues as a University of Michigan graduate student from 1967 to 1970, watching the economics faculty build an econometric macro model (the Michigan Model) in the basement of the building, and comparing that with how these same faculty members described what they were doing when they taught econometric theory on the top floor of the building.