Reproducibility is held up as the gold standard of scientific research. The legitimacy of any published work depends on the question: can we replicate the analysis and arrive at the same results? Authors therefore have to document exactly how they collected their data and conducted their analysis. Without such transparency about the research process, scholars cannot evaluate work, fully understand the value of its results, or build new knowledge on top of it, as King (1995) has pointed out. But why are so few scholars doing replication studies?
“Political science is a community enterprise and the community of empirical political scientists needs access to the body of data necessary to replicate existing studies to understand, evaluate, and especially build on this work. Unfortunately, the norms we have in place now do not encourage, or in some cases even permit, this aim.” (King, Gary. 1995. Replication, Replication. PS: Political Science and Politics 28: 444–452)
While most scholars in political science agree on the benefits of reproducibility and replication studies, there is still no consensus about how to implement these principles in practice. Two main problems are:
- not all researchers work transparently
- there are few professional rewards for those who conduct replication studies to cross-check previous work
Problem 1: Not all researchers work transparently
There have been several instances in the past where original authors could not, or would not, provide their data and analysis details. Even authors who were happy to share their data often did not archive it effectively enough for others to work with easily, as Lupia and Elman (2014) have noted. There are several reasons for this.
- One is that the political science field still lacks clear guidelines on how research information should be shared. A recent study of reproducibility and data transparency in political science found that only 18 of 120 journals have a replication policy asking authors to upload their datasets and other information.
- Being transparent means investing time in maintaining detailed logs of data collection, variable coding, and analysis. Given teaching commitments and the pressure to publish, there is often little incentive to record research steps and make data accessible.
- Authors may also fear embarrassment and reputational loss if a replication fails and errors are detected (Lupia and Elman 2014).
Problem 2: Few rewards for doing replications
Even if all research were transparent and data available online, there is often little incentive to conduct replications of existing research. A common criterion in the peer-review process of journals is the presentation of new, original research, which systematically marginalizes the re-analysis of already published journal articles (Carsey 2014). There is little motivation for scholars to engage in a replication study when the prospect of publication is low.
The solution? Teaching!
This blog (and my ISA paper) argues that both challenges, the lack of reproducibility and the scarcity of replication studies, can be tackled through a fundamental change in graduate student teaching. By establishing clear transparency standards in teaching, and by encouraging students to replicate existing work, the gold standard of scientific research can be implemented more effectively than in the past.
When students are urged to keep logs and share their own data early on, they can develop a transparency routine that lasts for the rest of their careers. And when they conduct replication studies as part of their methods training, they do not just understand methods better: through the sometimes frustrating process of trying to re-analyze published work from the provided data and methods description, they learn first-hand when provided files are truly reproducible and when they are not.
This would, ideally, encourage them to improve transparency in their own work. Only when universities nurture a culture of reproducibility and replication can we ensure that the gold standard of reliable, credible, and valid results is not just an empty phrase.
Discuss this at ISA in Toronto
ISA’s 55th Annual Convention
Spaces and Places – Geopolitics in an Era of Globalization
March 26th – 29th, 2014, Toronto, Canada
Friday, March 28, 2014
8:15 AM – 10:00 AM
FA10: Replication in International Relations: How Journal Data Policies and Replication in Teaching Can Improve Reproducibility Standards
Willow Centre, Sheraton Centre Toronto
Immediately followed by:
10:30 AM – 12:15 PM
FB10: Replication in International Relations: Successes and Failures in Practice