Scholars in political science commonly agree that we need more researchers doing replications of published work. But how to motivate them? In the recent symposium on the topic in PS: Political Science & Politics (Vol 47, Issue 1), Thomas M. Carsey discusses concrete steps researchers and journals can take. This blog post is part of a series of posts about the PS symposium, which summarizes and discusses the current state of reproducibility in political science.
The American Political Science Association (APSA)
recently developed new guidelines for data access and research transparency. The association has also appointed the Data Access and Research Transparency (DA-RT) ad hoc committee
to continue the discussion. In his article “Making DA-RT a Reality”, Thomas M. Carsey discusses concrete actions that universities, scholars, and teachers can take to improve data access and research transparency in the field. While everyone agrees that transparency and reproducibility are important, many have “limited time and resources” to actually work transparently. Only a few conduct replication studies: we all seem to want “others” to do replications.
How to get more scholars to do replication studies
If replication is critical to the progress of knowledge, scholarly journals should be encouraged to publish replication studies, and departments and universities should give credit to researchers who produce such work. Doing this would encourage more scholars to engage in such activities.
For example, journals should develop a clear idea of what kinds of replication studies they will publish. More specifically: what makes a replication study publishable? How can we make sure that those who cross-check published work are rewarded with a journal publication? Carsey points out that a pure re-analysis based on the exact same data and methods might not be material for a leading journal.
However, scholars should be encouraged to confirm or challenge findings using similar data and similar methods as a means of assessing the robustness of a published result. Efforts that uncover findings that cannot be replicated also deserve publication.
In my opinion, and for my students in the replication workshop at Cambridge, that could mean the following:
- Re-analyze the paper using the same data and methods – if the replication fails, this is publishable; or (perhaps as a next step):
- Re-analyze the research question using similar data, newly collected data, or validation data sets, with similar or improved methods – whether the replication succeeds or fails, both outcomes could lead to a publication
Would journals publish only failed replications?
The two options above touch on the important question of whether replications can get published at all. There are some replication studies out there, and some even receive a response from the original author in the form of a ‘replication chain‘. But I have the impression that published replications seldom confirm all results.
Actually, I wonder whether journals would find it interesting enough to publish replication studies that confirm previous results, or whether they would rather publish ‘surprising’ discoveries, i.e. results refuting the original article. I have a feeling the latter might be the case. This would mean that those replicating published work have a strong incentive to hunt down and highlight errors in order to get published. To settle this, Carsey’s idea would have to be discussed further with journal editors and within the political science community.
Read the article by Carsey
Thomas M. Carsey (2014). Making DA-RT a Reality. PS: Political Science & Politics, 47, pp 72-77. doi:10.1017/S1049096513001753.
Abstract: Calls for greater data access and research transparency have emerged on many fronts within professional social science. For example, the American Political Science Association (APSA) recently adopted new guidelines for data access and research transparency. APSA has also appointed the Data Access and Research Transparency (DA-RT) ad hoc committee to continue exploring these issues. DA-RT sponsored this symposium. In addition, funding agencies like the National Institutes of Health (NIH) and the National Science Foundation (NSF) have expanded requirements for data management and data distribution. These pressures present challenges to researchers, but they also present opportunities.
Other articles from the PS Symposium
Arthur Lupia and Colin Elman write about new ethics guidelines for data access & research transparency.