New APSA ethics guidelines
In January 2014, the American Political Science Association published new ethics guidelines. The guidelines for Data Access and Research Transparency (DA-RT) called for:
- Data access: Researchers making evidence-based knowledge claims should reference the data they used to make those claims.
- Production transparency: Researchers should provide access to data they themselves generated or collected.
- Analytic transparency: Researchers making evidence-based knowledge claims should provide a full account of how they drew their analytic conclusions from the data.
Computer-generated papers retracted
Early in 2014, over 100 computer-generated nonsense articles had to be retracted from conference proceedings. A French computer scientist had detected the articles, demonstrating that conferences would accept meaningless papers.
A special issue of Social Psychology published failed and successful replications instead of typical research papers. A Cambridge scholar, Simone Schnall, whose paper could not be replicated, felt treated unfairly by the “data detectives.” She said that the replicators had set out to “declare the verdict” that they failed to reproduce her results. This sparked a still-ongoing discussion about how to treat original authors whose studies fail to replicate (see here, here, and here, as well as Andrew Gelman’s blog and Rolf Zwaan’s blog).
Rules of Replication
Reacting to criticism of replication, researchers in 2014 increasingly discussed how to do replications ‘right’, for example in The Scientist, in this paper, at the World Bank blog on Development, and on the Political Science Replication blog.
After fields like psychology explored pre-registration of studies, political science followed the trend. Comparative Political Studies invited submissions for a special issue in which all papers must be pre-registered and submitted without a results section. I look forward to hearing about their experiences. Meanwhile, International Studies Quarterly, the leading IR journal, started tracking down replication data for their published studies (as does the Quarterly Journal of Political Science), and the journal Research and Politics called for replication studies to be submitted.
In 2014, I noticed an increasing trend of self-retraction when authors could not replicate their own findings, for example a case in neuroscience, this paper on protein experiments, and many more examples on the Retraction Watch blog.
New reproducibility initiative
In the field of political science, the Political Science Replication Initiative now provides a platform to upload replication studies by university students. All original authors are informed and have time to comment on the replication of their work.
News media and the replication trend
The Guardian reported how “psychology is leading reforms that will benefit all life sciences.” The New York Times discussed whether most published findings are incorrect. There was also a lot of talk about economist Piketty’s book, which was published alongside replication files.
Funders take reproducibility on board
A new funding policy by the US National Science Foundation reflected the importance of transparency and could be seen as a sea change in how researchers are evaluated. The US National Institutes of Health took steps to enhance reproducibility as well (see also here). Even the White House showed some interest.
What’s missing from this list?
Please comment below if I missed any major developments in your field.
Read the best reproducibility stories of 2013.