When I started the Political Science Replication blog in early 2013, I mainly wanted to channel my frustration with authors who do not share their data into a blog. Twelve months later, the debate on replication and reproducibility in the social sciences has moved towards a strong push for openness and transparency. Here are the best stories of 2013, including #overlyhonestmethods, Reinhart-Rogoff, and the replication initiatives in Psychology and Cancer research.
Overly honest methods on Twitter
In early January, the Twitter hashtag #overlyhonestmethods revealed the current state of replication and reproducibility in the sciences. Thousands of researchers tweeted about obscure p-values, data manipulation, p-value fishing, and the lack of transparency. The ‘confessions’ and cynical remarks were hilarious, but a bit unsettling. Political and social scientists mainly tweeted about: citing papers they haven’t even read; pleasing reviewers with stupid requests; manipulating p-values and data; “please don’t replicate my study”; the lack of transparency in qualitative and mixed methods research; “my supervisor told me so”; and “I don’t know what I’m doing!” I collected the main tweets about replication and reproducibility.
Revolution in qualitative work
In February, this blog discussed qualitative research. Reproducibility is not yet an accepted standard in qualitative research. While some researchers try to be transparent and might provide interview transcripts on request, most qualitative work is seldom checked and validated. I talked to political scientists Ingo Rohlfing and Todd Landman about how qualitative researchers can be more transparent. The importance of this topic has been acknowledged in a recent issue of PS: Political Science & Politics (Vol 47/Issue 01/January 2014), which contains two articles about transparency and reproducibility in qualitative work.
Only 18 of 120 PolSci journals have a replication policy
In March, an important paper appeared in European Political Science: the authors found that out of 120 political science journals, only 18 have replication policies. The existence of a data policy was correlated with the impact factor, type of audience, and age of the journal. Their analysis confirmed that the state of reproducibility in political science was substandard.
Open science initiatives in Psychology and Cancer
The field of Psychology was at the forefront of the replication movement: a group of psychologists launched a project in March to make studies radically more transparent (with a pledge of $5.25 million from private supporters). The Center for Open Science is collaborating with an established journal, Perspectives on Psychological Science, to foster research by authors who work transparently and agree to have their studies replicated. Authors will be asked to first publish an experimental design and then, after a public vetting, collect data. The media and the academic world reported widely on the initiative.
In the fall of 2013, a similar initiative was launched as an extension of the Psychology initiative: the “Reproducibility Project: Cancer Biology”. The 50 most impactful cancer biology studies published between 2010 and 2012 are being replicated by the Science Exchange network.
Reinhart-Rogoff and replication in economics
In April, a failed replication of an austerity paper by Reinhart and Rogoff turned into one of the biggest replication scandals of 2013. Economists Carmen Reinhart and Kenneth Rogoff claimed in “Growth in a Time of Debt” that countries with debt-to-GDP ratios above 90 percent have a slightly negative average growth rate. Their paper was one of the most cited by governments to justify austerity measures – and it was wrong, as a graduate student showed in his replication of the work. The case was picked up by the mainstream media, e.g. by economist Paul Krugman in the New York Times. What we learned from the scandal: journals should require authors to provide their data set when submitting a manuscript, to prevent delays in cross-checking work.
Graduate students publishing replications
2013 showed that graduate students are pushing for more reproducibility by engaging in replications. For example, similar to the graduate student who replicated the Reinhart-Rogoff paper, two students from MIT published their replication work in the Journal of Conflict Resolution in August. Mark Bell and Nicholas Miller changed the common perception of nuclear weapons in their paper “Questioning the Effect of Nuclear Weapons on Conflict”. They wrote the paper as an assignment for Gary King’s class on Advanced Quantitative Research Methodology. The student who revealed the errors in the Reinhart-Rogoff austerity paper, Thomas Herndon, did his replication study as a class assignment at the University of Massachusetts Amherst. In the Cambridge Replication Workshop, my students are working on similar projects.
Backlash against the replication movement
Can the replication movement be harmful to research? Do we unfairly damage the reputations of scientists by declaring a finding irreproducible? In November 2013, a comment piece by Mina Bissell on Nature.com listed provocative arguments against replication. Bissell’s article started a discussion about the challenges of replication and reproducibility. For example, American statistician and political scientist Andrew Gelman says that “the push for replication is so strong that now there’s a backlash against it”.
What’s missing from this list?
Please comment below if I missed any major developments in your field.
More replication recaps
Find more stories in the regularly published “Best of Replication” link collections.