Concrete ideas for increasing your own reproducibility
In March, a draft of “Maximizing the Reproducibility of Your Research” [pdf] was published on the Open Science Framework. The authors provide concrete ideas for increasing the reproducibility of one’s own research across the research lifecycle: project planning, project implementation, data analysis, reporting, and programmatic research strategies. A great read for anyone aiming for data transparency.
Replication in International Relations
Also in March, the largest International Relations (IR) conference, ISA 2014, discussed replication and reproducibility in IR on two panels (co-convened by me and Nils Petter Gleditsch). It seems that the community agrees that data transparency is important, but still struggles to find ways to implement quality standards.
Replication by students – an untapped resource?
In February, the Monkey Cage Blog published survey results about university courses in political science that assign replication studies as homework. Many replication studies done by graduate students are never published and so never reach the community. Another blog post widens the approach to undergraduate students and discusses how R Markdown, a format for combining narrative text with executable R code, improves reproducibility in the classroom at Duke University.
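To illustrate the idea (the Duke course materials themselves are not shown here), a minimal R Markdown document interleaves prose with executable R code, so every number in a student's replication report can be regenerated from the data. The file name and variable names below are made-up placeholders:

````markdown
---
title: "Replication exercise"
output: html_document
---

The chunk below loads the original study's data set and re-runs
the published model; knitting this file regenerates the results.

```{r replication-model}
dat <- read.csv("original_study.csv")   # hypothetical data file
fit <- lm(outcome ~ treatment, data = dat)
summary(fit)
```
````

Because the code runs every time the document is knitted, the reported estimates cannot silently drift away from the analysis that produced them.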
Computer-generated fake papers
In the same month, it was revealed that 122 computer-generated fake papers had been published by established publishers such as Springer. Computer scientist Cyril Labbé of Joseph Fourier University in Grenoble, France, uncovered the papers, showing that conferences had accepted meaningless submissions, a worrying reflection of the state of peer review.
Where are my files?
In his post “A Tale from the Replication Crypt” on the Dart-Throwing Chimp blog, political scientist Jay Ulfelder told a story about replication that started with a confession: “I got an email this morning from a colleague asking for the replication files for a paper I published in 2005 (PDF). Sheepishly, I had to admit that I didn’t have them.”
Large institutions push for reproducibility
A new development was discussed on the blog Research Remix: for grant applications, the National Science Foundation now asks for research “products” rather than just published papers (starting January 2014), a category that also covers data sets and software. This could be a great step towards more reproducibility and data transparency. In a blog post, David Funder discussed how the NSF reacted to reproducibility concerns at a workshop. Similarly, the U.S. National Institutes of Health (NIH) discussed several efforts to improve the reproducibility of preclinical research, as reported in the Science article “NIH Takes Steps to Improve Reproducibility”.
Symposium on Replication
Early in 2014, the journal PS: Political Science & Politics (Vol 47, Issue 1) published a symposium to discuss the state of reproducibility in the field. The topics included assigning replications to students; journal publication of failed replications; and new ethics guidelines on data sharing and reproducibility developed by the American Political Science Association (APSA).
In Management Studies, replication is also a challenge, as a January 2014 article finds. The authors criticize the increasing competition among scholars for publication opportunities, which has caused systemic ethical issues regarding the accuracy, legitimacy, replicability, and transparency of management scholarship.
Journals and Research Quality
Several pieces at the end of 2013 discussed the role of journals in maintaining quality in science. The article “Deep impact: unintended consequences of journal rank” discussed the willingness of journals to issue retractions, and whether there is an association between statistical power and journal impact factor. The Guardian also published a piece on how academic journals aggressively curate their brands while discouraging other important work, such as replication studies. In response to these challenges, the new journal Research & Politics now invites replication studies. The publisher PLOS, in the meantime, had to clarify its data-sharing policy. In a bold step, it had announced that all data and digital materials collected for a paper had to be made publicly available. This stirred a debate over what exactly that meant, and to what extent authors can be compelled to share their data. PLOS then apologized for the confusion and revised the policy; Retraction Watch tells the whole story.
Back in December 2013, Mina Bissell from the Lawrence Berkeley National Laboratory wondered whether reproducibility requirements and replication studies can hinder promising research. Andrew Gelman (Columbia University) picked up the discussion and noted that we should not think of replication in a defensive way. The discussion is still ongoing, at least in political science and international relations. For example, at the ISA conference 2014 in Toronto, concerns were raised about letting students do replication studies (this might create a cohort of young scholars who try to hunt down rogue scientists instead of producing their own quality research).
More ‘best of’ collections
… are here.