A great way to promote reproducibility in political science is to let graduate students replicate published work. This way, they learn statistics ‘hands-on’, but also experience how frustrating it is when authors don’t keep a log of their files and code. In the best-case scenario, this encourages a new cohort of political scientists to be transparent in their future work. In the recent symposium on reproducibility in PS: Political Science & Politics (Vol 47, Issue 1), Thomas M. Carsey discusses his experiences of assigning replications to students – a thought-provoking and excellent read! [This blog post is part of a series of posts about the PS symposium, which summarizes and discusses the current state of reproducibility in political science.]
Replications in graduate student teaching
Thomas M. Carsey wrote an excellent article, “Making DA-RT a Reality”, in the recent symposium on replication and reproducibility in PS: Political Science & Politics. I already summarized some of his ideas for a change in journal policies in an earlier post.
Another important point raised by Carsey is how replications and reproducibility can be implemented through graduate student teaching. This includes:
- encouraging students to value reproducibility
- encouraging students to work transparently in their own seminar papers and assignments
- encouraging students to engage in replications to learn methods
Referring to the “growing trend within quantitative methods” teaching of letting students replicate published work, Carsey reports from his own experience:
I have given such assignments regularly for the last decade. Unfortunately, one of the lessons students generally learn from this is how poorly existing research is documented and how difficult it is to replicate published results. Data is not made available, different versions of the data exist but are not clearly documented, decisions used to transform or recode variables are not well documented, and code used to conduct the actual analysis is not provided. Most of the original authors try to be helpful, but occasionally they are entirely nonresponsive. My students often come away from this assignment frustrated, shocked, and rather disappointed by what they see as practices that undermine the credibility of the research they read.
Students in my replication workshop had similar experiences. Even though the experience can be frustrating, Carsey is right to say: “I also hope these experiences instill a resolve in them to make sure that their own work meets higher standards for data access and research transparency.”
He suggests that replications can be embedded in teaching through collaborations between departments and organizations with interdisciplinary training centres. “Some of the training necessary might be delivered online or through workshops rather than in traditional classes and seminars.”
I think it is extremely important to find the best ways to integrate replication into the classroom.
- Should we assign replications in methods courses, or establish more stand-alone replication workshops?
- How do we motivate students to keep going when the task is frustrating? Can we really ‘promise’ students that their replication study will be publishable?
- How can a replication workshop work smoothly?
- How can teachers assigning replications exchange their experiences about challenges and pitfalls to improve replication in the classroom? Is there a network, email list, or repository for this specific purpose – or do we need one?
- What software should students use to achieve maximum reproducibility? (I’d say R…)
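To illustrate what ‘maximum reproducibility’ might look like in practice, here is a minimal sketch of a well-documented R replication script. The file and variable names are hypothetical; the point is that every step – seed, data source, recoding decision, model, and software versions – lives in the script itself rather than in an undocumented workspace:

```r
# Minimal sketch of a reproducible R script (hypothetical file/variable names).

# Fix the random seed so any stochastic results can be re-run exactly.
set.seed(42)

# Load data from a documented source file, not an ad-hoc saved workspace.
dat <- read.csv("replication_data.csv")

# Keep every transformation in code, with a comment explaining the decision.
dat$gdp_log <- log(dat$gdp)  # log-transform GDP, as described in the article

# Fit and report the model in code, not by hand.
model <- lm(vote_share ~ gdp_log, data = dat)
summary(model)

# Record the exact R and package versions used for the analysis.
sessionInfo()
```

A script like this, shared together with the data file, is what students rarely find when they attempt a replication – and what we should teach them to produce.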
I would love to see comments below or on Twitter (@PolSciReplicate) on these issues.
I am also raising some of these questions in a paper in preparation for the ISA 2014 conference in Toronto, for which Nils Petter Gleditsch and I organized two panels on reproducibility (see full program as pdf, search for “replication” or “Nicole Janz”). Gary King also gives good ideas in his 1995 article on facilitating replication.
Read the article by Carsey
Thomas M. Carsey (2014). Making DA-RT a Reality. PS: Political Science & Politics, 47, pp 72-77. doi:10.1017/S1049096513001753.
Abstract: Calls for greater data access and research transparency have emerged on many fronts within professional social science. For example, the American Political Science Association (APSA) recently adopted new guidelines for data access and research transparency. APSA has also appointed the Data Access and Research Transparency (DA-RT) ad hoc committee to continue exploring these issues. DA-RT sponsored this symposium. In addition, funding agencies like the National Institutes for Health (NIH) and the National Science Foundation (NSF) have expanded requirements for data management and data distribution. These pressures present challenges to researchers, but they also present opportunities.
Other articles from the PS Symposium
Arthur Lupia and Colin Elman write about new ethics guidelines for data access & research transparency.