Best of replication & data sharing Collection 3

These are the best pieces I came across in recent months on replication, reproducibility & data sharing. While not all of them are strictly about political science, they are inspiring and show how other fields deal with the lack of reproducibility. Collection #3 (August/September 2013).

Lack of specificity

On the reproducibility of science: unique identification of research resources in the biomedical literature. A research article on PeerJ, which states: “Scientific reproducibility has been at the forefront of many news stories and there exist numerous initiatives to help address this problem. We posit that a contributor is simply a lack of specificity that is required to enable adequate research reproducibility. In particular, the inability to uniquely identify research resources, such as antibodies and model organisms, makes it difficult or impossible to reproduce experiments even where the science is otherwise sound.”

Toxic research environment

Science is in a reproducibility crisis: How do we resolve it? (Sept 20, 2013) by Fiona Fidler and Ascelin Gordon on The Conversation. The authors state that the ‘publish or perish’ research environment, funding bodies and academic journals (which value novelty over replication) “deserve blame” for the reproducibility crisis in psychology.

Record all steps instead of p-value fishing

Let’s Go Fishing (Sept 25, 2013) by Tom Pepinsky. I like the last paragraph: Pepinsky suggests that disclosing all code, and making not only the results but the complete process reproducible, will prevent p-value fishing. This would mean a researcher discloses his or her complete code log – from downloading the data, transforming the variables, preliminary analyses, playing around with models, fishing for p-values (oops!), to the final model, tables and figures. This is a great idea, which calls for reproducibility of decision making and of the research process, not just of the final results. “Our publication practices should not be structured in ways that hide the scientific process, and I’d be interested to hear of ways to accomplish this. One is to embrace the strongest version of the Replication Standard, in which every keystroke and computer command is recorded from the first conceptualization of a research project until publication.” Hands up – who’s in?
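To make the idea concrete, here is a minimal, hypothetical sketch (not from Pepinsky’s post) of what such a fully scripted and logged workflow could look like in Python: every step from raw data to the reported table lives in one script, including the exploratory models that never made it into the paper. All file names, variables and model specifications below are invented for illustration.

    # Hypothetical "complete code log": one script that records every step,
    # including the exploratory models that were tried and discarded.
    import logging

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    logging.basicConfig(filename="analysis.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")

    logging.info("Step 1: load the raw data exactly as downloaded")
    df = pd.read_csv("raw_survey_data.csv")  # hypothetical file

    logging.info("Step 2: variable transformations")
    df["log_income"] = np.log(df["income"].clip(lower=1))

    logging.info("Step 3: exploratory models (kept on the record, not hidden)")
    for spec in ["turnout ~ log_income",
                 "turnout ~ log_income + age",
                 "turnout ~ log_income * age"]:
        fit = smf.ols(spec, data=df).fit()
        logging.info("tried %s, p-values: %s", spec, fit.pvalues.round(3).to_dict())

    logging.info("Step 4: final model and table as reported in the paper")
    final = smf.ols("turnout ~ log_income + age", data=df).fit()
    with open("table1.txt", "w") as out:
        out.write(final.summary().as_text())

Running such a script regenerates both the log (analysis.log) and the reported table (table1.txt), so a reader can retrace every decision – including the fishing – rather than just the final numbers.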

Should U.S. federal policy regulate open data and code disclosure?

Changes in the Research Process Must Come From the Scientific Community, not Federal Regulation (Sept 24, 2013) by Victoria Stodden. When President Obama issued an executive order requiring government data to be disclosed publicly, it sounded like a great federal initiative for more open data and code disclosure. Stodden is sceptical: “At first blush, the federal policy changes appear to support these scientific goals, but the scope of government action is limited in ways that impair its ability to respond directly to these concerns. The scientific community cannot rely on federal policy to bring about changes that enable reproducible computational research.” [Interestingly, Victoria Stodden wrote this piece as an invited policy article for a major journal, but publication was declined. She has now published it on her blog so that the text does not become dated.]

Tight schedules and budgets

Why Most Scientific Papers Contain Irreproducible Results and What We Can Do About It (Sept 23, 2013) by Alisa Moskaleva on myscizzle.com. Main point: “Why are so few results reproducible? Setting aside deliberate scientific misconduct, perhaps the ultimate cause is pressure on researchers from funding agencies to produce many, preferably high-impact papers on tight schedules and budgets.”

Prerequisite for publication: replicability

Nanochemistry Reproducibility (Sept 19, 2013) by Geoffrey Ozin on materialsviews.com. Explains what reproducibility means in the context of nanomaterials. Core statements: “The crux (…) is a lack of standards and procedures for quantifying reproducibility of known and new nanomaterials.” and “One could resolve this problem by requiring evidence of the degree of reproducibility to be a prerequisite for publication of papers containing a nanomaterials synthesis.”

Incomplete descriptions of methods

Better living through transparency: Improving the reproducibility of fMRI results through comprehensive methods reporting (Cognitive, Affective, & Behavioral Neuroscience, 2013, Vol. 13, Issue 3) by Joshua Carp: “While false positive results in neuroimaging studies stem from a variety of causes, incomplete methodological reporting is perhaps the most obvious: Most published reports of neuroimaging studies provide ambiguous or incomplete descriptions of their methods and results.”

Institutions are responsible

Data reproducibility: now is the time for institutions to take greater responsibility (Aug 19, 2013) by Glenn Begley. He says: “[T]he principal responsibility for data quality and integrity rests with the investigator and their host institution. The investigator alone knows whether experiments were performed and interpreted appropriately, or whether they represent a post-hoc construction woven together to ‘tell the best story’. The pressures on individual researchers to achieve fame and grant success are great, and this is an important driver in generation of poor quality, irreproducible research.”


More ‘best of’ collections

… are here.
