Open access to both the scientific process and results should be the default, not the exception. In the first of this two-part episode, Brian Nosek and Tim Errington from the Center for Open Science talk about the important role of open science in accelerating scientific progress, as discussed in their open-access article “Reproducibility in Cancer Biology: Making sense of replications.” In part two of this episode, they share stories from a cancer replication conducted as part of the Center’s attempt to reproduce impactful published research.
Websites
- The Center for Open Science
- The Open Science Framework
- The Reproducibility Project: Cancer Biology
- Many analysts, one dataset: Making transparent how variations in analytical choices affect results (paper regarding red carding in soccer)
- Investigating Variation in Replicability: A “Many Labs” Replication Project
- Reproducibility Project partner — Science Exchange
Press coverage
- Undark's Go Forth and Replicate: On Creating Incentives for Repeat Studies
- The Atlantic's How Reliable Are Psychology Studies?
- Reason magazine’s Broken Science
- Science magazine’s Estimating the reproducibility of psychological science
- APS's Reproducibility Project Named Among Top Scientific Achievements of 2015
- What does research reproducibility mean? (by Steven N. Goodman, Daniele Fanelli and John P. A. Ioannidis)
Bonus Clips
Patrons of Parsing Science gain exclusive access to bonus clips from all our episodes and can also download mp3s of every individual episode.
Support us for as little as $1 per month at Patreon. Cancel anytime.
Patrons can access bonus content here.
We’re not a registered tax-exempt organization, so unfortunately gifts aren’t tax deductible.
Hosts / Producers
Ryan Watkins & Doug Leigh
How to Cite
Watkins, R., Leigh, D., Nosek, B., & Errington, T. (2017, October 17). Parsing Science – Open Science and Replications (Part 1 of 2). figshare. https://doi.org/10.6084/m9.figshare.5907961
Music
What’s The Angle? by Shane Ivers
Lastly, Doug and I asked Brian to talk with us about what kinds of work are done at the Center for Open Science.
Next, Tim elaborated further on how transparency aids in detecting and addressing discrepancies in research.
In their paper "Making sense of replications," Brian and Tim distinguish "methodological discrepancies" from "errors." Doug and I asked Brian to explain what this distinction means to him.
So-called "successful" replications are those that arrive at the same conclusions as the original study, though in reality, a replication is "successful" so long as it's carried out as planned. Next, Tim talked with us about what can be learned from replication studies regardless of whether or not they result in the same conclusions as the original study.
Doug and I asked Brian why the inferences drawn from replications might differ, even when the methods, results, and perhaps even the data are the same.
Next, Tim elaborated on Brian's example of replicating a study that was originally done in Germany.
Replications might differ from the original studies that inspired them. Brian talked with us about how - and why - such differences might arise, as well as what the results of replications can tell us.
There's a fair amount of confusion about what "reproducibility" and "replication" are. Ryan and I asked Brian to explain how the terms have come to be understood - and perhaps misunderstood.
If people have long been aware of shortcomings in how research is carried out, then why hasn't something been done about it? Brian explains.
Next, Tim talked with us about how he came to work on replication projects, and how he sees replication and transparency improving science.
Brian is perhaps best known for leading an attempt to replicate 100 papers published in top-tier psychology journals, an undertaking that Science magazine identified as a leading scientific breakthrough of 2015. We wondered how this interest in the replicability of scientific research came about.