The reproducibility crisis in neuroscience

By DUY PHAN | March 2, 2017

A major characteristic of good science is that it is reproducible. This means that the same data can be obtained over and over again by independent efforts. Data that cannot be replicated are thought to be the result of careless science, or more insidiously, fraud and fabrication.

Most of the time, there is not much of a push to validate data. In fact, the majority of scientific work escapes the test of replication, since scientific journals are focused on novelty. Studies that simply try to reproduce existing results without adding new insight are difficult to publish.

Several recent high-profile cases have brought the issue of reproducible science to attention. One famous example is the Stimulus-Triggered Acquisition of Pluripotency (STAP) stem cell research, in which a Japanese scientist claimed to be able to turn any mature cell in the body back into a stem cell simply by dipping it in acid. The STAP stem cell took the scientific world by storm, bringing promise of new paradigms for stem cell research and perhaps even stem cell-based treatments for diseases.

However, the STAP findings were ultimately deemed fake when multiple attempts to reproduce the seemingly simple method failed. Formal investigations into the research concluded that much of the data had been fabricated by Haruko Obokata, the principal scientist involved.

Beyond the highly publicized case of the STAP stem cell, other evidence suggests that irreproducibility might actually be an epidemic across all of science. Amgen, a biotechnology company, claimed that it failed to replicate the results of 47 out of 53 “landmark” cancer studies.

Another project, initiated by a former geneticist, set out to validate 29 famous cancer studies published in prestigious journals such as Nature, Science and Cell. Of the five replication attempts published so far, only two were considered to have substantially reproduced the original results.

When we are confronted with such failed replication attempts, it is crucial to ask why some of these studies can’t be replicated.

Although an obvious and cynical answer to this question is fraud and fabrication of data, we need to remember that science is extremely difficult and complex. Many of the landmark studies required extensive and technically challenging experiments that can be very difficult for others to replicate.

Observing a significant effect might therefore require a set of experienced hands rather than a newcomer who has never attempted the experiment before. Some failures to reproduce an effect may simply reflect the inexperience of the replicating researchers with these challenging experiments.

Therefore, one way to address the reproducibility crisis is to advocate for the simplest experiments that can still provide evidence for a hypothesis. Although the surge of cutting-edge neuroscience tools has made it possible to address questions that were previously untestable for lack of technology, an over-reliance on overly complicated methods and techniques can make data very difficult to reproduce.

To invoke the Brainwave’s Golden Rule for science: the truth is simple, and therefore complicated experiments are not always necessary.

Another crucial point about claims of failed replication is that the examples mentioned above each represent only a single attempt at replicating the data. Science can be random, and multiple replications are required to wash out the inherent stochasticity. Before we point fingers and place blame on people, we need to see several independent attempts to replicate the data.

Some journals are trying to encourage such repeated replication attempts. For example, eNeuro now publishes articles that report failures to replicate previously published data. By providing a publication pipeline for such replication studies, journals can encourage multiple labs to rigorously challenge scientific results.

