Can Cancer Research Findings Be Replicated? It’s Complicated

The Center for Open Science, headed by University of Virginia psychology professor Brian Nosek, has published the first results of its Reproducibility Project: Cancer Biology.

Published in the journal eLife, the work aims to assess the reproducibility of high-impact cancer biology studies in an open and transparent fashion, part of Nosek’s larger effort to replicate research findings across many academic fields.

The journal published assessments of five pre-clinical studies in the project. More than 20 others will be released in the coming months.

Tim Errington, who leads the center’s cancer biology project, said two of the studies leave open the question of whether they can be successfully replicated. The three remaining replications, he said, revealed several stark differences from the original work. All of the studies were mouse experiments, ranging from testing a novel cancer inhibitor to probing the mutations that drive cancer progression.

Errington said the mixed results highlight how difficult it is to reproduce studies and how important it is, for the good of science, to make all phases of research public.

“The reason we think this is important is it’s not just the results. It’s the process,” he said, speaking at the center’s offices in downtown Charlottesville. “If a paper publishes a result and publishes a methodology to get to that result, one would hope that the next person who uses that information doesn’t have to start from ground zero again.”

Why Cancer Biology?

In 2015, Nosek’s group published findings that two-thirds of a collection of psychology studies could not be replicated, news that was widely reported and, in some quarters of the scientific community, disputed. His center turned to cancer biology after two prominent papers, published in 2011 and 2012 by the drug companies Bayer and Amgen, found the reproducibility of cancer biology results and basic pre-clinical work to be very low.

“They were summarizing results from their internal laboratory work of attempts to replicate landmark findings in cancer biology that they would then extend into trying to get to clinical trials and pharmaceuticals,” he said.

But Nosek said Amgen and Bayer found that the results of approximately 70 percent of the studies could not be replicated. And the papers, published in Nature journals, left the basic research community wanting.

“The problem with those results is that they didn’t share any information, anything that they did,” he said. “It could have been non-disclosure agreements. There could have been lots of good reasons, but for whatever reason, it wasn’t shared.

“It sort of threw down a gauntlet of ‘reproducibility is a big problem.’ We need to address it,” he said. “The entire goal of this project is to say, ‘We are going to investigate reproducibility, see if we run into problems, where those problems are and try to identify what are the potential causes of irreproducibility? What gets in the way of being able to confirm a result?’”

How Did They Do It?

Nosek’s group began by identifying “high-impact” papers in cancer biology published between 2010 and 2012. “That selection process used different criteria,” he said. “How many citations did the article receive? How many readers do they get on some popular social media citation networks?”

His team ultimately identified 50 papers, 16 or 17 from each of the three years under consideration, and winnowed those to about 29 papers after taking into account the time and money the replications would require.

“Basically, it ends up being papers published in Nature, Science and Cell,” Nosek said.

The team then enlisted a community of post-docs, grad students and early-career faculty to read the papers. “There’s no way a small team could do that efficiently,” Errington said, “so we outsourced that and we got some grad students and post-docs at UVA to help us out.”

With their feedback in hand, the center then reached out to the authors of the studies and shared any questions they had. “So as you can imagine, now we get the back-and-forth,” Errington said. “Sometimes they don’t engage, sometimes they do engage. And sometimes they share the materials and sometimes they don’t – sometimes they don’t have the materials,” because years have gone by since the original work was done.

If that is the case, the center estimates the experimental details from the published work “just like any other reader would,” Errington said, and lets the authors know how it plans to proceed.

With protocols in hand, Errington’s team turned to a group called Science Exchange, which identified labs equipped to do the experiments. “We basically give the labs the protocol and ask, ‘If you need to change anything, tell us what you plan to do.’”

This is where the journal eLife enters the picture, peer-reviewing the protocols agreed upon by the Center for Open Science and the labs identified by Science Exchange. Once eLife gives the green light, work begins at the labs.

“The whole point of this is to front-load all the work, to figure out what are we doing and how are we doing it,” Errington said. “The truth is that, for the most part, this stuff is just occurring on the fly when anybody is doing research. The point of following this approach is we want to make that process independent from the results. We want to remove that bias.”

The project is being organized on the Center for Open Science’s Open Science Framework, a free online service where all the experimental protocols, materials, data, analyses, and results will be made available to the public.

What is next for the center? Tropical ecology. “Emilio Bruna is a tropical ecologist at the University of Florida and the editor-in-chief of Biotropica, which is one of the top journals in that field,” Errington said. “He essentially wants to do the same thing in that field.”

