Efforts to repeat key cancer biology experiments reveal challenges and opportunities to improve reproducibility

CHARLOTTESVILLE, Va., December 7, 2021 /PRNewswire/ – A large-scale, systematic effort to replicate high-impact preclinical cancer biology experiments identified barriers to conducting replications and found weaker evidence for the findings compared with the original experiments. Unnecessary friction in the research process can slow the advancement of knowledge, solutions and treatments.
Today, eLife publishes the final results of the Reproducibility Project: Cancer Biology, an eight-year effort to replicate experiments from 53 high-impact papers published between 2010 and 2012. Tim Errington, Director of Research at the Center for Open Science and project lead, said: “The aim of the project was to transparently assess the extent to which there are challenges in performing replications and obtaining similar evidence to the published findings in cancer biology research.”

Launched in 2013, the Reproducibility Project: Cancer Biology was a collaboration between the Center for Open Science, a nonprofit culture-change organization with a mission to improve the openness, integrity and reproducibility of research, and Science Exchange, the world’s leading online R&D marketplace whose mission is to accelerate scientific discovery. With support from Arnold Ventures (formerly the Laura and John Arnold Foundation), the team conducted a systematic process to select high-impact cancer research papers published between 2010 and 2012. Based on the selection criteria, most of the papers came from high-profile journals such as Nature, Science and Cell. A total of 193 experiments were selected for replication.

The team designed replications of the main results of each paper by reviewing the methodology and requesting information on protocols and reagent availability. The appropriate expertise to conduct the experiments was then sourced through the Science Exchange marketplace.

For each paper, the detailed protocols for the replication experiments were written up as a Registered Report and submitted to eLife for peer review; work on the replication experiments could not begin until the Registered Report had been accepted for publication. The completed replication experiments were then written up as a Replication Study, peer reviewed, and published in eLife. Two of the papers published today are summaries of the entire project.

The first paper, “Challenges for assessing replicability in preclinical cancer biology”, reports the challenges encountered when preparing and conducting replications of 193 experiments from 53 papers. None of the experiments were described in sufficient detail to design a replication without seeking clarification from the original authors. Some authors (26%) were extremely helpful and generous with their feedback, while others (32%) were either not helpful at all or did not respond to requests. While the experiments were being conducted, about two-thirds required some modification of the protocols because, for example, the model systems behaved differently than originally reported. In the end, 50 replication experiments from 23 papers were completed, a small fraction of what was planned. Errington noted: “We encountered challenges at every step of the research process in designing and conducting the replications. It was hard to understand what had been done originally, we could not always access the original data or reagents to run the experiments, and the model systems often did not behave as originally reported. Limited transparency and incomplete reporting made efforts to replicate the findings much harder than they needed to be.”

The second paper, “Investigating the replicability of preclinical cancer biology”, reports a meta-analysis of the results of the 50 replication experiments that were completed. Many of these experiments involved measuring more than one effect (for example, measuring the influence of an intervention on both tumor burden and overall survival), and the 50 experiments included a total of 158 effects. Most of these effects (136) were reported as positive effects in the original papers, with 22 reported as null effects. The meta-analysis also had to account for the fact that 41 of the effects were reported as images rather than numerical values in the original papers. The replications provided much weaker evidence for the findings compared with the original experiments. For example, for the original positive results, the replication effect sizes were on average 85% smaller than the original effect sizes.
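
To make the headline figure concrete: an average reduction of 85% means a replication effect retains only about 15% of the original effect’s magnitude. A minimal sketch of this arithmetic, in which the original effect size is a hypothetical value rather than a number from the papers:

```python
# Hypothetical original standardized effect size (e.g., a Cohen's d of 1.0);
# this value is illustrative only, not taken from the meta-analysis.
original_effect_size = 1.0

# The meta-analysis found replication effects were, on average,
# 85% smaller than the originals for positive results.
average_reduction = 0.85

expected_replication_effect = original_effect_size * (1 - average_reduction)
print(expected_replication_effect)  # 0.15
```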

The team also used a number of binary criteria to assess whether or not a replication was successful. A total of 112 effects could be evaluated against five of these criteria: 18% passed all five, 15% passed four, 13% passed three, 21% passed two, 13% passed one, and 20% failed all five. Collectively, 46% of replications passed on more criteria than they failed, and 54% failed on more criteria than they passed.
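
The split between mostly-successful and mostly-unsuccessful replications follows directly from these percentages; a minimal sketch of the tally, using the shares reported above for the 112 evaluable effects:

```python
# Reported share of the 112 evaluable effects, keyed by how many
# of the five binary criteria the replication passed.
share_by_criteria_passed = {5: 0.18, 4: 0.15, 3: 0.13, 2: 0.21, 1: 0.13, 0: 0.20}

# A replication passes more criteria than it fails when it meets
# at least three of the five.
mostly_passed = sum(s for n, s in share_by_criteria_passed.items() if n >= 3)
mostly_failed = sum(s for n, s in share_by_criteria_passed.items() if n <= 2)

print(f"passed more than failed: {mostly_passed:.0%}")  # 46%
print(f"failed more than passed: {mostly_failed:.0%}")  # 54%
```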

Summarizing, Errington noted: “Among the replication experiments we were able to complete, the evidence was on average much weaker than the original findings, even though all of the replications were peer reviewed before the experiments were conducted to maximize their quality and rigor. Our results suggest there is room to improve reproducibility in preclinical cancer research.”

Brian Nosek, Executive Director of the Center for Open Science and a co-author, added: “Science is making substantial progress on the challenges of global health. The evidence from this project suggests we could be doing even better. There is unnecessary friction in the research process that interferes with the advancement of knowledge, solutions and treatments. Investing in improving the transparency, sharing and rigor of preclinical research could yield a huge return on investment by removing sources of friction and accelerating science. For example, openly sharing data, materials and code will make it easier to understand, critique and build on each other’s work. And preregistering experiments and analysis plans will reduce the negative effects of publication bias and distinguish between planned tests and unanticipated discoveries.”

These papers identify important challenges for cancer research, but they arrive amid an ongoing reform movement in science to address dysfunctional incentives, improve research culture, increase transparency and sharing, and improve rigor in the design and conduct of research. Science is at its best when it confronts itself and identifies ways to improve the quality and credibility of research findings. The Reproducibility Project: Cancer Biology is one contribution to an ongoing self-examination of research practices and opportunities for improvement.

Supporting information
Additional information on the project is available via this OSF link. It includes a fact sheet, background information, a list of independent researchers who have agreed to be listed as possible contacts for interviews, and a guide with links for navigating the papers, Registered Reports, Replication Studies and supplementary information from the RP:CB.

The previously published Registered Reports, Replication Studies and associated commentaries are all available on the eLife Reproducibility Project: Cancer Biology collection page, and all data, code and materials are available in the OSF Reproducibility Project: Cancer Biology collection. Summary project information and links to key resources are also available at http://cos.io/rpcb.

About the Center for Open Science
Founded in 2013, COS is a nonprofit technology and culture-change organization with a mission to increase the openness, integrity and reproducibility of scientific research. COS pursues this mission by building communities around open science practices, supporting metascience research, and developing and maintaining free, open-source software tools, including the Open Science Framework (OSF). For more information, visit cos.io.

About Science Exchange
Founded in 2011 with the goal of accelerating scientific discovery, Science Exchange is an online marketplace that enables scientific outsourcing for the R&D industry, providing companies with instant access to scientific services from a trusted network of contract research organizations. The Science Exchange R&D marketplace simplifies scientific outsourcing and eliminates procurement delays so scientists can access innovation without the administrative burden. Since 2011, Science Exchange has raised more than $70 million from Norwest Venture Partners, Maverick Ventures, Union Square Ventures, Collaborative Fund, Windham Ventures, OATV, YC Continuity Fund and others. For more information, visit scienceexchange.com.

About eLife
eLife is a nonprofit organization created by funders and led by researchers. Our mission is to accelerate discovery by operating a platform for research communication that encourages and recognizes the most responsible behaviors. We review selected preprints in all areas of biology and medicine, while exploring new ways to improve how research is assessed and published. eLife receives financial support and strategic guidance from the Howard Hughes Medical Institute, the Knut and Alice Wallenberg Foundation, the Max Planck Society and Wellcome. Learn more at elifesciences.org/about.

About Arnold Ventures
Arnold Ventures is a philanthropy dedicated to tackling some of the most pressing problems in the United States. We invest in sustained change, built from the ground up on a foundation of research, deep thinking and a strong evidence base. We drive public conversation, craft policy and inspire action through education and advocacy. For more information, visit arnoldventures.org.

SOURCE Center for Open Science
