Written Evidence Submitted by the Academy of Medical Sciences
(RRE0055)
Summary
- The Academy is concerned about the extent of irreproducibility in science.[1],[2] A certain degree of irreproducibility is to be expected in scientific research, especially in the field of biomedical science. However, it is the pervasiveness of irreproducibility that is concerning to the scientific community. This reproducibility crisis poses several major risks, including:
- Hindering scientific progress.
- Delaying the translation of useful and beneficial findings into clinical and other practical applications.
- Wasting time and resources on unreliable scientific findings.
- Impacting public trust in science.
- There are a range of previous and ongoing activities, initiatives and bodies concerned with addressing irreproducibility in science; however, more needs to be done and progress should be accelerated.
- There are many contributors to irreproducibility in scientific research. These span the entire hierarchy of academia – from the development of new methodologies and individual researchers using the wrong reagents or misusing statistics, up to publishers focussing on novelty rather than robustness and funders prioritising research output over other metrics.
- Communicating the strengths and weaknesses of the scientific process (including the nuances of the issues around reproducibility) effectively and transparently to the public represents a way to maintain trust in science.
- There is no single ‘quick fix’ for irreproducibility in biomedical research. The numerous causes of irreproducibility mean that we require multifaceted solutions across many dimensions, including:
- Improving statistical literacy in the biomedical science community, with funders and publishers requiring precise and appropriate use of statistics in methodological design and in publications.
- Enhancing the adoption and standard of quality control mechanisms utilised by researchers to validate reagents and experimental models.
- Cultivating a healthier research environment in which scientists are not pressured and funnelled into exploring only novel research findings. Publishers have a responsibility to provide outlets for negative or null results, or replication efforts. When researchers submit grant applications, funders should take into consideration metrics of productivity other than publication record, such as teaching quality or patents submitted.
- Promoting the use of protocol pre-registration, and other initiatives designed to improve research reliability and reproducibility.
- The burden of responsibility is shared across science. Many actors (from individual researchers to funding bodies and publishers) share responsibility for improving integrity and reproducibility across scientific research. Any implemented measures should be developed in direct consultation with the research community. All these entities must work together to provide solutions and bring about change.
Introduction
- The Academy of Medical Sciences promotes advances in biomedical and health research and supports efforts to translate these advances into healthcare benefits for society. Our mission is to promote medical science and its translation into benefits for society. The Academy’s elected Fellows are the UK’s leading medical scientists from hospitals, academia, industry, and the public service. We work with them to promote excellence, influence policy to improve health and wealth, nurture the next generation of medical researchers, link academia, industry, and the NHS, seize international opportunities and encourage dialogue about the medical sciences.
- Medical research has been at the centre of the UK’s response to the COVID-19 pandemic, offering a route towards recovery by informing diagnostics, treatments, vaccines, a public health response and a better fundamental understanding of the biology of the virus. Research and innovation will continue to be central to our collective recovery from the pandemic, through economic benefits, improved health outcomes, and preparedness for future outbreaks. The Academy recognises that ongoing issues around reproducibility and research integrity play a role in hindering scientific progress, delaying translation into clinical applications, and wasting valuable resources. We therefore strongly support the Science and Technology Committee’s aim to examine issues of research reproducibility and integrity with a focus on solutions and stakeholder-specific actions to address the problem, so that the UK can reap the maximum benefits from research and innovation in the years to come.
- Whilst issues of reproducibility affect many scientific disciplines, this response will primarily focus on the biomedical sciences and the issues and solutions that relate to this area of science. This response is based on our previous policy work, including our 2015 Reproducibility and reliability of biomedical research: improving research practice symposium report, and informed by the expertise of our Fellowship, which includes some of the UK’s foremost experts in clinical and academic medical research.[3] We would be pleased to expand on any of the points raised.
The extent of reproducibility issues in science
- Whilst it can prove difficult to quantify the scale of irreproducibility in science, partially due to the sheer volume of work and resources required to accurately replicate experiments on a large scale, there have been a number of high profile studies on the topic, especially within biomedical and social sciences.[4],[5] These reproducibility issues do not seem to be restricted to certain scientific disciplines, with similar concerns raised in the fields of artificial intelligence (AI) and quantum physics.[6],[7],[8] It is likely that irreproducibility affects a diverse range of scientific fields, even where its extent has not been measured. The evidence base around irreproducibility in science will need to be built further to better understand the extent of the problem and the strategies for mitigation which will have most impact.
- Whilst irreproducibility in science can be difficult to quantify, scientists from a diverse range of disciplines have reported personal experiences of it. In a 2016 Nature survey, around 75% of biologists, over 85% of chemists and over 65% of physicists and engineers reported having failed to reproduce results from another scientist’s experiment.[9] The same survey revealed that, of the 1,500 respondents, 90% agreed that some form of reproducibility crisis exists in science.9 It should be noted, however, that in biomedical research studying systems with a large degree of natural variability and heterogeneity, some amount of irreproducibility is to be expected and can occur for legitimate reasons.[10]
- Irreproducibility poses several significant risks to science. Recent surveys suggest public trust in researchers and the scientific process remains high.[11] However, pervasive irreproducibility across science could damage its reputation and consequently erode public trust in science.6 It is also important to address irreproducibility to ensure that innovative research findings that are prioritised for translation to the clinic, or for public benefit, are built on a foundation of reliable and replicable science. Using estimates of irreproducibility rates calculated from previous studies, one analysis predicted that $28 billion is spent each year in the United States on preclinical research that is not reproducible.[12] Addressing irreproducibility can help to ensure that time and resources are more likely to be dedicated to reliable scientific findings, potentially speeding up the rate at which findings are translated into tangible benefits for society.
Issues of reproducibility in academia
There are many sources of irreproducibility in scientific research
- In a 2015 symposium focussing on the issues and potential solutions to irreproducibility in science, co-hosted by the Academy of Medical Sciences, the Medical Research Council (MRC), the Wellcome Trust and the Biotechnology and Biological Sciences Research Council (BBSRC), it was emphasised that there is no single cause of irreproducibility, and that many factors are likely contributing to the irreproducible results observed throughout scientific research.[13],[14] The origin of irreproducibility in science is multifaceted and is usually distinct from misconduct and outright fraud. Potential solutions and strategies to address the problem must take this into account and seek to address the issue from many angles.
Legitimate sources of irreproducibility
- In the biomedical sciences, irreproducibility can occur for legitimate reasons. Due to the inherent variability exhibited in the natural world, specifically in biological systems, studies are likely to yield results that are difficult to reproduce.13 Subtle changes in the environment or in experimental conditions can make it difficult to accurately replicate certain studies. However, although the scientific community generally agrees that some degree of irreproducibility is to be expected, the extent to which it is prevalent throughout research is a cause for concern and it remains an issue which must be addressed.13
Researcher-based causes of irreproducibility
- Many of the issues associated with irreproducibility arise from poorly designed experimental methods or inappropriate use of statistics. This includes studies with small sample sizes, which lower the statistical power of an experiment and consequently reduce its sensitivity to detect any predicted effect.13 In general, inappropriate use of statistical methods can result in more unreliable or irreproducible findings. For example, ‘p-hacking’ – continuing to reanalyse or search datasets until a statistically significant result is found – increases the likelihood of such significant findings being unreliable.13 Factors relating to experimental design can impact the reliability and reproducibility of any findings, with inappropriate analyses potentially leading to seemingly significant results that are merely statistical artefacts. An illustrative sketch of how this kind of statistical misuse can inflate false-positive rates is included at the end of this subsection.
- Simple technical errors, whether lab-based or computational, can also affect research reproducibility. This can include unknowingly relying on contaminated or misidentified chemical reagents and antibodies in experiments, or performing studies using unvalidated cell lines.13 Issues regarding cell lines can be particularly damaging in terms of reproducibility, as misidentification or contamination can be difficult to notice. Although the degree of cell misidentification can be difficult to quantify, estimates suggest that between one-fifth and one-third of cell cultures contain a misidentified cell type or species.[15],[16] Even if all other aspects of a study have been conducted appropriately and thoroughly and reported accurately, using misidentified or contaminated cells or reagents can lead to conclusions that are irrelevant to the original hypothesis of the experiment.
- To replicate experiments or carry out similar studies from other research groups, researchers rely on methods and protocols being accurately and precisely described in scientific articles. Even if a study has been designed carefully and performed with extreme rigour, its reproducibility is compromised if it is not reported in an accurate and transparent manner.[17] Talks from the 2015 Academy symposium highlighted that a lack of sufficient detail in the materials and methods or data analysis sections of an academic publication can render replicating such a study impossible.17
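- The following is a minimal, purely illustrative sketch – not drawn from the Academy’s report – of how repeatedly re-analysing accumulating data (one form of ‘p-hacking’) inflates the false-positive rate even when no true effect exists. It assumes Python with numpy and scipy available; all parameter values are arbitrary and chosen only for illustration.

```python
# Illustrative only: simulates two groups with NO real difference, re-testing
# after each batch of observations and "declaring success" as soon as p < alpha.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def false_positive_rate(n_per_check=10, n_checks=5, n_simulations=5000, alpha=0.05):
    """Return the proportion of simulations that reach p < alpha at any check."""
    hits = 0
    for _ in range(n_simulations):
        a = np.empty(0)
        b = np.empty(0)
        for _ in range(n_checks):
            # Collect another batch of data for each group (no true effect).
            a = np.concatenate([a, rng.normal(size=n_per_check)])
            b = np.concatenate([b, rng.normal(size=n_per_check)])
            # Peek at the data: stop as soon as the test looks 'significant'.
            if stats.ttest_ind(a, b).pvalue < alpha:
                hits += 1
                break
    return hits / n_simulations

# A single pre-planned test keeps the false-positive rate near the nominal 5%;
# re-testing after every batch of data noticeably inflates it in this toy setup.
print(f"Single pre-planned test:  {false_positive_rate(n_checks=1):.3f}")
print(f"Testing after each batch: {false_positive_rate(n_checks=5):.3f}")
```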
Research culture can exacerbate issues of irreproducibility
- Outside of the laboratory, there are higher-level cultural factors that influence reproducibility. Academic researchers are particularly incentivised and pressured to produce innovative and novel findings that can be published in high-impact, reputable journals. This in turn is influenced by the difficulty of publishing negative findings in academic journals. These issues can result in biases forming as scientists begin to focus on only the most exciting – but not necessarily the most reliable – findings in their research. Indeed, ‘selective reporting’ and ‘pressure to publish’ were noted as the top two factors that contribute to irreproducible research in a 2016 Nature survey.[18] Such pressures can also lead to researchers rushing to release publications without subjecting their work to the appropriate level of scrutiny.
- There are a range of ongoing activities, initiatives and bodies concerned with addressing irreproducibility in science, including the UK Reproducibility Network,[19] the UK Research Integrity Office,[20] UK Research and Innovation’s (UKRI) work to promote research integrity across the UK,[21] Wellcome’s research culture initiative,[22] Wellcome Open Research[23] and the commitments in the UK Government’s R&D Roadmap.[24] Several of these are discussed below.
Improving methodological design and statistical literacy in the biomedical science community
- Our 2015 workshop suggested that individual researchers would benefit from a greater degree of statistical understanding and knowledge, which could be provided through continual training and education in statistical methods, and increased collaboration with statisticians during the design, implementation, and analysis stages of a study.[25] Funding bodies have started to work in collaboration with higher education institutes to provide relevant training in statistics and experimental design for individuals at all career stages, including PhD students.[26] Improving statistical comprehension and ability in the biomedical sciences would in turn likely improve the reliability and robustness of published results.
- To help confirm the validity of the statistical analyses used in research articles, many biomedical journals and publishers work with statistical experts during the review process. A recent study of 107 biomedical journals found that only 23% used statisticians in the review process for all articles.[27] However, results from older studies suggest that the prevalence of this kind of statistical review amongst academic biomedical journals has not dramatically changed over the past 20 years, indicating that there is room for improvement in this area.[28] Encouraging publishers to work more closely with statisticians during the peer review process may raise the standard of statistical analyses in publications and improve the reliability of published work.
- In modern science, there are a wide variety of study designs that can each result in valuable findings. The Academy recommends that funding bodies prioritise research into improving how we correctly interpret data and results that arise from various methodologies or study design – including randomised controlled trials, observational studies or novel approaches.[29] To complement this methodological research, funders should invest in increasing the scientific community’s capacity and skillset for managing and analysing large data sets.29
Enhancing quality control mechanisms in place to validate experimental reagents and research models
- Scientists have a responsibility to ensure that the reagents and tools they use in the laboratory have been validated for their required purpose before using them in important experiments. Several journals now request that reagents such as antibodies have been appropriately validated for the application in which they are used in a particular publication.[30] Online resources that collate this kind of validation information are vital for propagating it to the scientific community, allowing individual researchers to make informed decisions regarding the experimental tools they use.
Improving the transparency of published scientific work
- Improving openness and transparency throughout the scientific process would help to address irreproducibility. Publishers have a responsibility to ensure the research they publish is sufficiently robust and reproducible. This includes requiring authors to submit, as part of the final publication, detailed and accurate protocols that should allow independent researchers to reproduce the work described in the article.[31]
- Encouragingly, the UK Government’s R&D Roadmap has outlined steps to improve transparency and reproducibility of publicly funded research, by mandating open publication and strongly incentivising data sharing where appropriate.[32] Making publicly funded research outputs available to all is a reassuring step towards maintaining public trust in science. As more journals implement measures that promote the publication of open, transparent science, the ability of other research groups to replicate the findings of a published piece of work should greatly increase.
- Although public trust in science and researchers in general remains high, the way irreproducibility affects that trust has not been extensively studied or explored. The scientific community must remain open and transparent about the nuances surrounding irreproducibility. More effort must be made to communicate the complexities of irreproducible research, and the scientific process in general, to the public. For example, to avoid the tendency of media outlets to overstate or exaggerate the conclusions of scientific work, such outlets should coordinate more closely with scientists or science press officers to convey the conclusions of a particular study more accurately, whether that study discusses novel findings, negative results, or failed replication work.[33]
Cultivating a healthier research environment
- Issues of research culture and their impact on research reliability and reproducibility were explored in our 2015 workshop. At the symposium, we heard that publishing scientific findings in high-impact-factor journals is still considered one of the most important components for researchers applying for funding and advancing their careers.33 This can lead researchers to prioritise obtaining exciting, innovative data and to neglect publishing negative or null findings that may be more robust. The highly competitive environment can result in little professional recognition for other forms of productivity, such as teaching, developing open-source software and tools for the research community, or submitting patents.33 To cultivate a healthier research environment, funders and institutes should re-evaluate how to integrate these other vital components of academia into grant applications or job expectations. Whilst some progress has been made through initiatives such as the Wellcome Research Culture Initiative and the UK Government’s People and Culture Strategy, there is still more to be done.
- Scientists must be emboldened to publish or disseminate any negative or null findings, or indeed any pieces of published work they could not replicate. This includes submitting their work to journals that are receptive to negative or null results, discussing negative results in seminars and conferences, or incorporating relevant negative or null findings in other publications. Funders and publishers may be able to encourage good practice by promoting attempts to share negative or null results. For example, the Wellcome Trust allows its grant holders to publish negative or null research findings – including failed replication attempts – in its journal Wellcome Open Research[34],[35] and the scientific journal PLOS ONE considers and accepts negative or null findings, assembling such articles into a collection titled Missing Pieces.[36],[37] Initiatives like this promote open science and can allow researchers to share any of their results, regardless of novelty or ‘impact’.
Promoting and improving protocol pre-registration
- Protocol pre-registration is a relatively new initiative aimed at improving reproducibility and removing biases that may form during data analysis. It involves researchers outlining their hypotheses, outcome measures and methodologies prior to the collection and analysis of the data. Over 300 journals currently make use of pre-registration in some form as part of the Registered Reports publishing format.[38] An illustrative sketch of the kind of information fixed at registration is included at the end of this section.
- Pre-registration guards against undisclosed post-hoc analysis, which can introduce bias into a study and reduce the reliability of its findings. It recognises and rewards comprehensive and methodical science rather than novelty, and acts to uncouple publication success from the specific result of the study.
- Given the rapid expansion of journals utilising Registered Reports, this represents a vitally important time for publishers to standardise the process. Although the Center for Open Science provides systematic guidelines for journals making use of Registered Reports, individual authors and journals are not obligated to adhere to them, which has partially resulted in different journals implementing the format in different and potentially confusing ways.[39] It has been suggested that the development and common use of a central independent registry, such as the Open Science Framework Registries, may lead to a more standardised format that could lead to improved transparency and reproducibility, akin to how ClinicalTrials.gov operates to provide a central hub of standardised clinical trial records.39
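- As a purely illustrative sketch, the hypothetical record below shows the kind of information that pre-registration fixes before any data are collected. The field names and values are assumptions for illustration only; they do not represent a prescribed Registered Reports or Open Science Framework schema.

```python
# Hypothetical example only: field names and values are illustrative
# assumptions, not an official Registered Reports or OSF schema.
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)  # frozen: the registered plan cannot be edited later
class PreRegistration:
    title: str
    hypotheses: list[str]            # stated before any data are collected
    primary_outcome: str             # the pre-specified outcome measure
    planned_sample_size: int
    sample_size_justification: str   # e.g. an a priori power calculation
    analysis_plan: str               # statistical tests fixed in advance
    registered_on: date = field(default_factory=date.today)

example = PreRegistration(
    title="Effect of compound X on cell viability",
    hypotheses=["Compound X reduces viability relative to vehicle control"],
    primary_outcome="Percentage of viable cells at 48 hours",
    planned_sample_size=12,
    sample_size_justification="80% power to detect d = 1.2 at alpha = 0.05",
    analysis_plan="Two-sided independent-samples t-test; no interim analyses",
)
```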
Promoting collaboration with the UK Reproducibility Network
- The UK Reproducibility Network (UKRN) is a cross-disciplinary consortium of 18 universities that aims to promote the practice of robust, accessible, reliable, and reproducible scientific research.[40] The 2015 Academy-led symposium directly influenced the establishment of the UKRN, and the Academy supports the UKRN in kind as a member of its Stakeholder Engagement Group.[41] In a bid supported by the Academy, the UKRN recently received £4.5m in funding from the Research England Development Fund to drive uptake of open research practices across the sector.[42] Some of their activities include:
- Acting as a hub to bring together expertise and knowledge from a broad range of disciplines and sectors to tackle sources of irreproducibility.
- Providing online workshops and training in good scientific practice and reproducibility.
- Supporting and disseminating initiatives that encourage reproducibility – such as Open Research Working Groups designed to reform research to improve transparency and accessibility, Registered Reports and Octopus, a free, accessible, and instant portal on which to publish scientific work.[43]
- The UKRN and its collaborative approach will play an increasingly important role in the coming years to address the issues around reproducibility.
Global and sector-wide collaboration required to drive effective action
- The ongoing issues that are impacting research integrity and reproducibility in scientific research are not restricted to a particular sector, entity, or nation. Attendees of the 2015 Academy-led symposium from across the world – North America, Europe, and Asia – all confirmed reproducibility is a global issue requiring international coordination to stimulate positive change.[44] A global, cross-sector approach is required to combat irreproducibility in science. Any implemented measures should be developed in direct consultation with the research community. Funders, research institutes, publishers, professional bodies, and individual scientists must all come together on an international level to deliver solutions for the numerous causes of irreproducibility in scientific research.44,[45]
- The establishment of the UK Committee on Research Integrity (UK CORI) may represent an important opportunity to drive coordination and oversight of the various activities happening in response to the irreproducibility problem.[46] The ambition for UK CORI to provide a forum to discuss how standards and expectations can be set across the sector will be valuable.
(September 2021)
[1] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[2] Academy of Medical Sciences (2017). Response to the Science and Technology Committee Inquiry on Research Integrity. https://acmedsci.ac.uk/file-download/69294217
[3] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[4] Open Science Collaboration (2015). Estimating the reproducibility of psychological science. Science 349, aac4716
[5] Begley CG & Ellis LM (2012). Raise standards for preclinical cancer research. Nature 483, 531-533.
[6] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[7] Hudson M (2018). Artificial intelligence faces reproducibility crisis. Science 359, 725-726.
[8] Frolov S (2021). Quantum computing’s reproducibility crisis: Majorana fermions. Nature 592, 350-352.
[9] Baker M (2016). 1,500 scientists lift the lid on reproducibility. Nature 533, 452-454.
[10] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[11] Department for Business, Energy and Industrial Strategy (2020). Public attitudes to science 2019. https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/905466/public-attitudes-to-science-2019.pdf
[12] Freedman LP, Cockburn IM & Simcoe TS (2015). The Economics of Reproducibility in Preclinical Research. PLOS Biology 13(6), e1002165.
[13] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[14] Academy of Medical Sciences (2017). Response to the Science and Technology Committee Inquiry on Research Integrity. https://acmedsci.ac.uk/file-download/69294217
[15] Neimark J (2015). Line of attack. Science 347, 938-940.
[16] Nature (2009). Identity crisis. 18 February https://www.nature.com/articles/457935b
[17] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[18] Baker M (2016). 1,500 scientists lift the lid on reproducibility. Nature 533, 452-454.
[19] The UK Reproducibility Network (n.d.). https://www.ukrn.org/ [accessed 1 September 2021]
[20] The UK Research Integrity Office (n.d.). https://ukrio.org/ [accessed 1 September 2021]
[21] UK Research and Innovation (2021). Promoting research integrity across the UK. https://www.ukri.org/news/promoting-research-integrity-across-the-uk/
[22] Wellcome (n.d.). Research culture: let’s reimagine how we work together. https://wellcome.org/what-we-do/our-work/research-culture [accessed 1 September 2021]
[23] Wellcome Open Research (n.d.). How it Works. 1. Aims and Scope. https://wellcomeopenresearch.org/about [accessed 1 September 2021]
[24] Department of Business, Energy and Industrial Strategy (2020). UK Research and Development Roadmap. https://www.gov.uk/government/publications/uk-research-and-development-roadmap
[25] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[26] Academy of Medical Sciences (2016). Improving research reproducibility and reliability: progress update from symposium sponsors. https://acmedsci.ac.uk/file-download/41615-5836c0640fd92.pdf
[27] Hardwicke TE & Goodman SN (2020). How often do leading biomedical journals use statistical experts to evaluate statistical methods? The results of a survey. PLOS ONE 15(10), e0239598.
[28] Goodman SN, Altman DG & George SL (1998). Statistical Reviewing Policies of Medical Journals. Journal of General Internal Medicine 13, 753-756.
[29] Academy of Medical Sciences (2017). Enhancing the use of scientific evidence to judge the potential benefits and harms of medicines. https://acmedsci.ac.uk/file-download/44970096
[30] Baker M (2015). Reproducibility crisis: Blame it on the antibodies. Nature 521, 274-276.
[31] Academy of Medical Sciences (2017). Enhancing the use of scientific evidence to judge the potential benefits and harms of medicines. https://acmedsci.ac.uk/file-download/44970096
[32] Department of Business, Energy and Industrial Strategy (2020). UK Research and Development Roadmap. https://www.gov.uk/government/publications/uk-research-and-development-roadmap
[33] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[34] Wellcome Open Research (n.d.). How it Works. 1. Aims and Scope. https://wellcomeopenresearch.org/about [accessed 1 September 2021]
[35] Academy of Medical Sciences (2016). Improving research reproducibility and reliability: progress update from symposium sponsors. https://acmedsci.ac.uk/file-download/41615-5836c0640fd92.pdf
[36] PLOS ONE (n.d.). Criteria for Publication. https://journals.plos.org/plosone/s/criteria-for-publication [accessed 2 September 2021]
[37] PLOS ONE (2015). Positively Negative: A New PLOS ONE Collection focusing on Negative, Null and Inconclusive Results. 25 February https://everyone.plos.org/2015/02/25/positively-negative-new-plos-one-collection-focusing-negative-null-inconclusive-results [accessed 2 September 2021]
[38] Center for Open Science (n.d.). Registered Reports: Peer review before results are known to align scientific values and practices. https://www.cos.io/initiatives/registered-reports [accessed 2 September 2021]
[39] Hardwicke TE & Ioannidis JPA (2018). Mapping the universe of registered reports. Nature Human Behaviour 2, 793-796.
[40] UK Reproducibility Network (n.d.). Welcome to the UK Reproducibility Network. https://www.ukrn.org [accessed 2 September 2021]
[41] The UK Reproducibility Network (n.d.). https://www.ukrn.org/ [accessed 1 September 2021]
[42] UK Reproducibility Network (2021). Major funding boost for UK’s open research agenda. https://www.ukrn.org/2021/09/15/major-funding-boost-for-uks-open-research-agenda/
[43] UK Reproducibility Network (n.d.). Initiatives. https://www.ukrn.org/initiatives [accessed 2 September 2021]
[44] Academy of Medical Sciences (2015). Reproducibility and reliability of biomedical research: improving research practice. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[45] Academy of Medical Sciences (2017). Response to the Science and Technology Committee Inquiry on Research Integrity. https://acmedsci.ac.uk/file-download/69294217
[46] UK Research and Innovation (2021). Promoting research integrity across the UK. https://www.ukri.org/news/promoting-research-integrity-across-the-uk/