Written Evidence Submitted by Dr Simon Kolstoe, Reader in Bioethics, University of Portsmouth
(RRE0014)
There is not a “Crisis of Reproducibility”; there is a crisis of “Research Waste”.
AUTHOR:
Dr Simon Kolstoe in a personal capacity. I am an academic conducting research in the areas of research governance, ethics and integrity (https://orcid.org/0000-0003-1472-3966). I was called to give oral evidence to the 2017/19 Inquiry into Research Integrity. Following that Inquiry I was appointed to research integrity working groups by the Health Research Authority and UKRI. Although the following evidence does not reflect the official views of the organisations I work for, for context I am a Reader in Bioethics at the University of Portsmouth, Chair of the MOD Research Ethics Committee (MODREC), Chair of the UKHSA/PHE Research Ethics and Governance Group (REGG), and Chair of the NHS Health Research Authority’s “Fast Track” research ethics committee.
SUMMARY
The 2017 to 2019 Inquiry into Research Integrity (and Clinical Trials Transparency) was a landmark moment for UK science. Its influential reports catalysed actions that have led to improvements in key areas such as clinical trial registration and the attention paid to research integrity, especially by UKRI and universities. However, that inquiry was certainly not the last word on the issue. Research is still subject to complex social, political and cultural pressures, and “85% of research is wasted because it asks the wrong questions, is badly designed, not published or poorly reported”[1]. The consequence of continuing research waste is the production of unreliable data which is difficult or impossible to reproduce. It is important that the reproducibility problem highlighted by this new Inquiry is seen in the wider context of ongoing efforts to identify and address the larger systemic problem of Research Waste.
1) Science is not objective – it is as much based on politics and culture as any other human activity
Reliable data is critical for good decision making. This important principle has been illustrated by the coronavirus pandemic, where key social and political decisions have been closely based on the production of high-quality and accurate scientific data. However, the production of reliable scientific data is not straightforward. The idea that science can produce objective data that politicians and others can then use to make subjective political and social decisions is naive. Fifty-plus years of work by philosophers and social scientists has shown that the practice of science is subject to as many social, political and cultural influences as any other human activity.
When answering the question “What is this thing called Science?”, a linear process (often attributed to Francis Bacon) is taught in school rooms. Undergraduates at university are introduced to a slightly more complex picture, with alternative theories developed by philosophers such as Popper, Kuhn and Feyerabend, but, to summarise brutally, the direction of most contemporary UK (and international) science is influenced by the quest for funding (Figure 1). In some ways this is not a surprise, as science costs money and scientists need to be paid, but it is naive to consider science, or scientific data, without also holding in mind the incentives that led to the production of that data. This is why, as a response to the earlier inquiry into research integrity, UKRI commissioned the UK Research Integrity Office (UKRIO) and the UK Reproducibility Network (UKRN) “to conduct a study into the effects of incentives in the research system on researcher behaviour”[2]. This 2020 study has been influential in highlighting the incentives that drive science but that also lead to poor research behaviour (including misconduct), research waste and subsequent reproducibility problems.
Figure 1: a) The linear, Baconian model of scientific progress. b) A more accurate representation of the contemporary scientific process.
2) Scientific rewards are based on proxies of reliable science, not on producing reliable science itself
One of the central roles of government is to distribute resources through funding. Clearly, when funding is limited, allocation (also known as rationing) issues are important; political science is essentially the study of how to solve the rationing problem. Within scientific research, competition through research grants is broadly recognised as an effective way of solving the rationing problem. In brief, a budget holder (Government directly, a Government-funded research council, industry, or some other funder such as a charity) proposes an area of research, and interested scientists then bid for the funding. Research questions may or may not have been created with input from the research community itself (see section 3 below). The successful bidder then conducts the research and publishes the results, either publicly or privately (the issue of who gets to see the results is again relevant, see section 3 below). As a consequence, “achieving research grants” has become the main benchmark for scientists, as this is how universities and other research organisations receive income. The weakness of the system is immediately obvious: the reward is linked to obtaining the grant, not to producing reliable (and reproducible) results. Research grants are therefore “proxy” measures that, when awarded, indicate someone able to convince others that they can conduct reliable science, but they are not a measure of reliable science itself. To address this, funders ask for evidence of reliable science by referring to high-impact publications or “impact”, but both of these are also proxy measures. Neither grant income, nor high-impact publications, nor anecdotes about impact is the same as reliable science itself. It is this gap between the proxies of reliable science and reliable science itself that often leads to research waste, and hence to the so-called reproducibility crisis.
For completeness, however, it must be acknowledged that only a small proportion of research is conducted using government grant funding. More research, by both volume and cost, is conducted within a commercial setting, where proxy measures are no less important. Here, while many industry scientists (certainly within the pharmaceutical/medical sector) are driven by broadly humanitarian motives, the security of their job, and thus their personal income, is directly linked to the research agenda of their employer. As a consequence the public is rightly wary of commercially sponsored research because it knows a central purpose is to generate profit, but at the same time the public generally acknowledges that commercial research is necessary to drive invention and the availability of new technology. It is therefore critical that good regulators keep a close eye on commercial research, ensuring that fraud and other profit-maximising activities are prevented, or at least minimised. An example of this is the excellent work conducted by the Health Research Authority in the area of Clinical Trials Transparency, as requested in the previous Research Integrity Inquiry[3].
3) 85% of Research is Wasted
If research waste is often caused by the difference between ways of measuring research (grants, papers, impact and profits) and reliable research itself, how do we better understand this difference? In a seminal 2009 paper titled “Avoidable waste in the production and reporting of research evidence”[4], Sir Iain Chalmers and Paul Glasziou outlined the problem as they saw it in medical research. In a subsequent BMJ editorial[5] they strongly defended their estimate that “85% of research funding was wasted because it asks the wrong questions, is badly designed, not published or poorly reported”. This astonishing figure caught the attention of many concerned academics and researchers at the time, and a series of five papers in The Lancet subsequently analysed the problem under the series title “Increasing Value, Reducing Waste”[6]. A 2015 conference in Edinburgh then established the REWARD Alliance (REducing Waste And Rewarding Diligence) to promote the following statement among scientists, publishers, funders and researchers:
“We recognise that, while we strive for excellence in research, there is much that needs to be done to reduce waste and increase the value of our contributions. We maximise our research potential when:
• we set the right research priorities;
• we use robust research design, conduct and analysis;
• regulation and management are proportionate to risks;
• all information on research methods and findings are accessible;
• reports of research are complete and usable.
We believe we have a responsibility not just to seek to advance knowledge, but also to advance the practice of research itself. This will contribute to improvement in the health and lives of all peoples, everywhere. As funders, regulators, commercial organisations, publishers, editors, researchers, research users and others – we commit to playing our part in increasing value and reducing waste in research.”
In subsequent publications and work by those associated with the REWARD Alliance, the five areas that lead to research waste were summarised as:
1) Setting unnecessary (or low priority) research questions
2) Using inappropriate research methods/design
3) Inefficient regulation and management of research
4) Not sharing results with other researchers
5) Not reporting results fully
These were well illustrated in a 2014 Lancet paper (Figure 2)[7]:
Figure 2: The five areas of research waste, from Macleod et al. (2014) Biomedical research: increasing value, reducing waste. The Lancet. https://doi.org/10.1016/S0140-6736(13)62329-6
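As a rough illustration of how a figure of this magnitude arises (the 50% loss rates below are illustrative assumptions in the spirit of Chalmers and Glasziou’s argument, not their exact stage-by-stage estimates), losses at successive stages compound rather than add. If roughly half of studies are compromised at each of three stages, for example through poor design, non-publication and inadequate reporting, then:
1 − (0.5 × 0.5 × 0.5) = 0.875, i.e. more than 85% of the original research effort is wasted.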
This issue of research waste has not gone unnoticed within the scientific community, but the fact that this inquiry is being held demonstrates that it is perhaps less well understood outside the research community. Clearly, work is still needed to highlight this aspect of the scientific process, but even once it has been highlighted, finding solutions is a different matter. Almost everyone agrees that research waste as articulated above is a problem, but with the current incentive structure it is difficult to turn such sentiments into workable solutions.
It is of note, however, that attempts to find solutions are ongoing, with significant work and analysis being conducted by a wide variety of organisations under a number of different headings, including “Research Culture” (Royal Society[8] and Wellcome Trust[9]), “Research Incentives” (UKRI[10]), “Research Integrity” (UKRIO[11] and Universities UK[12]) and “Reproducibility” (UKRN[13]). There have also been closely linked international initiatives that focus on specific areas of research waste, such as “Plan S” for promoting open access data and publications[14], the AllTrials campaign for promoting research transparency, initially in clinical trials but also more broadly[15], and the Evidence Based Research Consortium (EVBRES)[16]. All of this work is encouraging, but the variety of headings can cause confusion: despite the different labels, all such activities can be identified as aspects of Figure 2, and thus considered initiatives for addressing the broader problem of research waste.
4) How can this inquiry add to the efforts?
Although this author disagrees with the comment in this Inquiry’s brief that “the specific issue of reproducible research has been overlooked”[17], any additional Inquiry in this area is to be welcomed. While many academics, scientists and clinicians are worried about research waste, we have only minor influence over the environment within which our research is conducted.
This lack of control has been emphasised during the coronavirus pandemic, where a great deal of research has been commissioned and conducted but not all of it has been found to be reliable (see, for instance, the BMJ editorial “A deluge of poor quality research is sabotaging an effective evidence based response”[18]). While it is understandable that the Government decided to rapidly hand out large amounts of funding for coronavirus research, in written evidence to this committee’s inquiry on UK Science, Research and Technology Capability and Influence in Global Disease Outbreaks I highlighted a number of concerns that this has caused, including in relation to research waste[19]. History will show whether the figure for coronavirus research waste is also 85%, or potentially even higher. This issue is particularly important as it risks undermining public confidence in science, as manifest in issues such as vaccine uptake or the following of advice on face masks, physical distancing, etc.
In conclusion, the report from this inquiry could helpfully:
1) Highlight the similarity between initiatives using titles such as “Reproducibility”, “Integrity”, “Research Culture”, “Open Access”, “Research Transparency” and “Research Incentives”. It is this author’s opinion that “Research Waste” is the best umbrella term; however, regardless of the title, it is important to recognise that all these activities and initiatives are linked.
2) Publicise the five known structural problems that lead to research waste, perhaps using the REWARD statement or similar, and make a public call for solutions.
3) Direct UKRI and other funders to conduct further work on, and propose alternatives to, the incentive structures that distort research/science.
4) Revisit the conclusions of the 2017/19 Research Integrity Inquiry and seek evidence of progress. Where none has been made, require action plans with clear timelines.
5) Encourage the normalisation of grant funding addressing “research on research”, making this a central and ongoing component of funding portfolios rather than a series of one-off initiatives.
6) Conduct a new inquiry looking specifically at coronavirus research to determine how much has been wasted, and analyse the causes of this waste.
(September 2021)
[1] Chalmers, I. and Glasziou, P. (2009) Avoidable waste in the production and reporting of research evidence. The Lancet, 374(9683), 86–89.
[2] https://www.ukri.org/our-work/supporting-healthy-research-and-innovation-culture/research-integrity/
[3] https://www.hra.nhs.uk/planning-and-improving-research/policies-standards-legislation/research-transparency/make-it-public-transparency-and-openness-health-and-social-care-research/
[4] https://www.thelancet.com/journals/lancet/article/PIIS0140-6736%2809%2960329-9/fulltext
[5] https://blogs.bmj.com/bmj/2016/01/14/paul-glasziou-and-iain-chalmers-is-85-of-health-research-really-wasted/
[6] https://www.thelancet.com/series/research
[7] https://www.thelancet.com/journals/lancet/article/PIIS0140-6736(13)62329-6/fulltext
[8] https://royalsociety.org/topics-policy/projects/research-culture/
[9] https://wellcome.org/what-we-do/our-work/research-culture
[10] https://www.ukri.org/our-work/supporting-healthy-research-and-innovation-culture/research-integrity/
[11] https://ukrio.org/
[12] https://www.universitiesuk.ac.uk/topics/research-and-innovation/concordat-research-integrity
[13] https://www.ukrn.org/
[14] https://www.coalition-s.org/
[15] https://www.alltrials.net/
[16] https://evbres.eu/
[17] https://committees.parliament.uk/work/1433/reproducibility-and-research-integrity/
[18] https://www.bmj.com/content/369/bmj.m1847
[19] https://committees.parliament.uk/writtenevidence/9536/html/