Written Evidence Submitted by the UK Reproducibility Network Institutional Leads

(RRE0043)

 

 

Authors

 

Dr Andrew J. Stewart, University of Manchester, Manchester, UK

 

Professor Emily K. Farran, University of Surrey, Surrey, UK

 

Dr James A. Grange, Keele University, Keele, UK

 

Professor Malcolm Macleod, University of Edinburgh, Edinburgh, UK

 

Professor Marcus Munafò, University of Bristol, Bristol, UK

 

Dr Phil Newton, University of Reading, Reading, UK

 

Professor David Shanks, University College London, London, UK

 

On behalf of the UKRN Institutional Leads.

 

UKRN Institutional Members are universities (or equivalent) that have formally joined UKRN through the creation of a senior management role focused on research improvement (the Institutional Lead). The aim is to coordinate efforts to improve research quality – through training, incentives, and other activity – across this consortium of institutions. The UKRN currently has 20 Institutional Members.

 

Summary

 

The so-called ‘replication crisis’ is best understood as a consequence of a lack of transparency and openness across the research pipeline, and of incentive structures that reward volume over quality.

UKRI and other funders should place the same emphasis on intermediate research outputs (research workflows, analysis code, and FAIR data) as they have placed on open access publications, and should fund the skills, infrastructure and training this requires.

Government, funders and institutions should avoid a one-size-fits-all approach: transparency requirements should be appropriate to each discipline and methodology.

Journals, publishers and funders should audit claims of transparency, verifying that shared data and analysis code are genuinely usable and that results are computationally reproducible.

Institutional incentives (hiring, performance review, and promotion) should reward research quality and transparency rather than volume-based metrics, and research income should become more dependent on research transparency.

Sustained investment in digital skills, and recognition of the team-based nature of science (including research software engineers and data stewards), are essential.

The UK should learn from international initiatives and act on the recommendations of previous reports, rather than repeating cycles of consultation.

 

Introduction

 

1. The UK Reproducibility Network (UKRN; www.ukrn.org) is a national peer-led consortium that aims to ensure the UK retains its place as a centre for world-leading research. Currently, twenty UK higher education institutions (https://www.ukrn.org/institutional-leads/) are formal members of the UKRN. Institutional membership requires the creation of a senior academic role within each institution, with responsibility for research improvement and research integrity, reporting to the PVC for Research (or equivalent). This document has been developed by these Institutional Leads and represents the position of this part of the UKRN structure.

 

2. The UKRN Institutional Leads welcome this inquiry. In particular, we feel that the discussions leading to this inquiry reflect a broader need for research transparency, such that all stages of the research pipeline (from data collection processes through to data and analysis code) are made openly available. In other words, and in response to the second topic of the inquiry’s call for evidence, the so-called ‘replication crisis’ has arisen partly as a result of a lack of research transparency at various stages of the research pipeline, and a lack of incentives, at both the individual researcher and institutional levels, to adopt open and transparent research practices.

 

3. It is worth noting that concerns about transparency and reproducibility in science (and the role played by researcher incentives) are not new; they were raised by Charles Babbage as early as 1833. Paul Meehl wrote about these issues in 1967, including how the incentive structure in academia rewarded quantity over quality in scientific research outputs. A similar point was made by Doug Altman in the British Medical Journal in 1994: “…as the system encourages poor research it is the system that should be changed. We need less research, better research, and research done for the right reasons. Abandoning using the number of publications as a measure of ability would be a start.” Scientists behave in ways that are optimal for them within the environment in which they operate: many will engage in the research behaviours that are most likely to be rewarded.

 

4. Rather than viewing this as a ‘crisis’, it is more appropriate to view it as a consequence of a lack of transparency and openness in the research pipeline. We therefore encourage the inquiry to focus on transparency rather than reproducibility. In our response, we focus on the aspects of the inquiry’s call for evidence related to research transparency (which is the foundation for research reproducibility) and on the role played by research funders, research institutions, individual researchers, publishers, and Government in possible solutions.

 

5. Below we provide recommendations to increase transparency and openness in research, focusing on the roles of funders and government, institutions, and individual researchers in ensuring that procedures, incentives, training, and learning from others all contribute to improving the robustness of research findings.

 

The Role of UKRI and Other Funders

 

6. UKRI’s policy on open access (and the related requirements for outputs in REF2021 submissions) has had a dramatic impact on the proportion of final research outputs (e.g., publications) that are openly available. UKRI and other funders should place a similarly strong emphasis on intermediate research outputs, requiring transparency via full reporting of research workflows, analysis code, and FAIR (Findable, Accessible, Interoperable, and Reusable) data. This is likely to bring about a similar increase in the proportion of these more granular research outputs that are open and transparent, and in turn reproducible, reporting results that are more likely to be replicable.

 

7. Improved transparency will engender greater trust from both the public and the research community, which aligns with the UK Government Research & Development Roadmap: “We will...strongly incentivise open data sharing where appropriate, so that reproducibility is enabled, and knowledge is shared and spread collaboratively. Second, we will ensure that more modern research outputs are recognised and rewarded. For example, we will ensure that digital software and datasets are properly recognised as research outputs, so that we can minimise efforts spent translating digital outputs into more traditional formats.” In the same way that UKRI funding councils require grant applications to detail a research data management plan, funders (not just UKRI but also other sources of research funding) should require applicants to develop a detailed plan for how they will ensure the research reported at the point of publication is fully transparent. Consistency across funding bodies, and post-award auditing of compliance, will be important to ensure this requirement is properly implemented.

 

8. In turn, it will be important that the skills required to produce transparent research workflows (which are ultimately more likely to produce research findings that are both reproducible and replicable) are fully funded as a component of the project. This could include data curation time, expertise in developing reproducible and transparent research workflows, and infrastructure for data curation. Whilst most of UKRI’s focus on open research has so far been on open access journal articles, UKRI and other funders should place a similarly strong focus on open data, methods, and code; this would mark the next stage of UKRI’s open research activity. Indeed, we were pleased to see that the Science Minister recently presented a broader view of research openness that is consistent with our own perspective. Crucially, funders must ensure that policies for transparent and open science are accompanied by training and funding.

 

9. UKRI should play a key role in ensuring that research submitted to future research assessment exercises is fully transparent. This will require particular (and focused) expertise on the future equivalent of REF panels.

 

10. Funding councils could create specific funding calls for assessing and improving reproducibility in various fields. The German Research Foundation (Deutsche Forschungsgemeinschaft, DFG) issued such a call in 2021, allocating approximately €5 million. Projects that successfully improve reproducibility are likely to have a very strong impact.

 

Avoiding a ‘One-Size-Fits-All’ Approach

 

11. We recognise that in some areas within STEM disciplines, producing research that is open and transparent will be more straightforward than in others. Government, funders, and institutions should therefore avoid a one-size-fits-all approach to research transparency and ensure that research is transparent in a manner appropriate to the relevant research discipline and methodology. Mandating that different types of research be transparent in the same way could result in a lowest-common-denominator approach, or could turn into a box-ticking exercise. Neither is likely to produce the desired outcome; both would amount to performative transparency.

 

12. Journals and publishers have a role to play in auditing transparency at the point of article peer review, and researchers and funders have a corresponding role during end-of-grant reporting. It is critically important that this auditing is done thoroughly, to ensure that appropriate transparency in the research workflow has been achieved. It would be too easy for a researcher to claim their research reporting is transparent when this is not the case.

 

13. The journal peer review process does not currently ensure that research is reported in a manner that is sufficiently transparent. Indeed, the scientific literature is full of journal articles claiming that the underlying data and analysis code are openly available when this is not the case, or when they are not provided in a form that allows them to be (re)used. Even when data are made available, they can be unusable due to a lack of meta-data and accompanying analysis code. Similarly, analysis code is often unusable because researchers do not provide a sufficiently detailed description of the computational environment (including software dependencies) needed to run it on a computer other than their own.
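
By way of illustration only, and as a minimal sketch rather than a recommendation of any particular tool, a description of the computational environment can be as simple as a short script, shared alongside the analysis code, that records the exact software versions used (the example below uses Python, and the file name environment.txt is hypothetical):

    # Minimal illustrative sketch (hypothetical file name; Python used purely as an example).
    # Records the computational environment alongside analysis outputs so that others can
    # reconstruct it when attempting to re-run shared analysis code.
    import sys
    from importlib.metadata import distributions

    with open("environment.txt", "w") as f:
        # Record the interpreter version used for the analysis.
        f.write(f"python {sys.version.split()[0]}\n")
        # Record each installed package and its exact version (pinned dependencies).
        for dist in sorted(distributions(), key=lambda d: (d.metadata["Name"] or "").lower()):
            f.write(f"{dist.metadata['Name']}=={dist.version}\n")

More complete approaches (for example, containerisation or package-environment managers) also capture the operating system and system libraries, but even a simple record of versions markedly improves the chances that analysis code can be run on a computer other than the researchers’ own.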

 

14. When final research outputs are submitted for peer review via the traditional journal route, it is important that journals and publishers use the review process to ensure that research is reported in a transparent manner and, where data and analysis code are both provided, that the results can be reproduced. We note that the CODECHECK initiative has the potential to play a key role here: researchers submit their analysis code (via https://codecheck.org.uk/), and the CODECHECK team runs the code independently to provide a certificate of executable computation. This approach was used to confirm the COVID-19 modelling work carried out at Imperial College London. Ensuring computational reproducibility should be a standard part of the peer review process.

 

15. Research openness and transparency both have a key role to play in innovation and commercialisation. This was highlighted in a recent report by Elixir in the context of the Life Sciences, in terms of the breakthrough discoveries, scientific excellence, and entrepreneurial endeavours that follow from research openness. We recommend engaging with stakeholders in industry to determine the role of research openness and transparency in subsequent innovation and commercialisation.

 

Academic Incentive Structures

 

16. It is important that institutions ensure that the organisational structures within which researchers work reward engagement with, and adoption of, open and transparent research practices. Academic hiring decisions, annual performance reviews, and promotion are often informed by easy-to-calculate metrics, such as the number of research outputs an academic has produced, the amount of grant income they have generated within a particular period, or the h-index, that do not necessarily reflect research quality. A high score on these metrics does not mean that the underlying science is transparent and robust (often simply that there is a lot of it). Indeed, some of the journals with the highest Impact Factors (indicating how often a typical output they publish is cited) also have the highest retraction rates. Academics need to be incentivised to produce research that is both high-quality and transparent.

 

17. As competition for academic positions increases, academics are incentivised to behave in ways that increase their chances of being appointed to a permanent position, which often simply means publishing a large number of research articles irrespective of their quality. This can encourage a short-term focus on citations and volume. Institutional recruitment and promotion should prioritise and reward conducting research the right way (i.e., producing findings with high certainty), rather than getting exciting (but uncertain) research published. Reproducible research takes longer to produce, so institutions need to recognise this and adjust their assessment criteria accordingly.

 

18. In the same way that researchers’ behaviour will change as a result of changes in how individual researchers are incentivised, Universities’ behaviours and processes will change only if the ways in which Universities are incentivised change. If research income (e.g., via research councils and REF-related QR funding) were to become more dependent on research transparency, then University processes with respect to hiring, performance review, and promotion would inevitably adapt to incentivise researchers to adopt transparent practices in their research workflows.

 

Team Science and Skills Development

 

19. Many of the computational and data skills needed for researchers to conduct research in a fully open and transparent manner are lacking in the scientific community. The 2020 R&D Roadmap highlighted a broad lack of digital skills across the UK workforce; the same holds true for the scientific workforce. Many of the skills needed to engage in research that follows transparent and open principles are computational or digital in nature, and are also highlighted in the 2020 Roadmap and in the 2021 R&D People and Culture Strategy. For the UK to remain globally competitive in STEM research, and to produce research outputs that are open, transparent, and robust, a joined-up approach across all aspects of R&D (including training) is needed. We are delighted that Research England has provided the UKRN with funding to support our ambitious five-year project, which includes a particular focus on training and the sharing of good practice. We recommend a sustained focus on, and investment in, digital skills training and infrastructure to ensure the UK scientific workforce remains globally competitive.

 

20. Building open and reproducible research workflows is not a trivial task, and often requires researchers to have competence in software development, data management, and related skills. We recognise that it is unrealistic to expect researchers to be software engineers in addition to being experts in their science. Rather than each individual researcher having the full range of computational and data skills needed for open and reproducible research, it is the research team that should have these skills. The days of the ‘lone genius’ as the model of a scientific researcher are fading fast, if not gone already. There should therefore be wider recognition and reward of the team-based nature of science, and recognition of the critical role that research software engineers and data stewards play in scientific research.

 

21. Too much emphasis is often placed on the role of the lead scientist in scientific endeavours. This results in individuals receiving the credit for what is more often than not a team-based discovery. As Professor Dame Ottoline Leyser, Chief Executive of UKRI, has said: “We need to build a truly inclusive system that values and nurtures a much wider range of careers and career paths”.

 

Learning From Others (and our past selves)

 

22. While organisations within the UK are successfully raising awareness of issues around transparency and reproducibility in science, it is important to recognise that other countries are also working in this space, and arguably are further ahead in terms of a coherent national science policy. France has recently launched its Second National Plan for Open Science, to run from 2021 to 2024: “[Open Science] is firmly attached to a European-wide vision and, in the context of the French presidency of the European Union, proposes to act in favour of open science being effectively taken into account in both individual and collective assessments for research. This involves initiating a process of sustainable transformation in order to ensure that open science becomes a common and shared practice, encouraged by the whole international ecosystem of higher education, research and innovation.”

 

23. In 2018 the League of European Research Universities (LERU) published an advice paper detailing a roadmap for change in research culture that captures issues related to transparency and reproducibility under the broader banner of Open Science. The roadmap provides 41 recommendations for how this change can be brought about, and is built upon the European Commission’s eight ambitions for Open Science, one of which focuses entirely on reproducibility and research integrity. Indeed, the EU has recently produced a scoping report on the topic of reproducibility in science.

 

24. The UKRN Institutional Members welcome the publication of the G7 Research Compact, which recognises the importance of transparency, openness, and collaboration in research.

 

25. Networks modelled on the UKRN have been created in other countries, providing the opportunity to share knowledge and to develop a globally integrated approach to challenges related to research openness and transparency. Activities that encourage international dialogue should be promoted, because action to improve transparency and openness in research must occur not just within the UK, but across the global scientific community.

 

26. There have been several previous attempts to address issues around research transparency and rigour in the UK. In 2015 the BBSRC, MRC, Wellcome Trust, and Academy of Medical Sciences commissioned a report into reproducibility and reliability in biomedical research. In 2016 Jo Johnson (the then Minister of State for Universities and Science) established the Open Research Data Task Force, which reported in 2018. Both reports produced guidance on how best to improve openness, transparency, and reproducibility in science.

 

27. We suggest that the UK Government start putting into practice the recommendations of previous reports into reproducible and open research; otherwise we will be in danger of an endless cycle of consultation and re-invention with respect to scientific transparency while, at the same time, other countries develop and implement detailed, financially sustainable, long-term, and joined-up research strategies focused on openness, transparency, and reproducibility. Failure to do this will have negative consequences not just for UK science, but also for innovation and the subsequent commercialisation of scientific discovery.

 

Conclusion

 

28. This inquiry provides a unique opportunity to develop an ambitious UK science vision centred on research openness and transparency: one that will improve the robustness of research findings, improve public trust in science, maximise the effectiveness and impact of research funding, and provide a strong foundation that places research openness and transparency at the heart of UK innovation. We hope that the Committee will act on this opportunity.

 

 

(September 2021)