Written Evidence Submitted by the Russell Group
(RRE0086)
1. Integrity in research is crucial to maintaining scientific excellence and retaining the public’s trust in science. Russell Group universities are committed to conducting rigorous and open research, delivered through high-quality methods and high standards of research integrity.
2. Many of the factors which have an impact on reproducibility stem from wider issues in fostering a positive overall research culture and environment. We support an inclusive and nuanced understanding of reproducibility, expanded to acknowledge replicability, reliability and transparency as equally important.
3. Systemic incentives in the research landscape contribute to problems with reproducibility, including over-reliance on certain publication or citation metrics, promotion criteria, and publication bias. Researchers need to be equipped with the right skills, and greater acknowledgement should be given to the financial and time resources required for fully reproducible and transparent studies.
4. Funders, universities, researchers and publishers should work together to support reproducibility and transparency in research, for example through the actions set out in this submission.
5. We support an approach of coordinated multilateral action by all actors in the sector on reproducibility and transparency in research as opposed to any unilateral action from Government, which may generate far greater costs for UK research.
6. We welcome the establishment of UK CORI and propose it use its unique convening power as a catalyst for national discussion on the reproducibility and transparency of UK research.
The Russell Group would like to thank the members of the Russell Group Research Integrity Forum for their contribution to this submission.
1. Introduction
The purpose of the Russell Group is to provide strategic direction, policy development and communications for 24 major research-intensive universities in the UK. We aim to ensure that policy development on a wide range of issues relating to higher education is underpinned by a robust evidence base and a commitment to civic responsibility, improving life chances, raising aspirations, and contributing to economic prosperity and innovation.
We welcome the opportunity to provide evidence to this inquiry. This submission reflects input from the Russell Group Research Integrity Forum (RGRIF). The RGRIF was established in 2013 and is a network of the professionals with lead responsibility within their universities for fostering research integrity and associated areas, including reproducibility and transparency, research ethics and the development of research environments which support the delivery of world-leading research and impact.
The work of the RGRIF is driven by external policy developments such as the Concordat to Support Research Integrity, coupled with an internal recognition of the importance of ensuring continued public and regulatory confidence in research. The RGRIF is therefore well-placed to advise the Committee on this inquiry.
Integrity in research is crucial to maintaining scientific excellence and retaining the public’s trust in science. Russell Group universities are committed to conducting rigorous and open research, delivered through high-quality methods and high standards of research integrity. Many of the issues which have an impact on reproducibility stem from wider issues in fostering a positive overall research culture and environment.
In our report “Realising Our Potential: Backing Talent and Strengthening UK Research Culture and Environment”1 we set out some of the drivers which shape our research environment, and in the accompanying Toolkit of Ideas we have collated practical suggestions for all stakeholders in the research ecosystem to foster a positive research culture and environment, which in turn helps to uphold high standards of research integrity.
We wish to highlight that there is no agreement across the researcher community that ‘reproducibility’ is a meaningful concept in all disciplines. While ‘reproducibility’ is often treated as synonymous with ‘trustworthy’, the term is in fact a catch-all for a range of different ways of evaluating the quality of research. For example, it could be taken to mean: a study which gives the same results when repeated; a study which can be run again via repeatable methods; data which can be re-analysed reliably (computational/analytic reproducibility); and sometimes even generalisability (results which can be (re)applied to a wider population than the study sample). Indeed, there will always be some types of research where reproducibility would be practically impossible, but the results produced would remain valid.
Therefore, we support a more nuanced and inclusive understanding of the term ‘reproducibility’ which acknowledges replicability, reliability and transparency of results as equally important components. This more nuanced understanding is important in order to reflect those parts of the research landscape where results are valid but imperfect information makes them irreproducible. Without this clear understanding,
1Russell Group, Realising Our Potential: Backing Talent and Strengthening UK Research Culture and Environment (2021) https://realisingourpotential.russellgroup.ac.uk/
issues of reproducibility could be viewed as reflecting poor practice and fraud, which is rarely the case.
In response to the Committee’s question about the breadth of the “reproducibility crisis”, a key challenge is the lack of robust data to support this notion. While there are several insightful studies into reproducibility, more work is needed for a full understanding of whether and where problems lie. Whilst anecdotal evidence suggests STEM subjects are more affected, this is no doubt in part because these disciplines have made more progress in making results and data openly accessible, and because they lend themselves more easily to reproducibility in theory.
There are certain existing incentives in the research landscape which are not currently well-aligned to support reproducibility, arising from organisational structures and hiring and promotion practices within universities, as well as from publishing processes and formats.
There is a long-established critique of the research community’s disproportionate focus on securing publication in specific journals or with particular monograph publishers, and of over-reliance on certain publication or citation metrics.2 Systemic pressures, such as a culture of long working hours to meet a succession of short deadlines, high levels of bureaucracy and reporting, and pressure to publish scientific outputs regularly to secure grant funding and promotion, drive volume at speed. As a result, time-intensive processes such as curation of data for publication and re-use can be neglected, leading to mistakes or questionable research practices.
A culture shift is required to address the wider incentives affecting reproducibility and transparency, and many universities are working hard to consider the wide range of valuable contributions and research outputs made by researchers. For example, all Russell Group universities are signed up to the San Francisco Declaration on Research Assessment (DORA), which advocates for more holistic and balanced ways to assess the outputs of scholarly research (and thus researchers themselves), including discouraging the use of inappropriate research metrics as proxies for quality.3
Publishing industry processes have not yet fully moved to support reproducibility and transparency, and there are further steps which could be taken. For example, it is not yet universal practice for data and methodology to be published alongside research outputs, or for the peer review process to include these supplementary materials. A preference for publications which demonstrate novel, positive outcomes can create a perverse incentive, and in some circumstances could encourage researchers to apply selective models and methodologies to the data which generate the preferred types of outcomes.
Russell Group Universities recognise the importance of training and skills for researchers to support a culture of research integrity. There are many existing initiatives and efforts to equip
2 DORA (2021). San Francisco Declaration on Research Assessment (DORA). [online] Available at: https://sfdora.org/. For example, the San Francisco Declaration on Research Assessment (DORA) notes that “There is a pressing need to improve the ways in which the output of scientific research is evaluated by funding agencies, academic institutions, and other parties.” One of its key recommendations is to assess research on its own merits rather than on the basis of the journal in which the research is published, and it highlights the limitations of the Journal Impact Factor and similar metrics.
3 DORA (2021). San Francisco Declaration on Research Assessment (DORA). [online] Available at: https://sfdora.org/
researchers with the skills for good research, such as appropriate research methodology, research data management (including balancing openness with participant rights), open access and consideration of ethical issues. Many of our universities have set up dedicated research methods and advice centres aligned with the UK Reproducibility Network, and other initiatives include embedding statistical support in research units. There are also examples of Russell Group universities running specific sessions to more clearly articulate expectations around reproducibility.
This type of training is integral to academic training programmes and has become more prevalent across universities. However, provision is not yet at the desired level in all cases and in all universities, and more can be done to deliver the skills and training that affect reproducibility with greater consistency, ensuring skills are continuously updated to support best practice. Strengthening training provision in universities will have resource implications.
The resources required to produce a fully reproducible and transparent study are significant and must be appropriately acknowledged by universities, publishers and funders. Activities which support reproducibility and transparency carry significant financial and time costs for researchers and their institutions. For example, financial resources are required for the hosting of data, and it takes time to deliver enhanced data management and additional project documentation. Funds to support these activities are not always available from external funders, and even where they are, there is a perception that applying for such funding will make applications less competitive.
The research sector is likely to be much more successful in maintaining a culture of good practice and high standards of research integrity if funders, universities, researchers and publishers all work together in a concerted and sustained effort. Below we set out the role of the different actors in the sector in supporting reproducibility and transparency in research, existing examples of good practice and further steps that can be taken.
One of the most powerful drivers shaping our research environment is the way funding decisions are made, which incentivise researchers and organisations to value the behaviours and outputs that funders reward. We welcome the active role that funders play in the wider discussions around improving research culture and environment.
Some of the underpinning apparatus of reproducibility is better provided at funder level, both because of financial economies of scale and because funders can more effectively support visibility and access. Facilitation of pre-registration of research studies in certain disciplines is an example of good practice that promotes transparency and helps avoid questionable research practices such as hypothesising after the results are known. Examples include the Wellcome Trust’s backing of the AllTrials campaign and its requirement for clinical trials to be registered in a recognised registry4, the ESRC data repository5, and the Health Research Authority’s ‘Make it public: transparency and openness in health and social care research’ strategy.6
4 https://wellcome.org/grant-funding/guidance/clinical-trials-policy
5 https://www.ukri.org/manage-your-award/publishing-your-research-findings/submit-datasets-if-you-have-funding-from- esrc/
Good practice in research can also be supported via a co-ordinated approach across different funders to simplify the process and requirements of funding. Alignment of their policy requirements across different issues (not limited to reproducibility and transparency) would help pull together best practice across the research sector and reduce the bureaucratic load on researchers. This could, for example, ultimately result in the development of a reproducibility standard akin to Open Access.
Further steps funders could take to support reproducibility and transparency include changes to funding requirements, for example a more explicit acknowledgement that negative research outcomes are acceptable. Funders could also recognise the additional costs of making research reproducible and transparent, and encourage reproducibility by allowing research grant applications to include funds to resource it. Funds could be made available for replication studies, or for research which supports reproducibility or enhances its reputation and value to the academic community. Finally, funders could ask applicants to evidence their commitment to open research.
As the country’s leading research-intensive universities, home to half of all academics carrying out research at UK higher education institutions7, Russell Group universities have a central role to play in driving a positive research culture and environment. This includes putting in place effective policy frameworks, standards and expectations for upholding the values of research integrity, reproducibility and transparency.
One of the ways in which universities are already upholding these standards is by demonstrating commitment to responsible research assessment in line with the San Francisco Declaration on Research Assessment (DORA) principles. It is important that universities translate these principles into action, and also seek to ensure that their hiring and promotion criteria, application processes for internal funding, and internal awards incentivise positive culture and good practice. Universities should acknowledge the resources and time required to ensure research is reproducible and transparent, and align workloads and incentives to support this.
Universities have a responsibility to provide appropriate levels of support, training and resource to researchers to ensure they have the skills and tools needed for good practice in data management and research practices. As noted above, whilst there are already many examples of this in universities, more can be done to ensure this is embedded into researcher training in a consistent manner.
Individual researchers can play a crucial role in driving bottom-up change. Principal Investigators and research leaders are well-placed to promote and normalise the values of transparency and integrity amongst their teams. They can lead by example on reproducible and transparent research, giving researchers the time to do careful work, undertake appropriate training, and support mentoring schemes and discussion groups.
They are also able to build transparency into the research design process from the beginning, which can include having open discussions with the research team and funders about what transparency looks like for individual projects (taking account of disciplinary
7 HESA Staff Data 2019/20. Russell Group universities employ 49% of academic staff (FPE) who are on ‘research-only’ and ‘research and teaching’ contracts at UK universities.
differences), and whether data can and should be made freely available, and if so, in what format.
Researchers also have a crucial role in helping the sector understand the scale of the issue and in highlighting good and bad practice. As researchers are also peer reviewers and journal editors, they are essential in asserting and promoting the more nuanced understanding of reproducibility and encouraging journals and other publishers to uphold high standards of transparency in research.
As stewards of the academic research record, publishers share the responsibility to support a productive research culture by publishing only high-quality, transparent research. The Russell Group Toolkit of Ideas, which accompanies our report “Realising Our Potential: Backing Talent and Strengthening UK Research Culture and Environment”,8 sets out some suggestions and ideas that publishers could explore, including how research culture and environment aims could be built into the assessment of grant applications, and ways to recognise and reward a wider range of activities that contribute to an internationally excellent research environment.
We propose that publishers take steps to implement policies which address publication bias towards significant results at the expense of negative-outcome and replication studies. One way this could be achieved is by requiring interpretable datasets as part of publication, together with clear descriptions of methods and practices, making the underlying data easier to understand and the research easier to replicate.
To further address this issue, publishers could explore and invest in new formats for publishing null results, in collaboration with universities. They could also consider asking authors to provide data availability statements (indicating whether research data have been shared and if so, how to access it) and publish annual figures on the proportion of publications for which data have been made available. Finally, encouraging or requiring the pre-registration of an experimental design or data analysis plan could also help to encourage more open and transparent research practices.
We welcome the leadership that the Science and Technology Committee has shown on key issues in research, including reproducibility and research integrity, which are vital to the health of the UK research landscape. The Government has also taken positive action on Open Access and research culture, which will bring tangible benefits to researchers and the sector more widely. On the issue of reproducibility and transparency in research, we support coordinated multilateral action, with all actors in the sector, including universities, funders and publishers, working together to make positive progress in addressing these challenges. This is preferable to unilateral action from Government, which may generate far greater costs for UK research.
We welcome the establishment of the new UK Committee on Research Integrity (UK CORI) and would encourage this committee to engage fully with the reproducibility agenda as a core part of its work. UK CORI will have a unique convening power in the UK research integrity landscape. We would encourage UK CORI to work with other national-level stakeholders such as the UK Reproducibility Network, UKRIO and the RGRIF to act as a
8 Toolkit of Ideas: https://russellgroup.ac.uk/media/5924/rce-toolkit-final-compressed.pdf?=section2
catalyst for national discussion on the reproducibility and transparency of UK research, bringing together universities, publishers, funders, industry and others. This might involve leading on focussed consultations on key challenges, arranging national conferences and events and developing national best practice guidance.
UK CORI can also help to facilitate the sharing of good practice examples across disciplines and help to explore the unique and shared challenges of quantitative and qualitative research fields. This might include a registry of useful resources, the development of baseline universal expectations, and/or guidance on some of the limits to reproducibility such as limits to what data can be made public.
UK CORI may also choose to take a leadership role in monitoring and reporting on reproducibility, and on efforts to encourage it, in UK research. We would encourage this to take a sector-level approach rather than focusing on the reproducibility or otherwise of individual studies. We would also encourage UK CORI, in this area and others, to work with funders to co-ordinate and, where possible, align funding conditions so that universities and researchers can easily understand what is expected.
(September 2021)