Written Evidence Submitted by ELIXIR-UK





Prof. Carole Goble

University of Manchester / Head of Node for ELIXIR-UK


Prof. Neil Hall

Earlham Institute / Head of Node for ELIXIR-UK




ELIXIR-UK is the UK node of ELIXIR, an intergovernmental organisation that brings together life science resources from across Europe. These resources include databases, software tools, training materials, cloud storage and supercomputers. ELIXIR-UK currently has 20 member organisations across the UK. This document has been developed by the heads of the ELIXIR-UK node and reflects the position of ELIXIR-UK.


ELIXIR-UK welcomes this inquiry. Our evidence has three major threads:


the breadth of the reproducibility crisis and what research areas it is most prevalent in;


As part of a 2021 pilot study for BioFAIR, a proposed data commons for the whole of the UK biosciences research community, ELIXIR-UK and Technopolis Inc carried out a survey of around 300 life scientists. We found that the majority (75%) of respondents share their research data with the community ‘wherever possible’, but only a minority (30%) apply FAIR principles as far as they are able. In particular, fewer respondents ensured that their data was interoperable, which limits its reusability.

the issues in academia that have led to the reproducibility crisis;


Our survey of life science researchers found that the barriers to sharing research data were insufficient resources (time or money), a lack of knowledge and training, and a lack of relevant automated pipelines to facilitate data preparation and transfer.


When asked about capability, 45% of respondents stated they had never participated in any formal training activity relating to research data management, with most relying on the advice or support of colleagues and otherwise being self-taught.


When asked what resources they need to be more reproducible, respondents cited access to training, access to data and data analysis infrastructure, and technical stewardship support.


the role of the following in addressing the reproducibility crisis: research funders, including public funding bodies; research institutions and groups; individual researchers; publishers; and Governments; and the need for a unilateral response / action.


Groups: Organisations such as the European Research Infrastructures in which the UK participates provide the opportunity to ensure that openness and reproducibility are prioritised not just in the UK but across Europe and beyond. The ELIXIR-UK node and its 20 university and research institute members are working with UKRI and the Wellcome Trust, as well as other UK organisations such as UKRN, to support knowledge sharing, promote transparency and openness in research, and provide practical digital infrastructure and data/software stewardship support to enable reproducibility. An internationally integrated approach to challenges avoids replication of effort. Science does not respect national borders - promoting transparency and openness in research must occur not only within the UK, but across the international scientific community.


Research Institutions: Open research professionals such as data stewards and research software engineers (RSEs), in or available to our research performing organisations, are essential to the adoption of open and transparent research practices. Our European competitors recognise this by investing in national data stewardship and RSE programmes. These career paths must be formally recognised and supported by institutions, and the individuals on these paths must be funded specifically for this activity.


Funders: UKRI councils should require the inclusion of a costed activity related to research transparency in funding requests, with a research transparency plan extending the current data management plans, and investigators held to account for delivering on that plan. Open and FAIR data is only feasible if digital infrastructure is readily available to support the registration of data and its access by transparent computational processing software.


Universities, national assessments such as REF, and publishers have a duty to recognise and reward the adoption of reproducible and transparent research practice, incentivise the right behaviours, and provide the infrastructure and support for these behaviours to flourish. Currently, questionable research practices are rewarded by metrics that value rapid publishing above all other criteria and by the emphasis on assessing the individual rather than the team. One of the best-known researchers in the field of (non)reproducible science, John Ioannidis of Stanford University, declared that “sloppy science wins”: careful scientists are slowed down by their practices and publish less, whereas sloppy scientists who rapidly publish research that later proves unreproducible or even flawed have still beaten their more careful competitors in the recognition race.


what policies or schemes could have a positive impact on academia’s approach to reproducible research;


Researchers need stronger computational and data skills to carry out open research. This was highlighted at a broad level in the 2020 R&D Roadmap, and the same applies to the scientific community. There must be sufficient accessible training in open data management for researchers at all career stages. This has recently been recognised by UKRI with the Innovation Scholars: Data Science Training in Health and Bioscience call, but the area requires ongoing funding to ensure sustainable and accessible training for all. This is crucial to ensure that the UK retains a globally competitive STEM research sector with open and transparent research outputs. We recommend a sustained focus on, and investment in, digital skills training and infrastructure to ensure the UK scientific workforce remains globally competitive.


Research professionals must be financially supported by institutions and UK funding bodies to build a pool of national capability. Professionalisation of research data and software management is essential to make our research teams reproducible without requiring biologists, chemists and geoscientists to also become data stewards and software engineers.


Research assessment at all career stages and across all stakeholders needs to reward research transparency rather than penalise it.


Systematic and continued investment in infrastructure supporting open and FAIR data, and open and FAIR data processing using software, is essential. Otherwise, we are making policy with no route to practice.


how establishing a national committee on research integrity under UKRI could impact the reproducibility crisis.


We generally welcome the creation of a Committee on Research Integrity (CORI) within UKRI as an independent body to examine issues around research integrity, while recognising that issues around research transparency are not necessarily the same as issues related to research integrity.


In terms of researcher behaviour, the focus should be on developing norms around research transparency, a process that supports and mandates these, and a culture that values people who work to improve on those norms. Criticising hard-working individuals for what is actually an organisational and multi-stakeholder issue is neither helpful nor productive. A report from the 2021 NIH workshop on changing research culture, attended by over 1,500 delegates, highlights that FAIR data “requires a village”, identifying seven dimensions in play across stakeholders and pointing out that this is “not a simple matter of individual practice, but one of infrastructure, institutions, and economics. Governments, funding agencies, and international science organizations all will need to invest in commons approaches for data sharing to develop into a sustainable international ecosystem”.


We should also learn from our national competitors and collaborators. France has recently launched its excellent Second National Plan for Open Science, and the EU has recently produced a scoping report on reproducibility in science.


We urge CORI to build on the extensive body of prior inquiries, reviews, policy documents and efforts, particularly around open and FAIR data and software, and to act on their recommendations.


(September 2021)