Written Evidence Submitted by the Royal Academy of Engineering




  1. As the UK’s national academy for engineering, the Royal Academy of Engineering welcomes the Science and Technology Committee’s inquiry as drawing attention to the importance of reproducibility and research integrity. The Academy’s submission has been informed by the expertise of its Fellowship, which represents some of the nation’s best practicing engineers, including leading researchers, industrialists, innovators and entrepreneurs. Committed as we are to making the UK the world’s leading nation for engineering innovation, and to improving public awareness and understanding of engineering, we are concerned about any issues that could undermine the benefits that research has for society or the trust that the public has in research. We previously responded to the Committee’s research integrity inquiry in 2017[1].


  1. The Academy provides leadership for the profession on ethics in professional practice, to which reproducibility and integrity are closely linked. The Academy and the Engineering Council have together produced a statement of ethical principles, the first of which are honesty and integrity[2]. This statement was produced through discussions with engineers from a number of different engineering institutions and with philosophers specialising in applied ethics. It is intended as a statement of the values and principles that guide engineering practice, supplementing the codes of practice published by the various engineering institutions. The Academy has ongoing work with the Engineering Council in this space: in June 2019, the two bodies established a joint Engineering Ethics Reference Group (EERG), chaired by Professor David Bogle FREng CEng. The EERG has a strategic-level remit, with a leadership and advisory role, to shape the profession’s ethics-related activity and to steer an enhanced culture of ethical behaviour among those working in engineering.


The breadth of the reproducibility crisis and what research areas it is most prevalent in

  1. The extent of integrity issues in engineering research, including reproducibility, has not been explored or investigated to the same extent as in some other disciplines, such as the medical sciences or psychology. We believe it would be complacent to take this current absence of evidence as proof of an absence of issues; further investigation of the extent of these issues in engineering and related fields may therefore be warranted before firm conclusions can be drawn. We believe it would be beneficial if such investigations were undertaken by a variety of stakeholders (engineers and non-engineers, employers of researchers, funders of research and so on) to give diversity of experience and evidence.


  1. Some work has been undertaken in engineering to explore reproducibility, mostly at the level of international communities. This includes work on computer modelling and simulation research[3], artificial intelligence[4] and machine learning[5], chemical engineering[6], and measurement science[7]. Research – and not just in engineering – is becoming increasingly computational, multidisciplinary and global in scope, which means that the importance of code and data is growing[8]; the importance of reproducibility and replicability will therefore grow too. Standards and best practices related to reproducibility developed in these communities should be shared alongside methods.


  1. The often application-focused nature of engineering also means that there could be spillover impacts from research reproducibility on the success of commercial applications and accurate valuations by investors and markets, such as the risk of investors being mis-sold on the near-term application potential. Our perception is that this would be more of a concern in areas where technology is in its infancy.


  1. There are risks in framing issues with reproducibility as a crisis, especially in the face of insufficient substantiating evidence and a diversity of definitions. Such framing has the potential to increase public mistrust in research; it would be better to frame reproducibility as an ‘ideal’[9] rather than a crisis. A single-minded focus on reproducibility will not necessarily lead to excellence or good results in research; the approach must be carefully considered and appropriate to the subject matter. In addition, the breadth of definitions of reproducibility reflects the diversity of methods across research disciplines, and therefore a diversity of ideals to be aimed for. Reproducibility is a spectrum, and all areas can learn from one another in aiming for the ideal.


  1. There are different definitions of and related to reproducibility, which need to be factored into discussions and into consideration of the existence or nature of any potential ‘crisis’. Some examples that resonate with the engineering community are:
    1. A recent European Commission scoping report[10] contains the following: “we consider reproducibility as a continuum based on three main research processes: reproduction, replication, and re-use. We use the term ‘reproduction’ (and reproducibility stricto sensu) to refer to the re-enactment of a study by a third party, using the original set-up, data and methodology of analysis (e.g. for certification). We use ’replication’ for more general re-enactment of the results, using the same analytical method, but on different datasets (e.g. for comparison). And we use ‘re-use’ for the more loose possibility to re-use the results beyond the original research context, both inside and outside the original scientific discipline (e.g. also for innovation, for transfer, for transdisciplinary research).”
    2. The National Academies of Sciences, Engineering and Medicine in the USA[11] define reproducibility and replicability as follows:
      1. reproducibility means computational reproducibility – obtaining consistent computational results using the same input data, computational steps, methods, code, and conditions of analysis.
      2. replicability means obtaining consistent results across studies aimed at answering the same scientific question, each of which has obtained its own data. Non-replicability occurs for a number of reasons that do not necessarily reflect that something is wrong.


  1. It should be recognised in discussions that reproducibility and other issues within disciplines will vary internationally, and that working with ongoing endeavours could contribute not only to improvements to research and reproducibility, but also to other goals, such as sustainable development, in the longer term. For example, consideration is being given to regional and continental nuances of open science relevant to reproducibility across Africa. There, science systems operate largely independently of one another, leading to incompatible policies, practices and datasets, and there is a need to accommodate indigenous knowledge systems expressed in multilingual formats in order to expand and assure access for the millions of Africans who are likely to exploit scientific knowledge for creativity and innovation[12].


The issues in academia that have led to the reproducibility crisis

  1. The Academy would summarise that many reproducibility-related issues relate to the incentives and publishing system of research, as well as to the timescales involved in research. Incentives underpinning aspects of the research process are shaped by incentives in the publishing system. These favour novel, publishable results and do not encourage careful thinking on reproducibility; there are also limited rewards for considering, conducting and publishing replication studies. Many processes within research are undertaken or occur rapidly and leave little room for confirmation.


  1. Different stakeholders hold the different levers required to enact change, and these will vary by country, discipline and organisation. Different sets of skills are needed to validate code and data; the current system of publications and citations encourages researchers to keep data to themselves; there is a need for greater standardisation; credit should be given for sharing reproducible code or well-described data; and established structures, processes, tools and other assistance are needed with regard to metadata.


  1. In some areas there may be concerns around IP constraints and the sensitivity of data, particularly where industrial partnerships and collaborations are involved. This can limit the opportunities for other researchers to access the data, and therefore the ability to test reproducibility. Some would argue that the open disclosures associated with reproducibility limit competitive advantages or the ability to profitably commercialise innovations[13]. However, UNESCO has concluded that intellectual property rights are not an obstacle to open science, and that correct frameworks can stimulate collaboration and ensure, among other things, that all contributors who share their scientific data, information and knowledge are adequately acknowledged and recognised[14]. The Academy does not believe that collaborations between industry and academia are incompatible with policies and schemes related to reproducibility, such as open access and other open science initiatives. Approaches that frame open science as the default option, where investigators are required to explicitly request permission for confidentiality, with a guaranteed follow-up on whether IP has been protected or exploited, could help.


The role of the following in addressing the reproducibility crisis:

  1. research funders, including public funding bodies;
  2. research institutions and groups;
  3. individual researchers;
  4. publishers; and
  5. Governments and the need for a unilateral response / action.


  1. Issues with reproducibility are affected by multiple elements of the research system, and therefore require all components of the system to act in order to be addressed.


  1. Research integrity and reproducibility are intrinsically linked, and there are already initiatives relevant to the former in existence. As a research funder, the Academy expects the host institutions of research awardees to agree to endorse the commitments of the Concordat to Support Research Integrity, and to have in place formal written procedures and policies to promote and ensure compliance with those commitments.


  1. Funding confidence beyond a single year, together with up-to-date expectations from government, would allow funders to focus more on issues such as reproducibility and integrity. The succession of one-year spending reviews and the uncertainty of recent years have not contributed to stability, and this can have a negative impact on the sector’s ability to invest time and resource in addressing longer-term systemic issues.



What policies or schemes could have a positive impact on academia’s approach to reproducible research:

  1. Some policies or schemes the Academy believes could contribute to improving reproducibility include:
    1. Actively bringing in learning from users and industry - In industry, publication is often not an incentive, but factors relating to quantifiable milestones and regulators’ data integrity requirements often are. Within some areas of industry, the need to ensure data integrity has moved beyond being just a regulatory requirement and is starting to be seen as a driver of business value, through the ability to avoid repeat experimentation and to re-use experimental data with other datasets to derive further scientific insight. This is particularly driven by the opportunities of artificial intelligence and machine learning. Thinking and tools exist in the pharmaceutical and health sectors that can be examined for other settings[15], including academia. However, we are hearing emerging concerns that in some technology sectors the generation of high volumes of publications without due consideration of reproducibility is growing, owing to commercial incentives around promoting the tools used rather than the use of the outcomes of the research.
    2. Efforts to shift the incentives - The measurement of research productivity leaves little space for reproducibility, and does nothing to reduce the incentives for researchers to hoard data and code[16]. Work is needed to create and evaluate options for changing the way we measure the impact of research[17], and research communities and funders should be involved in devising incentives to perform and submit reproducible research. Concordats and initiatives related to research integrity, career development, research assessment and bureaucracy review are all relevant to reproducibility, and any reviews of them should be conducted with a reproducibility lens.
    3. Seeking a viable, long-term business model for reproducible research - Research communities, publishers, funders, and others should work together to achieve this. Conversations around 80% Full Economic Costing and whether research funding should be opened to competition outside universities link to this.
    4. Enforcing progress towards open science - Open science should be the norm, with rare exceptions such as national security grounds or a demonstrable plan to exploit IP through which people can be held accountable. Mechanisms to achieve this could include FAIR/open data, preprints, registered reports (more common in psychology and the life sciences), open licensing, and requirements that information on how data were produced and how to interpret them accompany those data[18]. These could be written into the terms of grants.
    5. Replication studies - In 2016, DARPA’s Biological Technologies Office funded independent validation and verification (IV&V) of the work of eight groups it had funded. Achieving something similar would be relatively expensive, and funding such reproducibility research and related activities is a major challenge at a time of flat budgets for R&D[19]. The cost-benefit balance of such activities also depends on the research in question.


  1. Bodies including the European Commission have also conducted work on reproducibility, complete with actions, which could be useful here[20].


How establishing a national committee on research integrity under UKRI could impact the reproducibility crisis

  1. The UK sets a world-leading example in many areas related to R&D, such as ethics and quality. A national committee on research integrity could allow it to take a similar approach with reproducibility, while also keeping abreast of related initiatives and work. It would be important for the committee to link in with those already doing good work in this space, particularly in light of the global nature of research, including NPL (who are also responding to this call) and the European Commission.


  1. Research cannot be considered to have integrity, or to be excellent, if it is irreproducible. Irreproducibility can often be the result of honest mistakes, but on some occasions it may be deliberate. One option would be for the committee to have a clear mission to educate on best practice, but also to penalise those who deliberately mislead others as to the quality of their research.




(September 2021)

[1] Royal Academy of Engineering. (2017). Research integrity.

[2] Engineering Council and Royal Academy of Engineering. (2005). Statement of ethical principles for the engineering profession.

[3] Coveney, P. V., Groen, D., & Hoekstra, A. G. (2021). Reliability and reproducibility in computational science: implementing validation, verification and uncertainty quantification in silico. Phil. Trans. R. Soc. A, 379, 20200409.

[4] Haibe-Kains, B., Adam, G. A., Hosny, A., Khodakarami, F., Waldron, L., Wang, B., ... & Aerts, H. J. (2020). Transparency and reproducibility in artificial intelligence. Nature, 586(7829), E14-E16.

[5] Pineau, J., Vincent-Lamarre, P., Sinha, K., Larivière, V., Beygelzimer, A., d'Alché-Buc, F., ... & Larochelle, H. (2020). Improving reproducibility in machine learning research (a report from the neurips 2019 reproducibility program). arXiv preprint arXiv:2003.12206.

[6] Han, R., Walton, K. S., & Sholl, D. S. (2019). Does chemical engineering research have a reproducibility problem?. Annual review of chemical and biomolecular engineering, 10, 43-57.

[7] Hanisch, R. J., Gilmore, I. S., & Plant, A. L. (2019). Improving reproducibility in research: The role of measurement science. Journal of Research of the National Institute of Standards and Technology, 124, 1-13.

[8] Baillieul, J. B., Hall, L. O., Moura, J. M., Hemami, S. S., Setti, G., Grenier, G., ... & Moore, K. L. (2017). The first IEEE workshop on the Future of Research Curation and Research Reproducibility.

[9] Lusoli, W. (Ed.). (2020). Reproducibility of Scientific Results in the EU: Scoping Report. Publications Office of the European Union.

[10] Lusoli, W. (Ed.). (2020). Reproducibility of Scientific Results in the EU: Scoping Report. Publications Office of the European Union.

[11] National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and replicability in science. National Academies Press.

[12] Mwelwa, J., Boulton, G., Wafula, J. M., & Loucoubar, C. (2020). Developing open science in Africa: barriers, solutions and opportunities. Data Science Journal, 19: 31, pp. 1–17.

[13] Gans, J. S., Murray, F. E., & Stern, S. (2017). Contracting over the disclosure of scientific knowledge: Intellectual property and academic publication. Research Policy, 46(4), 820-835.

[14] UNESCO. (2021). Towards a UNESCO Recommendation on Open Science. In Online Expert Meeting on Open Science and Intellectual Property Rights.

[15] Lusoli, W. (Ed.). (2020). Reproducibility of Scientific Results in the EU: Scoping Report. Publications Office of the European Union.

[16] Baillieul, J. B., Hall, L. O., Moura, J. M., Hemami, S. S., Setti, G., Grenier, G., ... & Moore, K. L. (2017). The first IEEE workshop on the Future of Research Curation and Research Reproducibility.

[17] Baillieul, J. B., Hall, L. O., Moura, J. M., Hemami, S. S., Setti, G., Grenier, G., ... & Moore, K. L. (2017). The first IEEE workshop on the Future of Research Curation and Research Reproducibility.

[18] Baillieul, J. B., Hall, L. O., Moura, J. M., Hemami, S. S., Setti, G., Grenier, G., ... & Moore, K. L. (2017). The first IEEE workshop on the Future of Research Curation and Research Reproducibility.

[19] Baillieul, J. B., Hall, L. O., Moura, J. M., Hemami, S. S., Setti, G., Grenier, G., ... & Moore, K. L. (2017). The first IEEE workshop on the Future of Research Curation and Research Reproducibility.

[20] Lusoli, W. (Ed.). (2020). Reproducibility of Scientific Results in the EU: Scoping Report. Publications Office of the European Union.