Written Evidence Submitted by UCL
1. As the sector looks to change our research culture for the better, we need to improve both the experience of researchers and the quality of research itself. There is considerable potential to improve the quality and trustworthiness of our research by promoting greater transparency and reproducibility in research. Doing so builds on the Open Science agenda, which has made enormous progress, notably when it comes to Open Access.
2. UCL is committed to transparency and rigour in research across all disciplines, and to continuing to improve the ways in which we conduct research. UCL’s Statement on Transparency in Research sets out expectations for our researchers with regard to making their research transparent and reproducible, while our Statement on Research Integrity sets out our commitment to the highest standards of integrity.
3. Since 2019, UCL has been a member institution of the UK Reproducibility Network, committing to sharing best practice and collaborating across the UK.
4. The findings of a research study are reproducible if they can be obtained in an independent study using the same methods and data as those used in the original study.
5. Research is transparent if the methods, analysis and data are reported and disseminated openly, clearly and comprehensively.
Focussing on research transparency
6. While we recognise the importance of reproducibility, we advocate a focus on research transparency over reproducibility, for the following reasons:
6.1. Transparency is the means to reproducibility: Transparent research practices (also known as ‘open research practices’), including extensive documentation of study design, experimental procedures and/or analytical choices, counter the factors contributing to a distorted evidence base (see paras 14-15) and are necessary for other researchers to reproduce findings.
6.2. Transparency is in our control: Reproducibility is often an unrealistic ideal that is outside of researchers’ control (e.g. when findings are specific to a historical context, such as people’s feelings at the outbreak of the pandemic). By contrast, transparency is a practical, realistic goal. Auditing a policy on transparency is much more feasible, through questions such as: Is the research data freely available?
6.3. Transparency is inclusive: In a number of research disciplines or methodologies, such as the humanities or qualitative research, we may neither expect nor seek reproducibility in research. For example, two historians reviewing the same sources may not be expected to come to the same conclusions, while anthropological research findings may only be applicable to a specific community or culture. The term reproducibility can therefore be perceived as exclusionary, whereas transparency is applicable to all disciplinary contexts.
6.4. Reproducibility is not synonymous with validity: In research contexts where reproducibility is applicable, we may not expect reproducible results owing to a number of factors beyond scientists’ control, such as variation between research participants. This should not automatically call the validity of the original research into question. Transparency, not reproducibility, is the mechanism that enables trust.
6.5. Reproducibility is subjective: How close do the results of a replication need to be to those of the original study to count as ‘reproducible’? Statistical thresholds (such as p < 0.05) can be used here, but these are ultimately arbitrary cut-off points.
7. Reproducibility is important in research contexts where findings must be robust and reliable in order to form a solid foundation on which to build further knowledge.
8. Poor reproducibility of findings is a problem in many different research areas. Most severely affected are the experimental, quantitative sciences, where there are countless analytical choices to be made when designing a study or collecting and analysing data. These disciplines and methodologies also tend to focus on subject matters that are inherently noisy and influenced by many factors. Fields such as experimental psychology, clinical medicine, neuroscience, epidemiology, economics and the social sciences are therefore particularly susceptible to irreproducibility.
9. It is difficult to gauge the extent to which poor reproducibility affects other scientific disciplines, since a focus on reproducibility and replication studies has been lacking elsewhere.
10. Lessons from the above research areas in relation to making research transparent are also valuable for other disciplines, as identified by Huebner and Fell’s Transparency, Reproducibility and Quality (TReQ) toolkit for energy research and other applied multidisciplinary subject areas.
11. Reproducibility is less relevant to the arts and humanities, but transparency is as important in these disciplines as in the quantitative sciences.
Focus on novelty and publication over transparency
12. At present, stakeholders in the sector – funders, institutions and publishers – tend to reward researchers who publish novel findings in journals that have a high ‘impact factor’, rather than promoting transparency and reproducibility. Since these stakeholders all contribute to the incentives in the system, they need to share ownership of shifting these incentives and promoting a culture of transparency.
13. The sharing of detailed research methods and data is necessary for other researchers to reproduce findings. However, given the focus on the findings of research rather than the process, detailed research methods and data are often not shared (although the publication of study protocols is being promoted to improve the transparency and standard of research, particularly in the medical and life sciences). Researchers can also be reluctant to share the details of their work for fear of others finding errors, or of being “scooped” (other researchers publishing findings based on their method or data before they have had the chance to do so themselves).
14. In the present research culture, numerous factors influence the creation, publication and sharing of research findings. The findings that make it through to the literature and are read by other researchers are shaped by the factors below, which lead to a distorted, rather than comprehensive, evidence base. Published findings can therefore be a less clear and trustworthy reflection of the “true” nature of the topic being researched, reflecting these biases instead.
15. Factors contributing to a distorted evidence base include:
15.1. Publication bias: Articles with statistically significant, novel, or “clean” (i.e. uncomplicated) findings are more likely to be published. This can lead researchers to employ questionable research practices (QRPs; see Box 1 in the former Commons Science and Technology Committee inquiry on Research Integrity) to obtain more “publication-worthy” results, and to withhold null findings from publication (the “file drawer” problem).
15.2. Citation bias: Articles with statistically significant (“novel” or “positive”) findings are also more likely to be cited than those with null (or “negative”) findings, which further incentivises QRPs and discourages publication of null findings.
15.3. “Publish or perish” culture: There is pressure from institutions, funders and academics for researchers to publish a large quantity of papers (e.g. a minimum of three papers during a three-year PhD). This pressure to publish can lead researchers to sacrifice research quality for quantity (e.g. using suboptimal methods, cutting corners or making mistakes) and to apply QRPs.
15.4. Short-term funding/contracts: The duration of grant funding and research contracts is often too short to enable researchers to accomplish the originally proposed research. This time pressure can limit research quality and promote QRPs.
15.5. Researcher bias: Cognitive biases such as apophenia (the tendency to see patterns in random noise) and confirmation bias (the tendency to focus on evidence that is consistent with one’s beliefs) can lead researchers to draw false conclusions, if unchecked.
15.6. Analytic flexibility: The same research question can be analysed in a multitude of different ways, and variations in analytic choices can lead to different conclusions, limiting reproducibility.
16. In addition to the above points, which are relevant across multiple disciplines, there are also discipline-specific issues. For example, validation of commercial antibodies (experimental evidence of an antibody’s suitability for a purpose) is necessary for molecular biologists to generate research findings that others can reproduce. But validation is time-consuming, so it is often conducted neither by the companies selling antibodies nor by the scientists using them, and it is not requested by journals.
17. Grant application forms should include sections in which researchers can detail a) their past efforts in making their research transparent and b) their plans for making their proposed research transparent (and an opportunity to highlight if data sharing may not be appropriate); both should be considered in award decision making.
18. We would welcome a clear message from UKRI and other funders to incorporate time (e.g. 2-3 months) into bids for making research transparent, including through data management, writing up detailed research methods, and sharing data and code. Similarly, we would encourage funders to review bids to ensure transparent research practices are factored into the timeline.
19. There is a need for greater monitoring and accountability. Funder policies should require publication of null findings (as a preprint and/or journal article) and sharing of data and code, unless there is a legitimate exception. Ralitsa Madsen of UCL and Chris Chambers of Cardiff University have developed a template Universal Funders Policy that mandates and rewards the open deposition of all records associated with a publication.
20. We would welcome a) the incorporation of a light touch reminder to publish findings and share research data and code into the end-of-grant communications with grantholders, and b) a follow-up communication (this could be as simple as an automated email) after the end of the grant to ask for a statement on the availability of the findings, data and code.
21. Funder policies should encourage researchers to set up agreements with international and commercial partners at the outset with regards to plans for sharing findings, data and code.
Transparency in publication
22. Funders should consider working with publishers to trial the Registered Reports funding model.
23. UKRI could consider creating an Open Access journal for UKRI-funded research (similar to Open Research Europe for EU-funded research) that has transparency embedded throughout its processes. Publication tiers could confer prestige, ensuring it is seen as an attractive option for publication.
24. To improve our understanding of reproducibility, funders should support a) replication studies, including as part of research training for student researchers (including PhD students) across disciplines, and b) meta-research, i.e. the study of research itself.
25. In its position statement on meta-research, the UK Reproducibility Network notes the lack of funding for meta-research, noting that “investment [in meta-research] will repay itself several times over, by ensuring the quality of the research we produce”. We would welcome the introduction of meta-research funding (which could focus on research on transparency/reproducibility) across the UKRI councils (since meta-research funding is particularly lacking in non-biomedical disciplines), including via specific funding calls that do not compete with the standard grants that often emphasise novelty.
26. Teaching and training on research transparency should be factored into research programmes (especially doctoral programmes) commissioned or delivered by funders, including UKRI.
Future Research Assessment Programme
27. The Research Excellence Framework has provided a major stimulus to Open Access and could do the same for open research practice. The Government, UKRI and devolved funding bodies should remodel the environment statement of the Future Research Assessment Programme to give greater recognition to institutions promoting openness and transparency in research. Colleagues from the University of Glasgow and UCL will shortly be publishing a proposal for how to achieve this.
28. Alongside funders, institutions have a role to play in incentivising, rewarding and enabling transparent research practices as follows:
28.1. Recognition and reward systems, i.e. recruitment processes, hiring criteria, appraisals and promotion criteria, should recognise and reward contributions to research transparency, and be in line with the Declaration on Research Assessment.
28.2. Institutional policies should set out what is expected of researchers in terms of making their research open and transparent, e.g. the UCL Statement on Transparency in Research.
28.3. Training on the importance of transparency in research and how to make research transparent should be provided, including during PhDs, and for those in multidisciplinary fields.
28.4. Infrastructure should be offered by institutions to enable researchers to practise research transparently. This could include data repositories (e.g. the UCL Research Data Repository) and the provision of Electronic Lab Notebooks.
29. Groups of institutions, such as the UK Reproducibility Network, have a role to play in supporting coordination between institutions and sharing of best practice. For example, the UCL Statement on Transparency in Research was adopted by UKRN as a template statement for other institutions to draw from when developing their own policies.
30. Researchers have a responsibility to make their research open and transparent to maximise the value resulting from their research. Transparent research practices vary considerably across disciplines; expectations on researchers should take into account disciplinary variations. Where appropriate, researchers should pursue the following practices:
30.1. Open research: Make research methods, software, outputs and data open (and make data FAIR) and available at the earliest possible point
30.2. Publication: Publish null findings; consider publishing preprints
30.3. Transparent reporting: Report research in line with recognised reporting guidelines, disclosing all tested conditions, analysed measures, results, statistical methods and assumptions
30.4. Replications: Carry out and publish replication studies
31. Exceptions exist where research data should not or cannot be shared, owing to privacy, non-consent, contractual agreements, legislation or practicality; examples are covered in the UCL Statement on Transparency in Research.
32. Publishers can shift the focus to the quality of research rather than its novelty by incorporating research transparency into their acceptance criteria, and encouraging and enabling publication of null findings and replication studies.
33. There are also a number of specific practices that publishers can implement:
33.1. Require the provision of detailed methods in papers submitted for publication, and ensure there is adequate space within publication requirements to do so
33.2. Offer open science badges for ‘preregistration’, ‘open methods’ and ‘open data’
33.3. Mandate data and code availability statements
33.4. Signpost and encourage use of reporting guidelines
33.5. Create a section in journals for the publication of replication studies of research originally published in that journal
33.6. Employ separate reviewers, such as data experts, to review research data and code
34. There is value in publishers coordinating with each other to ensure a consistent approach.
35. The UK Government funds, commissions, conducts and consumes research. With respect to its role as a funder of research, see recommendations under section C(i).
36. In a system with limited resources, researchers and institutions are less able to invest time in the transparency initiatives set out in sections C(ii)-(iii).
37. In particular, limited resources result from the combination of a) the funding of research considerably below its full economic cost (fEC), at an average of 70% of fEC, and b) the decline in QR funding by 17% since 2010. As a consequence, UK HEIs experienced a research deficit totalling £4.6bn in 2019/20 alone, and this deficit is growing year on year.
38. There is an urgent need for the Government to increase fEC (as considered in the R&D roadmap) combined with an increase in QR relative to project funding. This would free up resources for researchers to implement transparent research practices and for institutions to deliver transparency initiatives, promoting quality over QRPs. We appreciate this would come at the expense of the number of projects; we would challenge the notion that quantity of research projects is preferable to quality of projects. Fewer, higher quality research outputs are surely of more benefit to society.
R&D People & Culture Strategy
39. The Government’s R&D People and Culture Strategy notes the importance of the open research agenda. The implementation of the Strategy should include steps to make research more transparent – such as those recommended in this submission – ensuring coordination between stakeholders.
40. To provide transparency and accountability of policy making, Government-commissioned research tenders should encourage/require the use of transparent research practices, while the degree of transparency of a study should be taken into account in evidence synthesis and policy making.
41. Policies are recommended throughout this submission, including relating to research funding (including fEC and QR (paras 36-38) and meta-research (paras 24-25)), funder (paras 19-21) and institutional policies (para. 28.2) on expectations for researchers; policies relating to recognition of transparency in grant applications (paras 17-18), hiring and promotion criteria (para. 28.1) and publication criteria (para. 32); and publisher policies relating to publication practices (para. 33).
42. The national Committee on Research Integrity should seek to facilitate and support the research community rather than taking a top-down or regulatory role. The Committee has a role to play in:
42.1. Developing, identifying and sharing best practice across all disciplines, engaging with sector networks including the UK Reproducibility Network
42.2. Working with funders and publishers to encourage consistency of policies relating to transparency and reproducibility and recognising exceptions
42.3. Working with organisations (e.g. National Academies) to develop discipline-specific expectations and examples, to ensure an inclusive approach
42.4. Facilitating discussions between stakeholders including researchers, funders and publishers
This submission is based on input from individuals working across UCL in a range of disciplines.
References
Madsen, R. (2019). Scientific impact and the quest for visibility. The FEBS Journal 286(20), 3968-3974.
Fanelli, D. (2010). “Positive” results increase down the Hierarchy of the Sciences. PLoS ONE 5, e10068.
Franco, A., Malhotra, N. and Simonovits, G. (2014). Publication bias in the social sciences: unlocking the file drawer. Science 345, 1502-1505.
De Vries, Y., Roest, A., de Jonge, P., Cuijpers, P., Munafò, M. and Bastiaansen, J. (2018). The cumulative effect of reporting and citation biases on the apparent efficacy of treatments: the case of depression. Psychological Medicine 48(15), 2453-2455.
Duyx, B., Urlings, M.J.E., Swaen, G.M.H., Bouter, L.M. and Zeegers, M.P. (2017). Scientific citations favor positive results: a systematic review and meta-analysis. Journal of Clinical Epidemiology 88, 92-101.
Poppelaars, E.S., Rattel, J.A., Wislowska, M., Hahn, M., Fernández-Cabello, S. and Rassi, E. (2019). Publish-or-perish culture: A PhD student perspective. https://go.nature.com/2ZSknqR
Baldwin, J., Pingault, J.-B., Schoeler, T., Sallis, H. and Munafò, M. (2020). Protecting against researcher bias in secondary data analysis: Challenges and solutions. https://psyarxiv.com/md5pe/
Botvinik-Nezer, R., Holzmeister, F., Camerer, C.F., Dreber, A., Huber, J., Johannesson, M., ... Adcock, R.A. (2020). Variability in the analysis of a single neuroimaging dataset by many teams. Nature 582, 84-88.
Simmons, J.P., Nelson, L.D. and Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science 22(11), 1359-1366.
Weller, M.G. (2018). Ten Basic Rules of Antibody Validation. Analytical Chemistry Insights 13, 1177390118757462.
Madsen, R. (2020). Funders must mandate and reward open research records. Nature 586, 200.
Madsen, R. (2021). Open Research by Default. https://researchdata.springernature.com/posts/open-research-by-default
Office for Students (2021). Annual TRAC 2019-20. Table 5, p.16. https://www.officeforstudents.org.uk/media/fd84abb4-49fe-4191-bc3a-6b5cae9b66fe/annual-trac-2019-20-sector-summary-and-analysis-by-trac-peer-group.pdf
Analysis by the Russell Group.