Written Evidence Submitted by Open WIN (Wellcome Centre for Integrative Neuroimaging), University of Oxford and Reproducible Research Oxford, University of Oxford
(RRE0067)
Submitted by researchers, administrators, and doctoral students at the University of Oxford:
Louise Bezuidenhout1, Kathryn Dally2, Tsvetomira Dumbalska3, Laura Fortunato4, Cassandra Gould van Praag5, Malika Ihle6, Matt Jaquiery7, Niklas Johannes8, Charles Rahal9, Susanna-Assunta Sansone10, Catherine Wheatley11, Mirela Zaneva12, Noa Zilberman13
1 Research Fellow, Institute for Science, Innovation and Society; Reproducible Research Oxford steering group member
2 Research Integrity & Policy Lead, University of Oxford Research Services; Reproducible Research Oxford steering group member
3 Doctoral student, Department of Experimental Psychology; Reproducible Research Oxford Fellow
4 Associate Professor in Evolutionary Anthropology, Institute of Human Sciences; Principal Investigator, Reproducible Research Oxford
5 Open Science Community Engagement Coordinator, Wellcome Centre for Integrative Neuroimaging; Reproducible Research Oxford steering group member
6 Reproducible Research Oxford Coordinator
7 Doctoral student, Department of Experimental Psychology; Reproducible Research Oxford Fellow
8 Postdoctoral researcher, Oxford Internet Institute; Reproducible Research Oxford Fellow
9 Departmental Research Lecturer, Department of Sociology; Reproducible Research Oxford steering group member
10 Associate Professor, Data Readiness, Department of Engineering Science; Reproducible Research Oxford advisory board member
11 Policy Engagement Manager, Wellcome Centre for Integrative Neuroimaging
12 Doctoral student, Department of Experimental Psychology; Reproducible Research Oxford Fellow
13 Associate Professor, Computing Infrastructure Group, Department of Engineering Science
Key Recommendations and Statements
● Funders to incentivise reproducibility and research integrity by assessing applicants and proposals against these criteria.
● Institutions to promote research integrity by hiring against wider criteria; to normalise the sharing of data, protocols, and other materials; to provide research integrity training; and to fund data curation and storage.
● Individual researchers to be supported by institutions and funders in developing and running grassroots reproducibility initiatives.
● Publishers to take greater responsibility for accuracy of the research record by requiring (where applicable) the pre-registration of experimental design and of analyses; requesting research metadata and the deposition of data underpinning results in public repositories; and providing training in the peer-review process.
● We support the establishment of the UK Committee on Research Integrity by UK Research and Innovation (UKRI) and believe it signals the value placed on transparency and reproducibility.
Introduction: summary of expertise and reason for submitting evidence
1. We are a group of University of Oxford researchers, administrators, and doctoral students with an interest in advancing open scholarship and research reproducibility across all disciplines and at every career stage. Collectively, our research groups have received more than £100m in funding in the last decade from organisations including Wellcome Trust, the Leverhulme Trust, the Biotechnology and Biological Sciences Research Council, the Natural Environment Research Council, and the Medical Research Council. We sought views from all academic divisions of the University and other relevant units in preparation of this submission.
2. At the institutional level, we promote reproducibility and research integrity through Reproducible Research Oxford (RROx), the local node of the UK Reproducibility Network (UKRN). RROx coordinates institution-wide grassroots initiatives and organises events such as computing skills workshops, free and open source software meetups, and the Oxford | Berlin Summer School on Open Research.
3. Open WIN is an open science community project, directly funded by the Wellcome Trust, which aims to develop and share tools and policies to enhance the transparency, reproducibility and integrity of research at the Wellcome Centre for Integrative Neuroimaging (WIN).
4. A member of our group co-authored the FAIR (Findable, Accessible, Interoperable, and Reusable) Guiding Principles, which define internationally-adopted standards and policies for scientific data management and stewardship.
5. Much has already been said about the breadth and prevalence of the reproducibility crisis. No field is unaffected: all disciplines are vulnerable to the perverse incentives that can undermine research integrity. That is why RROx coordinates solutions across the University as a whole. We accept that these problems exist, and we believe that they offer an opportunity for positive change. We therefore present responses to the inquiry’s final three questions on how to improve reproducibility and research integrity, based on our collective experience across a diverse range of institutional contexts, roles, and career stages.
Q3. What is the role of the following in addressing the reproducibility crisis: research funders, including public funding bodies
6. By revising their funding policies, funding organisations are in a position to incentivise reproducibility and foster a research culture in which integrity can flourish.
7. We support the Royal Society's recommendation that funders assess individual applicants against criteria that include their commitment to open and rigorous research practices.
8. We recommend that funders require project proposals to describe how transparency and reproducibility will be ensured. Specifically, the final portion of an award could be made conditional upon: demonstrating adherence to good research practice; pre-registering, where applicable, an experimental design and analysis plan (and possibly submitting it to funders) before data collection; and describing how the research team will share sources, data, analysis code, and experimental materials, following the FAIR principles where appropriate.
9. To address gaps in funding for digital research tools and capabilities, funders should consider extending the length and scope of awards to include specific funding for open research practices including constructing, maintaining, and storing data and/or Free and Open Source Software. More generally, funding bodies could provide resources to support meta-research and the continuing professional development of researchers, including training to support the acquisition of skills in line with open research practices.
10. Funders could adopt policies encouraging the publication of negative results, replications, data papers, and software, so that all research in a given field can be validated and summarised accurately (e.g. through meta-analyses).
Q3. What is the role of the following in addressing the reproducibility crisis: research institutions and groups
11. Research institutions and the research groups they host are in a position to implement structural and cultural change to support reproducibility and research integrity at all organisational levels and career stages.
12. Taking a structural approach, we recommend that research institutions incorporate mandatory training and continuing professional development in open and rigorous research practices at all career stages. We suggest that such training cover reproducibility, the risks of dishonest data manipulation, and good Research Data Management (RDM).
13. We further recommend that institutions consider candidates’ commitment to open and rigorous research practices during appointment, promotion, and tenure decisions, and that they move away from using journal impact factors and numbers of publications, following the DORA principles. To facilitate this, we recommend that a) Digital Object Identifiers (DOIs) are appended to all research-related outputs including data sources, datasets, and analysis codebooks using services such as the Zenodo digital archive; and b) project teams use a taxonomy such as CRediT to define and recognise all individual contributions to research outputs.
14. To support transparency and reproducibility, we recommend that institutions invest in data and software infrastructure, and in research support staff trained in its use, so that data repositories and Free and Open Source Software are accessible and available to share. Institutions should also develop data policies and good data management practices, and guide their researchers to share and report data in public data repositories using community standards, informed by resources such as FAIRsharing.
15. To normalise open research practices, institutions should support and incentivise (for example, through grants and other awards, leading to enhanced recognition) researcher-led grassroots initiatives to build and sustain communities of practice.
16. At a cultural level, research group leaders are well placed to promote and normalise values of transparency and integrity. Widening institutional reward and recognition criteria to include contributions to a supportive research environment would incentivise such action. We suggest that Principal Investigators lead by example in conducting reproducible studies, and support the organisation of mentoring schemes, discussion groups, and training workshops.
Q3. What is the role of the following in addressing the reproducibility crisis: individual researchers
17. Individual researchers have the ability to drive bottom-up change in the academic system: they have a moral obligation to take responsibility for individual and collective research integrity by thoroughly, diligently, and critically examining their own and others’ work. Principal Investigators and group leaders have a responsibility to foster a culture of integrity and reproducibility. Constructive strategies include establishing local organisations such as RROx to coordinate grassroots initiatives, and emphasising the importance of key practices, such as pre-registering studies to reduce biases. Early-career researchers can play a part by organising events and journal clubs.
Q3. What is the role of the following in addressing the reproducibility crisis: publishers
18. As stewards of the research record, journal publishers have a responsibility to address problems of transparency and integrity that are, in part, a consequence of the pressure and incentives to publish novel, high-impact work in preference to replication studies or null results. Moreover, publishers’ profit motive leads to high charges for Open Access publishing, meaning that research outputs too often remain behind a paywall and are thus not readily accessible to the public.
19. Publishers have a responsibility to ensure research integrity by thoroughly verifying and fact-checking manuscripts. To discharge this obligation, where applicable, publishers should require pre-registration of an experimental design and analysis plan before data collection, or offer the possibility of submitting registered reports. These approaches prevent selection bias, p-hacking (i.e. misuse of data analysis to find patterns in data that can be presented as statistically significant), HARKing (i.e. “hypothesising after the results are known”), and other threats to reliable research. We further recommend that publishers request metadata from all researchers and, except in the case of commercial or sensitive information, the deposition of data sets, digital sources, and digital collections in relevant public repositories, identified using resources such as the FAIRsharing platform.
20. Crucially, journals must be responsible for the accuracy and transparency of the work they publish, regardless of any impact on profit. In addition to fact-checking, they must allow an adequate period for peer comment, critique, and replication; remove flawed publications or those found to have used faulty datasets; and publish corrections and retractions promptly.
21. We further recommend that publishers offer training and publish guidelines on the peer review process, and insist that manuscripts adhere to standard reporting checklists and other community standards to ensure code is reproducible, methods are replicable, and data is FAIR. Reviewing associated data and code is necessary but time-consuming: we suggest the formation of a working group at a national level, incorporating funders, institutions, and publishers, to agree how this process will be resourced.
22. To address publication bias (i.e. bias in the research record towards statistically significant results, which are more likely to be published compared to negative or null findings), we suggest journals adopt results-neutral peer review and encourage (where relevant) the use of registered reports, in which study designs and protocols are submitted for peer review prior to data collection and analysis.
Q3. What is the role of the following in addressing the reproducibility crisis: Governments, and the need for a unilateral response
23. Universities have always welcomed the Government’s hands-off approach to research culture: institutions and their staff value free thought, free speech, and intellectual independence.
24. We acknowledge the increase in funding allocated to UK Research and Innovation (UKRI) for 2021-22, and note that any future cuts to higher education could have the unintended consequence of reducing resources to promote open research. To ensure researchers understand the importance of open and reliable research, we urge the government to consider ring-fenced funding for research ethics and integrity, encompassing open research practices.
25. Funding policies, research training schemes and tighter regulation of academic publishing could have a positive impact on the approach to research reproducibility as detailed above.
26. Publishers have a vital role to play in maintaining the integrity of the academic record, by correcting errors, and by communicating the legitimacy of research after retractions and corrections have been published, yet they have little motivation to do so. We suggest investigating the feasibility of fining or otherwise sanctioning publishers that fail to correct, or to publish an expression of concern about, a paper found to contain unambiguous major errors.
27. The UK Committee on Research Integrity (UK CORI), now being established by UKRI, has the potential to make a positive impact on the reproducibility crisis, depending on its scope, remit, and resources. At a minimum, its establishment signals the value placed on research integrity.
28. UK CORI could be tasked with developing and coordinating training courses and establishing best practice guidelines. We encourage UK CORI to collaborate with other key stakeholders such as the UK Reproducibility Network, which has broad representation, including academic journal publishers, funding organisations, and all levels of the academic community.
Supported by the Undersigned
● Dorothy Bishop, Professor of Developmental Neuropsychology, Department of Experimental Psychology; Reproducible Research Oxford steering group member (see also separate response submitted in a personal capacity)
● Laurence Brown, Research Technology Specialist in IT Services; Reproducible Research Oxford Fellow
● Megan Gooch, Head of the Centre for Digital Scholarship and Digital Humanities Support; Reproducible Research Oxford steering group member
● Adam Kenny, Postdoctoral researcher in Anthropology, Institute of Human Sciences; Reproducible Research Oxford Fellow
● Conor Keogh, Doctoral student in Functional Neurosurgery; Reproducible Research Oxford Fellow
● Clare Mackay, Professor of Imaging Neuroscience; Open Science Academic Lead, Wellcome Centre for Integrative Neuroimaging
● Ruth Mallalieu, Head of Open Scholarship Support, Bodleian Libraries; Reproducible Research Oxford steering group member
● Sam Parsons, Postdoctoral Research Associate in Experimental Psychology; Reproducible Research Oxford Fellow
● Olly Robertson, Research Assistant in Experimental Psychology; Reproducible Research Oxford Fellow
● Manuel Spitschan, University Research Lecturer, Department of Experimental Psychology; Reproducible Research Oxford Fellow
The views expressed are those of the authors and signatories, and not necessarily those of the University of Oxford. The funding bodies supporting authors and signatories had no role in the writing of this submission.
(September 2021)