Written Evidence Submitted by Springer Nature
We are proud of the role Springer Nature plays in supporting and enabling the integrity of UK research. It is an honour and a privilege to be responsible for managing quality assessment (QA) of large numbers of UK research contributions, and to be the curators, custodians and maintainers of the scientific record in the form of published papers. Through this role, working in collaboration with governments, funders and institutions, we believe we can make a major contribution to addressing the issues that underlie the reproducibility challenge. (Please note: the term “reproducibility crisis” used in the call for evidence is somewhat hyperbolic and counterproductive in terms of finding solutions, so “reproducibility challenge” is used instead throughout this submission.) For more than a decade we have led the public discussion of these issues, shedding light on the problems and potential solutions (see this collection of some of the most influential contributions to this debate). We have also introduced major changes in the way we do our editorial work, integrating requirements for transparent reporting for many aspects of research that underlie challenges in reproducibility and consequently improving the quality of published work. Vitally, this work has been implemented in partnership with the research community: we are leading on driving the discussion and implementing change, but in a way that is directly informed by the needs of that community. These improvements have paved the way for developing community consensus and best practice recommendations. Moreover, our overarching commitment to open science is directly linked to solving reproducibility issues in research.
Through ongoing evolution of editorial policy, improvements to editorial working practices and continued content contributions, our aim is to promote open research practices and address reproducibility problems in all fields of research, including the physical sciences, life sciences, humanities and social sciences.
The Scope of the Reproducibility Challenge
While it has been known for more than half a century that some clinical research findings could not be reproduced, the problems that underpin the issue are by no means limited to this field. Indeed, a survey Nature commissioned found problems in fields spanning the natural sciences and beyond, including medicine and engineering: more than 70% of the roughly 1,500 scientists surveyed had tried and failed to reproduce another scientist's experiments. Although confidence in published results was greater in physics and chemistry, most surveyed scientists in those fields still indicated doubts about the reproducibility of a non-trivial proportion of results in their field.
Just how consequential these issues can be beyond the lab was brought home in a commentary we published reporting the failure of Amgen scientists to reproduce the vast majority of a set of landmark pre-clinical research results while looking to develop impactful diagnostics and therapeutics (see also this earlier opinion piece). Much focus has similarly fallen on reproducibility problems in psychology and the social sciences, fields where incorrect findings can likewise have very direct negative consequences for the broader community. Nonetheless, all experimental fields are implicated to a greater or lesser extent, owing to the generality of the underlying causes (see below). The silver lining to the dark cloud of this ubiquity is that there are things we can learn from best practices in almost any field, from particle physics to economics, that can help us towards general solutions.
Research is a human endeavour, and it is a systemic failure to fully account for counterproductive drivers of human behaviour, individually and collectively, that underpins reproducibility problems. The pressure on researchers to produce work of “significance”, in both a statistical and a social sense, is intense. This pressure arises from the highly competitive environment inherent to the prevailing culture of academic research, reinforced by the rewards and incentives that drive researchers, and it can in turn lead to biases that influence the design, analysis and publication (or not) of research. Addressing these underlying causes therefore requires concerted, collaborative and coordinated efforts by those with key roles in the research ecosystem, including funders, institutions and publishers. Open science is a fundamental part of the solution, and policies, training and research assessment practices are the levers available to pull in moving to a research culture with openness at its heart.
Solutions and roles
Each of the stakeholder groups named in the committee's call for evidence - funders, institutions, researchers and government - has a key role in addressing reproducibility issues. However, unilateral action by any one of these groups is bound to be less effective - if not counterproductive - compared with a multilateral, coordinated approach. Our collective success in improving reproducibility will be greatly strengthened by collaborative initiatives and co-produced standards.
The UK government can play a vital role in helping develop shared expectations that can be applied across the research lifecycle by the respective stakeholders - institutions, funders and publishers. There are already good examples of the benefits of coordination among these key stakeholder groups, such as the flexible consensus framework for transparent, reproducible research and open research practices to which we contributed.
To achieve all the benefits of open science for addressing the reproducibility challenge, the UK government needs to strongly support the pre-existing mechanisms for quality assessment, including the peer-review and improvement processes that editors and publishers manage, while helping develop new, more open ways of working that complement and enhance those mechanisms.
Publications, data, code and detailed protocols all have the potential to be made publicly accessible at the point of publication and, indeed, much earlier in many cases. We are committed to building on our own existing work in this area, for example our innovative preprint and protocol sharing initiatives. Research assessment processes need to recognize and reward researchers for adopting open research practices. Ultimately, we need to ensure that outputs such as data and code that underpin specific research findings are certified in peer review at journals and made openly available (see, for example, our initial work on code peer review). In this way, the pre-existing certification checkpoint that journals provide can be leveraged to allow a faster and more effective transition to open science informed by community expectations.
Therefore, it is particularly vital that the UK government includes an ongoing mechanism for engagement with research publishers in developing policies and schemes to improve reproducibility through open research.
Policies and schemes
There is huge value in taking a collaborative approach to creating policies or schemes to embed reproducibility in the UK’s research culture. We and other publishers have developed field-specific reporting checklists, innovative approaches for methodology dissemination, pre-registration of research studies, standards for handling image integrity issues and other initiatives relevant to robustness and reproducibility. So we are well placed to contribute to developing policies and schemes for addressing reproducibility issues and helping implement them effectively.
For example, the present pandemic has starkly illustrated both the value of increased openness through preprints and the pitfalls of blurring the lines between these and the peer-reviewed and curated Version of Record. There are ways of dealing with these pitfalls: well-managed preprint platforms can enable rapid withdrawal of worrisome papers, and we have contributed to early community efforts on responsible communication of preprints to a broad audience and the development of best practice recommendations for the preprint ecosystem. However, more work is needed, and the UK government can help. In particular, the government could lead in developing a framework, in collaboration with all stakeholders, to ensure support for the early-sharing benefits that preprints enable, including early identification of findings that cannot be reproduced, while minimising, and ideally eliminating, misuse of early research contributions that have not been certified and improved through the processes that publishers manage.
Better sharing of government-funded research data is a key step towards improving reproducibility. Supporting researchers to overcome data sharing challenges is vital to achieving this, and there are numerous areas where we and other publishers already contribute but could do so more effectively in collaboration with government and other stakeholders: use of persistent digital identifiers, mandates and policies, data management planning and stewardship, appropriate data sharing infrastructure, training, accreditation and the alignment of incentives are some of the most important.
Detection of misconduct, and deterrence of potential violations of scientific integrity policies before they occur, is a major focus for us in the context of the quality assessment and improvement processes we manage for research publications. As such, we already work with institutions and funders on these issues. However, a co-produced government framework setting expectations for institutions, funders and publishers in this area, with input from all these stakeholder groups, would be useful, as would more government funding and other support for training and accreditation of researchers in relevant areas.
More generally, education and training are key to preventing reproducibility issues, so supporting and enhancing continuing professional development (CPD) for researchers should be a key area of focus in future. The UK government needs to dedicate funding in this area and look at setting up a framework that encourages robust training and accreditation from diverse and reputable organizations (public and private) for practices that are vital to research integrity, many of them linked to the transition to open science. Of course, early career researchers are a vital group on which to focus training and CPD in order to support research integrity and good research practice, but with rapid developments in the way research is conducted in most areas (the increasing involvement of AI, team science and data management being just three of many relevant trends), senior scientists also need regular training and updates. UK regulators of other professions (e.g. medicine) understand the importance of ongoing development of practitioners at all career stages, as the demands of their roles are impacted by rapid change, and have committed the resources to achieve this. It is time scientists received the same support.
A considered approach to training/CPD could also address endemic issues that researchers face in their career pathway by providing structured support for researchers transitioning to vital roles outside of academia. An academic career path is still the predominant cultural focus within universities, despite being a minority end point for the early career researchers who generate the bulk of the data that underpins scientific progress; a structured and funded approach to continuing professional development could change that.
Similarly, frameworks and mechanisms that ensure institutions provide the necessary infrastructure for management of research data, and that fully support mandates focused on appropriate management and curation of data, would be an important step forward. Explicitly supporting and/or endorsing pre-existing community and publishing industry standards and initiatives (for example, the COPE core practices and the “Think. Check. Submit.” initiative to address predatory journals) would also be useful.
The compliance burden on institutions will likely increase considerably as they are required both to better ensure reproducibility-linked goals are met and to demonstrate that they have done so. There are different potential ways of ensuring institutions have the support to deliver on these requirements; perhaps the simplest and best is for funding agencies to sequester money for this purpose as a block grant to an institution, set as a percentage of the total grant funds the institution has received from that funder. Sequestered support in this area could be used in a wide variety of ways, from core data management resources and training to standardised pre-publication integrity checks (e.g. for plagiarism or image integrity). Regardless of the implementation details, publishing clear and actionable policies in this area, which should - where possible - be aligned across funders, would be a useful contribution. As mentioned above, a co-created framework for the roles of funders, institutions and publishers in dealing with potential research misconduct, perhaps reflected in an established code of practice for coordination among these stakeholders, would be useful, with the aim of maximising transparency, speed and trust. Such a comprehensive framework for UK research could build on pre-existing efforts to coordinate between institutions and publishers.
A national approach
A national committee on research integrity established under UKRI could make a major contribution to addressing reproducibility issues in UK research. As detailed above, there are numerous initiatives to address reproducibility issues that would benefit from a national approach in collaboration with all stakeholder groups, and such a committee could lead these efforts. Research publishers should be fully involved in such efforts and ideally represented on the committee: our role in managing the primary certification and quality assessment mechanism for research communications can be leveraged, in collaboration with the other key stakeholder groups, to make rapid strides in moving to a research culture that rewards openness and reproducibility as well as excellence. Proactively establishing such a collaborative approach to change can put UK research in a world-leading position, ensuring global impact through consistently delivering, and being known for, reliable and reproducible research.