Written Evidence Submitted by the Publishers Association

(RRE0056)

 

Executive Summary


Part One: Industry Context

About the Publishers Association
  1. The Publishers Association is the member organisation for UK publishing, representing companies of all sizes and specialisms. Our members produce digital and print books, research journals and educational resources across genres and subjects.

 

About the UK’s academic publishing sector
  2. Academic publishers underpin the capability and influence of the UK’s research cycle. We help researchers uncover new ideas and findings, and then ensure that results are validated, presented effectively, and easily discoverable.

 

  3. Publishers deliver the publication of research articles and books, but also invest in the broader infrastructures required to serve the research community – such as cross-discipline platforms and highly sophisticated data repositories. These high-quality tools deliver technical data, analytics and technology capabilities to help foster discovery and innovation.

 

  4. Our sector is world leading. In 2018, UK authors produced 7% of all research and 14% of the most-cited research in the world, across approximately 188,000 research articles. This is astounding, given that the UK’s population is approximately 1% of the global total, and this output is only set to increase given the government’s R&D ambitions.

 

  5. An independent report by Frontier Economics looking into “Publishing’s Contribution to Research and Innovation” found that “in several key areas (metrics, coordinating peer review, establishing precedence and IP), publishing’s contribution may be hard to replicate; without the highly specialised and time-intensive tasks carried out by publishers, innovation may be less effective.”

 

About the academic publishing process
  6. When research has been completed, it is typically subject to editorial scrutiny and peer review. Through this process, an author’s manuscript is critiqued by other experts in the same field and improvements are suggested. Whilst not infallible, peer review is at the core of how publishers help control and increase the quality, integrity and accuracy of research – building a reliable canon of information that subsequent researchers can trust. Only submissions that pass the peer-review process and satisfy editorial standards are accepted.

 

  7. Once a manuscript has been accepted, editorial teams carry out pre-production checks, including checks for copyright permissions and plagiarism, and establish the author(s)’ ownership of ideas. Production itself then involves copy-editing, proof-reading and type-setting the manuscript, and producing print and electronic formats of the text ready for publication.

 

  8. Publishers also play an integral role in improving the quality and discoverability of research outputs through copy and art editing, indexing, adding enriched metadata, search engine optimisation and operating online platforms. So too do we lead initiatives that improve the findability and usability of content, for example: CrossCheck, CrossRef, CrossMark, ORCID and FundRef.

 

  9. Following publication, publishers continue to maintain and preserve the scientific record by issuing corrections and providing impact metrics for each publication. These metrics help funders, HEIs, researchers and governments to assess the quality of research, anticipate emerging research areas, and make funding decisions.

 

  10. Finally, publishers work to identify pirated material that may undermine the integrity of the research canon. In-house anti-piracy teams send take-down notices where appropriate and consider more co-ordinated action to tackle broader-scale illegal operations.

 

 

Part Two: The Reproducibility Crisis

The breadth of the reproducibility crisis
  11. Publishers acknowledge and understand the broad concerns around research integrity. The research canon relies on the cumulative veracity of each publication. If inaccurate research enters the canon, there will be a cumulative impact on the efficiency and accuracy of future thinking. It is therefore right that the challenges and opportunities with regard to reproducibility are thoroughly explored.
  12. We would, though, offer some reassurance that fabricated, false, and biased research is not commonplace; nor does it appear to be a growing issue.[2] In some circumstances a rise in retractions is likely to reflect improvements in detecting these kinds of issues rather than an increase in misconduct itself. Whilst UK research could and should be made more open, discoverable, and reusable, “crisis” is perhaps an overstatement of the matter at hand.

 

  13. It is also important to recognise that there is inevitably variance across academic disciplines when it comes to reproducibility. In areas such as medicine and psychology, for example, concerns regarding data sharing are well reported and there have been active movements by these academic communities to improve.[3] Conversely, several communities within experimental physics and astronomy – such as the work conducted at CERN – have a long and thriving culture of sharing research data to support transparency and reproducibility.

 

 

The issues in academia that have led to the reproducibility crisis
  14. There is a range of complex themes that deserve recognition, scrutiny and further evidence gathering. These include, but are not limited to:

 

  15. The increasing sophistication of the research being done in the UK. In some instances, the practical limit on reproducibility is the advanced level of expertise required to engage fully with the complex material. For example, in theoretical physics, the published work itself should provide the necessary information to scrutinise and contest any claims made (i.e. there is no deliberate attempt at fraud), but the information may not be meaningfully scrutinised without a high level of subject knowledge.

 

  16. The incentives created by the reward and evaluation systems for researchers. As it stands, the structures in place have led to the under-publication of negative or inconclusive results, which in turn means that failure to replicate a study often goes unreported. Researchers can be reluctant to share undesirable findings that do not fit the original hypothesis, leading to publication bias.

 

  17. Different disciplines face different barriers. The form of research data can vary immensely between different disciplines and subject areas. Data can include anything from video to transcripts, questionnaires to slides. This generates different difficulties for ensuring the reproducibility of research. For example, pharmaceutical research may be hampered by data protection regulations and privacy restrictions, or by the inclusion of commercially sensitive data. Conversely, qualitative psychology studies may be challenging to replicate in light of inconsistencies in human behaviour.

 

  18. The expense of maximising reproducibility. There are several costs to providing the infrastructure to facilitate third party reproduction, as well as project overheads with regard to validation, administration, and curation. Compliance with any particular framework or standards can be very expensive.

 

Part Three: Ways Forward

 

Publishers’ role in addressing the reproducibility crisis
  19. Publishers take their responsibility to research integrity very seriously and are in a strong position to collaborate across the research landscape to generate positive change. Our role is multifaceted, with the key initiatives as follows.

 

The peer review process

  20. The peer review process is a cornerstone of scholarly publishing and an essential part of the publisher’s role. Publishers ensure that a range of appropriate, qualified peers objectively and systematically scrutinise new research. These reviewers consider the relevance and importance of new results, as well as the methodological rigour of the work undertaken. Competent, rigorous and constructive peer review plays a fundamental part in ensuring published research is scientifically sound and sufficiently clear to enable reproducibility.

 

  21. Publishers are always taking steps to further reinforce the integrity of the peer review process. For example, the Institute of Physics has recently launched its ‘peer review excellence’ programme, which seeks to provide a systematic programme of training and certification for peer reviewers specifically in the context of physical science research. Researchers completing the programme will be better equipped to undertake high-quality review of research and will also receive a form of peer review accreditation.

 

  22. Moreover, more than 11,000 journals are members of the Committee on Publication Ethics (COPE), which provides advice to editors and publishers on publication ethics. COPE publishes ethical guidelines for peer reviewers, which set out the basic principles and standards to which all peer reviewers should adhere during the peer review process.

 

  23. In recent years there has also been increased experimentation with open peer review – whereby peer review reports are published alongside the final paper – and with acknowledging reviewers so that they receive research credit. These practices have introduced further transparency to the review process and added important expert context for the benefit of researchers and interested members of the public alike. As an example, in September 2018 Wiley began a collaborative pilot initiative to open the peer review process. Wiley found that on average 86% of authors remained opted in to a transparent peer review process when given the choice.[4]

 

Open Research

  24. Research that is more widely accessible can receive wider scrutiny and challenge. Hence the promotion of open research practices has the potential to significantly bolster reproducibility.

 

  25. The UK’s transition to Open Access is one critical element of Open Research. We project that 87% of the UK’s research articles will be available via Gold Open Access by 2022, largely through transformative agreements. This is a 57% increase since 2016.[5] We would emphasise here a preference for “Gold” Open Access, as this model gives immediate access to the final published version of record that is maintained and updated in perpetuity. In contrast, “Green” Open Access usually only provides access to the “Accepted Manuscript”.

 

  26. However, Open Science is also about providing the right context in which to understand an article. Hence, publishers are increasingly encouraging and enabling the research community to make the research data underpinning published research more open. Publishers have developed data policies guiding researchers on best practice and on options for depositing and sharing their data publicly in accordance with ‘FAIR’ (findable, accessible, interoperable, re-usable) principles.[6]

 

  27. For example, Oxford University Press introduced its own set of research data policies in early 2020.[7] The policies create a structure for journals to govern requirements for the transparent reporting and sharing of the data and software underlying the research they publish, defining standards for the inclusion of Data Availability Statements and for data citation, and supporting authors with information about how to select an appropriate repository for their data.

 

  28. Indeed, many publishers now allow or even require researchers to complete a data availability statement to accompany any publication. These statements make clear where data can be found, including, where applicable, hyperlinks to publicly archived datasets that are relevant to the study. Data availability statements can equally indicate that data is available on request from the authors, or that no data is available (for example, where privacy might be compromised).

 

  29. However, to maximise research integrity in the UK, there will need to be an even greater culture of transparency and openness regarding the methods and data underpinning research. Researchers will need to have clear incentives, training, and financial support to store their research data in such a way that it can be accessed as openly and widely as possible.

 

  30. Different research communities are at different stages of readiness when it comes to sharing data. Many researchers continue to encounter limitations or even barriers to sharing their data publicly – whether legal, ethical or, in some cases, entirely legitimate practical obstacles.

 

  31. It is particularly worth noting that as many as 40% of researchers remain unfamiliar with the FAIR principles. For context, a PwC study undertaken for the European Commission estimates that research data not being FAIR could cost up to €26 billion in Europe alone. This is, in large part, owing to work being duplicated due to a lack of awareness of existing research or negative results.[8] It stands to reason that FAIR compliance should be prioritised as an objective for an evolving research environment.

 

 

Maintaining the scholarly record

 

  32. Publishers continue to take responsibility for maintaining the scholarly record after a work is published. Publishers play a crucial role in the management of ‘publication ethics’, which includes article retractions, errata, corrigenda and expressions of concern. Once a query about an article is received, journals will investigate and decide upon the appropriate action to take in accordance with the Committee on Publication Ethics (COPE) guidelines.

 

  33. On a similar theme, tackling piracy is critical to maintaining the integrity of the academic canon. Not only do pirate sites such as SciHub or Z-Library steal content by misappropriating institutional login details; they also undermine the continuity of scholarly discourse. Whilst a publisher may amend or retract an article, this context may be lost on a pirate site – leading to confusion for its users. In short, there is little to no accountability for the content hosted on these platforms. Publishers are working to tackle these criminal enterprises, but their ongoing presence and usage remain a significant threat to research integrity.

 

 

Promoting best practice

 

  34. Publishers continually encourage best practice through existing and respected brands. For instance, Cell Press, a leading imprint of Elsevier, has developed “STAR Methods” (structured, transparent, accessible reporting) to “promote rigor and robustness with an intuitive, consistent framework that integrates seamlessly into the scientific information flow, making reporting easier for the author and replication easier for the reader”.

 

  35. In other cases, publishers have sought to encourage the publication of negative results and replication studies. Most high-impact journals do not require research to have produced positive results, and publishers have also created new journals with a focus on evaluating research purely on the grounds of methodological clarity and rigour rather than on the novelty of the results reported. These include publications dedicated to describing the data or the methods associated with a new piece of research, enabling researchers to gain more recognition for this important dimension of research activity.

 

  36. For example, Cambridge University Press’s Experimental Results offers researchers the opportunity to gain credit for reproducing or failing to reproduce the work of others. Similarly, IOP SciNotes is a journal which seeks to provide a home for publishing work that will foster reproducibility, such as replication studies, data or methods descriptors, and new results that are assessed solely on the grounds of their relevance and methodological soundness rather than their scientific novelty.

 

The role of pre-prints

  37. In response to Covid-19, publishers have gone to significant lengths to make relevant research freely accessible, whilst also accelerating the peer review of critical articles. Pre-prints have also played a role in rapidly responding to the ongoing public health emergency.

 

  38. However, pre-prints are not peer-reviewed, and their content may therefore be subject to considerable revision before final publication. We should be cautious about how they are presented and disseminated into the public domain. A balance must be struck between accuracy and timeliness.

 

Proposals regarding policies or schemes that could have a positive impact

 

  39. Publishers are keen to offer constructive suggestions that could have a positive impact on research integrity. The following is a non-exhaustive list of proposals that should be considered:

 

  40. Definitions for key terms should be clearly agreed. It is important to achieve semantic consistency across stakeholder discussions, particularly for terms such as “reproducibility”, “repeatability”, and “replicability”.

 

  41. Co-operation should be a priority. Publishers, researchers, research institutes and funders must continue to work together. Greater collaboration in sharing best practice, agreeing standards and providing more support to extend the latest developments more widely could have a highly positive impact on improving reproducibility.

 

  42. A flexible approach should be taken. A single, uniform policy or solution may not adequately respond to the diverse and nuanced requirements of different research disciplines. Similarly, it is imperative that any co-ordinated response on research integrity does not impinge on academic freedom or innovation.

 

  43. Researchers should have access to the Version of Record. The Version of Record is the final, fully functioning research article which includes integrated open data sets, open methods, open protocols and open metrics. It is the most useful and reliable version of the research. Immediate access to the Version of Record can best be provided through a Gold Open Access model and by properly tackling the availability and use of pirated research.

 

  44. Training and incentives are key factors in academic research culture. Efforts should be focused on improvements to research culture from the earliest stages of any research. Publishers have a key role here in encouraging open research and raising awareness amongst our authors; however, more needs to be done across the academic community (including institutions and funders) to properly reward reproducibility practices.

 

  45. Publishers should continue to provide the infrastructure for varied research outputs. While publishers are already adding extended options for the publication of other key elements of research linked to the research article, more work can always be done. As previously discussed at (35) and (36), this would enable researchers to have a greater opportunity to secure credit and visibility without having to change their preferred publication venues. Again, training and incentives are key factors in encouraging researchers to then make use of these facilities – as uptake has historically been low.

 

  46. Technology should be used to assist researchers. There is also a role for technology in helping researchers (whether as editors, peer reviewers or authors) to report research as transparently as possible. The use of tools to check for text similarity is now routine, but this could be extended to image checks and statistical reporting. Publishers should continue to take reasonable steps to ensure that this technology is developed and put to good use.

 

  47. Investments in new initiatives should be properly supported. The development and operation of tools, services, and policies has financial costs that are borne by publishers, their partners and, indeed, the whole research community. These stakeholders will require support, and appropriate monitoring of the return on their investment (i.e. more efficient, reproducible research), in order to sustain these investments.

 

Thoughts on establishing a national committee on research integrity under UKRI

 

  48. Publishers support the creation of a national committee on research integrity. We would make the following suggestions regarding its operation:

 

  49. The committee should be inclusive. The committee should feature a variety of representation from across all key stakeholders in the research ecosystem, including early and late career researchers, minority researchers, research support staff (e.g. institute research offices, lab technicians), a variety of research institutes, a variety of publishers, data and code repositories, academic societies, existing standards bodies (such as COPE), and other key research funders for UK researchers.

 

  50. The committee should endeavour to be objective. It should approach matters with an open mind, rather than set out to develop an evidence base that reinforces an existing perspective or policy.

 

  51. The committee should provide an information-sharing role. Problematic research is often split over multiple publishers, preprint servers, data repositories, and co-authors’ institutions. In such cases, it is only possible to identify or understand potential issues with the integrity of research by coordinating information across all these parties. This is something that the committee might usefully assist with.

 

 

Final Remarks

 

  52. The Publishers Association and its members are available to further support the committee in its work. Please contact Amy Price (aprice@publishers.org.uk) with any questions raised by our submission.

 

 

(September 2021)

 

 

 


[2] Daniele Fanelli (2018), Opinion: Is science really facing a reproducibility crisis, and do we need it to?

[3] Monya Baker (2016), 1,500 scientists lift the lid on reproducibility

[4] Elizabeth Moylan, Kornelia Junge, Candace Oman, et al. (2020), Transparent Peer Review at Wiley: Two years on what have we learnt?

[5] Scopus data.

[6] M. Wilkinson, M. Dumontier, I. Aalbersberg, et al. (2016), The FAIR Guiding Principles for scientific data management and stewardship

[7] Oxford University Press Research Data Policy.

[8] PwC (2019), Cost-benefit analysis for FAIR research data