Written Evidence Submitted by STM

STM supports our members in their mission to advance research worldwide. Our more than 140 members, based in over 20 countries around the world, collectively publish more than 66% of all journal articles as well as tens of thousands of monographs and reference works. As academic and professional publishers, learned societies, university presses, start-ups and established players, we work together to serve society by developing standards and technology that ensure research is of high quality, trustworthy and easy to access. We promote the contribution that publishers make to innovation, openness and the sharing of knowledge, and we embrace change to support the growth and sustainability of the research ecosystem. As a common good, we provide data and analysis for all involved in the global activity of research.

STM would like to submit the following comments to this call for evidence and hopes that our input is useful for your further discussions. STM is available to provide further insights into our current projects should this be desired.

STM responses

  1. The breadth of the reproducibility crisis and what research areas it is most prevalent in

Ensuring scientific and scholarly integrity is fundamental to progress across science, medicine, and scholarship, and is a crucial contributor to better outcomes for the communities that STM and our member publishers serve. STM and our member publishers are committed to continually improving and innovating the communication of scientific articles, books and data in ways that enhance research integrity, transparency and reproducibility. STM holds the opinion that the publication process is pivotal to reproducibility and that greater transparency can improve this process. Publishers therefore have a key role in facilitating reproducibility by ensuring that the scholarly record is verified, curated, disseminated and can be trusted.

STM believes that reproducibility is relevant for all areas of research but varies across different disciplines. The issues are complex, multi-faceted and interrelated, and a ‘one size fits all’ approach should therefore be avoided. At the same time, it is important to note that some level of irreproducibility is inherent to science and will therefore always occur to some extent.

Scientists attempting to reproduce each other’s outcomes and results after they have been published is an important way in which science is validated or refuted, and is thereby part of the normal research process. Nonetheless, the amount of irreproducible research is perceived by some to be increasing. In our view, this has several causes:

-         The output of scientific articles continues to grow significantly (almost doubling every ten years). As volumes increase, so does the amount of irreproducible research.

-         Publishers are becoming better at detecting issues with reproducibility and research conduct. For instance, tools are being developed to help detect instances of image manipulation.

-         Publishers are increasingly confronted with manuscripts submitted by paper mills: illegitimate commercial organisations that manufacture falsified manuscripts and submit them to journals on behalf of researchers for a fee, with the aim of providing an easy publication route for the author (also referred to as authorship for sale). Papers coming from paper mills appear predominantly in medical and life sciences journals, which means that these papers could cost lives.


  2. The issues in academia that have led to the reproducibility crisis, i.e. its causes


Here, the reproducibility crisis is defined as the increased submission and publication of manuscripts containing irreproducible research. We are of the opinion that its causes are complex and multifaceted, but we can identify at least the following elements:

-         The ‘publish or perish’ system of reward. Scientists and academics in general are rewarded in terms of their number of publications and citations, which leads to pressure to publish articles with impactful results. This pressure can lead to unreliable or even fraudulent (and hence irreproducible) research, and can be especially strong in specific regions of the world that are focused on increasing their scientific output, impact and reputation.

-         The professionalization of paper mills, leading to ever more sophisticated fake or falsified manuscripts that are hard to detect through current peer review processes and editorial checks.


  3. The role of the following in addressing the reproducibility crisis:

We would like to focus on the role of publishers in this section.

Publishers verify scientific articles for their reproducibility and transform them into high-quality literature. They make considerable efforts and investments to make the publication process transparent, predictable, responsible, accountable and correctable, thereby supporting research integrity, quality, transparency and Open Science.

In STM’s view, there are three levels at which reproducibility can be addressed:

3.1.            On an individual level at each publisher

Publishers make a major contribution to research integrity through their significant investments and expertise in organizing and providing the infrastructure for peer review, amongst other things. This is a major enterprise; research in 2018 showed that publishers facilitated 13.7 million reviews to support the quality assessment, improvement and publication of 2.9 million articles.[1] New technological tools, including those built on Artificial Intelligence, are applied by individual publishers to improve the peer-review process by finding and selecting the right reviewers.

Publishers and service providers invest heavily in tools to reduce fraud and ensure the integrity of the scholarly record, helping to highlight cases of plagiarism, image manipulation and potential misconduct. These include Crossref’s Similarity Check service (powered by iThenticate), which checks the contents of an article against a global database of content and highlights where text has been copied from another source. Another example is Crossref’s Crossmark, which helps researchers to identify the current status of an item of content.

Regardless of the field of research, sharing research data is one of the most fundamental aspects of maintaining research integrity. The availability of research data also plays a vital role in reproducibility, as observed during the COVID-19 crisis. There is an increasing need to ensure the findability, accessibility, interoperability and re-usability of research data (the FAIR principles). For many years, STM has been working with its members to improve the effective sharing of research data. Much of this is articulated in the STM Research Data initiative, described at https://www.stm-researchdata.org. One result of this ongoing initiative is that more and more publishers are offering data policies (detailed results can be found in the STM dashboard). These policies include data availability statements, which authors can use to indicate in the article where the underlying data can be found.

3.2.            Cooperation between publishers in developing standards

The development of standards and technologies, with publishers working together with other stakeholders, can also support the aim of improving reproducibility.

With the move to Open Science it became apparent that the use of standardized definitions and terminology is helpful in the peer review process. In early 2020, STM started a working group on peer-review taxonomy to develop standardized definitions and associated best practice recommendations, to be widely adopted by publishers and other scholarly stakeholders. Several further iterations followed, such as a public consultation on the first draft of the taxonomy, and in January 2021 a one-year pilot started with participants from the working group. More details on the taxonomy can be found here. Finally, in July 2021 the National Information Standards Organisation (NISO) and STM announced the formation of a new NISO Working Group to formalize the Peer Review Taxonomy as an ANSI/NISO standard.

Another challenge in scholarly publishing that impacts reproducibility is image alteration and duplication. Whatever the reason behind the submission of altered and/or duplicated images to a journal, they should be identified early in the article evaluation process, so that journals can take appropriate action prior to publication and, in the best case, before peer review. As opposed to text plagiarism, which usually violates the research process, image alteration and/or duplication can be much more damaging, as it corrupts actual research results, wastes research money on invalid leads, undermines society’s trust in research, and can even endanger the society in which those “results” are used. The STM Standards and Technology Committee (STEC) has appointed a working group to answer questions around the automatic detection of image alteration and/or duplication. It will address topics such as the minimal requirements for such tools, their current quality, how their quality can be measured, and how these tools can be widely, consistently and effectively applied by scholarly publishers. In preparation for this focus on tools, it will also look at a standard classification of the types and severity of image-related issues and propose guidelines on what types of image alteration are allowable under what conditions.

3.3.            Engagement with other stakeholders in the research ecosystem

The research ecosystem comprises many stakeholders, and all need to act together to achieve the best possible outcome. This is also vital for the reproducibility issue. With this in mind, STM and its members are engaging in various initiatives.

For example:

-          The Committee on Publication Ethics (COPE), which creates best practices and guidance on publication ethics

-          Think. Check. Submit., an STM-supported initiative helping researchers identify trusted journals and publishers for their research

-          The FAIRsFAIR project, an EU-funded project that aims to supply practical solutions for the use of the FAIR data principles throughout the research data life cycle. STM is one of the accredited FAIRsFAIR Champions.

-         The European Open Science Cloud (EOSC), which will offer researchers a virtual environment with open and seamless services for the storage, management, analysis and re-use of research data across borders and scientific disciplines by federating existing data infrastructures. STM is a member of the EOSC Association and was recently elected to join an EOSC Task Force on FAIR metrics and data quality.


  4. What policies or schemes could have a positive impact on academia’s approach to reproducible research

Focussing on the causes that we outlined in our response to question 2, we would like to list below some high-level suggestions where policy interventions would be beneficial.

-          The ‘publish or perish’ system of reward. Less focus on the quantity of publications and their impact would lead to less pressure to publish, and hence less unreliable, fraudulent and irreproducible research. We recognize that this is an issue that has to be addressed by the entire research ecosystem, including funders, universities and research institutes.

-          The use of paper mills should be strongly discouraged and, where appropriate, made the subject of legal action. Moreover, researchers should be trained in research integrity in general, and in how to become good scientific citizens.


  5. How establishing a national committee on research integrity under UKRI could impact the reproducibility crisis

Scholarly publishing is an international venture, and any national activity should therefore consider aligning with existing global initiatives. Close cooperation between all stakeholders will be vital in these endeavours. This is particularly important where standards and technologies are developed that might also be shared between stakeholders (for example, STM’s project on a peer-review taxonomy).


(September 2021)


[1] According to a 2018 study by Publons