Written Evidence Submitted by Wiley

(RRE0068)

Introduction

Wiley is a leading scholarly publisher that offers different access models to support the wide range of needs and situations that exist across research communities. Wiley publishes c. 1600 journals. Hindawi, which Wiley acquired in January 2021, is one of the largest fully open access journal publishers, publishing c. 230 journals that deliver openness in scholarly publishing and research communication. Both companies are committed to open access and the wider open research movement [1].

Upholding the values of research integrity and reproducibility is fundamental for responsible research and publishing practice. It is also a shared responsibility for all stakeholders involved in research, including funders, researchers, institutions, societies and publishers [2]. No one stakeholder can do this alone. As publishers, Hindawi and Wiley are committed to the researchers we serve, publishing research that drives and advances knowledge. We facilitate the peer review of scholarly work for improvement and validation, and enable discovery and access to published findings, so that others can test ideas and build on them. We work proactively to ensure that publishing practices reflect the highest level of integrity and ethics [3, 4]. Crucially, we support a range of open research initiatives that improve transparency in research and its publication including open access, open metadata and a number of open research practices that increase the integrity and reproducibility of research. We expand on how some of these initiatives can contribute to supporting reproducibility below.

 

A reproducibility crisis?

A recent scoping report frames the concept of reproducibility as a continuum of practices including reproduction, replication and reuse [5]. To support reproducibility in research, the following elements have been recognised as particularly important: integrity of datasets, data management and availability of data, transparency in the approaches used (pre-registration, methods, analysis tools) and verification (to validate the data and results). While lack of reproducibility has been portrayed by some as a ‘crisis’, it has been explored in only a few disciplines (e.g. psychology). The extent of the problem needs to be investigated much more systematically to understand its severity across different disciplines and to identify any underlying causes. These are likely to be discipline specific and to apply to the Arts and Humanities as well as to the sciences [6]. At this stage, however, it is more helpful to focus efforts on what can be achieved by driving improvements in research quality more generally.

Research is a complex process; published research can contain uncertainties, outright errors or even fraud. To address some of these challenges, Markowetz focuses on how research could be done better, setting out five selfish reasons for individual researchers to work reproducibly [7]. Collective improvements to the research and publishing process can also increase the efficiency and robustness of research findings [8]. Such activities include fostering reproducibility by mitigating cognitive bias, improving methods training and quality of reporting, promoting pre-registration and adopting open research practices, including the publishing of replication studies as well as negative, null and inconclusive research. Implementing these activities also requires collaboration, co-creation and collective leadership, as well as positive incentives and cultural change, so that researchers are recognised and rewarded for behaviours that strengthen research integrity and reproducibility.

 

Rewards and Incentives drive Research Integrity and Reproducibility

The current evaluation system focuses on publications alone, with the venue of publication used as a proxy for the ‘quality’ and integrity of research (and researchers). The perverse incentives that this creates have been well documented [e.g. 9, 10]. If research integrity, good data stewardship and sharing of data are not supported and rewarded, researchers have little incentive to be transparent about their data and findings, especially around what does not work. This is most evident from the estimates of publication bias (in those subjects that have been investigated), which show, for example, that most negative, null and inconclusive results are not published.

Fundamental to improving reproducibility is therefore a wholesale reform of the evaluation system alongside parallel changes to research culture and practice. This requires policy change and alignment at an institutional, national and international level, including awareness raising, education and support among and for researchers.

While publishers are not directly responsible for researcher evaluation, their marketing and author services reflect what is important to researchers. Consequently, journals that are indexed by Web of Science or Scopus, and whose impact factor is increasing, attract more submissions and are financially more profitable than those that are not (for both open access and paywalled journals). There is little incentive, therefore, for publishers to promote the publication of articles that are ‘sound science’ but likely to accrue few or no citations, such as null, negative and inconclusive results. This contributes to the publication bias across the literature.

Valuing reproducibility, and open research more generally, must be owned, driven and championed primarily by the scholarly community. These values will then define and drive the services that publishers ultimately provide. In the transition, publishers should act responsibly and progressively to support relevant changes to research evaluation and practice that make research more transparent and trustworthy and to raise awareness of research integrity issues among authors and the users of the research they publish.

 

How can publishers support collective efforts to increase reproducibility?

Publishers are service providers: our role is to help ensure that the research we publish is of high quality and is as complete, rigorous and transparent an account of what happened as possible. We have a duty of care to maintain the integrity of the published literature so that users can trust the research that is published and build on it. Publishers have a vital role in putting services and checks in place to support ongoing improvements to research integrity and reproducibility, and also in experimenting to find out what works and what doesn’t. These services and checks can be grouped under the following three broad areas: a) open outputs; b) supporting changes to the reward system; and c) implementing new relevant technology and open standards for infrastructure.


A. Open Outputs

Open access. While open access does not guarantee reproducibility, ensuring articles are free to access, distribute and reuse enables greater (and independent) scrutiny of published research. Moreover, if open access is provided and paid for as a service (regardless of business model), then publishers can sustainably support the publication of negative, null and inconclusive results, as long as there are incentives for researchers to submit them. Collectively, Hindawi and Wiley publish 468 fully open access journals.

Open data. Open data means sharing the data that underlie research findings, ideally in line with the FAIR principles, i.e. data that are Findable, Accessible, Interoperable and Reusable. Funders and institutions recognise the value of open data and are increasingly requiring researchers to share data. Open data facilitates scrutiny and verification and so enhances reproducibility. Key to this, however, are the existence of appropriate data repositories and discipline-specific standards for data management and data stewardship. These in turn require researcher training and financial support from institutions and funders for the appropriate infrastructure. Publishers have a responsibility to implement appropriate data sharing policies, make the data available to reviewers during peer review and require that the data underlying the findings are deposited in data repositories that meet appropriate standards. This is very much work in progress, however, and often discipline-specific. As the ecosystem develops, new and unforeseen challenges also arise that take time to understand and, where necessary, mitigate, e.g. around research data ethics [11]. Publishers, including Wiley and Hindawi, are collaborating with a variety of organisations, such as FAIRsharing and several Research Data Alliance working groups, to experiment and create good practice [12, 13].
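To illustrate, the sketch below shows what a minimal, machine-readable dataset metadata record supporting the FAIR principles might look like. It is a hypothetical example in Python: the field names loosely follow the DataCite metadata schema, and the DOI, ORCID iD, repository and URL are placeholders rather than real records.

```python
# A minimal, hypothetical sketch of FAIR-supporting dataset metadata.
# Field names loosely follow the DataCite schema; all identifiers are placeholders.

dataset_metadata = {
    "identifier": {"id": "10.5555/example-dataset", "type": "DOI"},  # Findable: a persistent identifier
    "title": "Replication data for an example study",
    "creators": [
        {"name": "Researcher, A.", "orcid": "https://orcid.org/0000-0000-0000-0000"}
    ],
    "publisher": "Example Data Repository",
    "publicationYear": 2021,
    "rights": "CC-BY-4.0",      # Reusable: an explicit, open licence
    "formats": ["text/csv"],    # Interoperable: an open, standard format
    "url": "https://repository.example.org/datasets/example",  # Accessible: a resolvable location
}

def missing_fair_fields(record: dict) -> list:
    """Return the minimal FAIR-supporting fields absent from a metadata record."""
    required = ["identifier", "creators", "rights", "formats", "url"]
    return [field for field in required if not record.get(field)]

print(missing_fair_fields(dataset_metadata))  # [] means no required field is missing
```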

Registered Reports. A Registered Report is an article format in which the rationale for a study and the proposed methodology (the “study protocol”) are pre-registered and peer reviewed before the research takes place and any data are collected [14]. The published protocol is made open access. The peer review process therefore focuses on the importance of the research question and the quality of the proposed methodology. Although not applicable to some exploratory research and particular disciplines, Registered Reports can improve study design and address a variety of questionable research practices, i.e. selective outcome reporting, “HARKing” (hypothesising after the results are known) and conscious or unconscious selective analysis (“p-hacking”). Wiley publishes 55 journals which offer the Registered Report article format [15].

Transparent Peer Review. For those journals offering transparent peer review, we enable the open publication of an article’s entire peer review process, including the peer reviewers’ reports, authors’ responses, and editors’ decisions. Authors are offered the option to choose transparent peer review, and reviewers have the option to disclose their names alongside their reports. The complete peer review history for each article is freely available irrespective of whether the article is published open access or behind a subscription paywall. Each component of the peer review history has an assigned DOI and is fully citable, giving recognition for the work done. Wiley publishes 75 journals which offer transparent peer review [16].

Preprinting. A preprint is a version of a manuscript that has been shared publicly (often posted on a preprint server or repository) prior to formal peer review or publication in a journal. Hindawi and Wiley support the early and open sharing of preprints before (or simultaneously with) submission to a journal. Preprints enable researchers to disseminate their work quickly and widely. The COVID pandemic has shown that preprints can encourage wider research engagement and can directly impact policy [17]. Although the lack of formal peer review of preprints has caused concern, they can aid research integrity and reproducibility because they are open to much wider scrutiny than formal peer review alone provides. More research needs to be undertaken on the impact and value of preprinting for reproducibility, alongside the development of a consistent terminology and best practice [18].

 

B. Changing the reward system

Support for responsible metrics. A variety of initiatives recognise the need to improve how researchers and scholarly outputs are evaluated, in particular by stopping the use of journal Impact Factors to assess researchers’ individual contributions and moving to a suite of responsible metrics [19]. Hindawi has signed the Declaration on Research Assessment (DORA) [20] and is in the process of developing a journal dashboard to promote different types of metrics and indicators of openness for measuring the reach of primary literature. Wiley is also exploring becoming a signatory, although implementation is more challenging because of the number of journals and disciplines covered, and because the policies of many of its journals are determined by societies that hold a variety of views on research assessment.

CRediT, the Contributor Roles Taxonomy [21], provides a standardised description of the roles typically played by each contributor to a research output, increasing transparency around who participated in the research and what they did. It also facilitates recognition for the research undertaken and is informative to reviewers, editors and readers. Wiley publishes 211 journals which offer CRediT.
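As a simple illustration, the hypothetical Python snippet below records CRediT roles for a fictional author list and renders a contribution statement. The author names are placeholders; the role names are drawn from the 14 roles defined by the taxonomy itself (https://credit.niso.org/).

```python
# A hypothetical example of recording CRediT contributor roles as structured data.
# Author names are placeholders; role names come from the CRediT taxonomy.

contributions = {
    "Researcher, A.": ["Conceptualization", "Methodology", "Writing - original draft"],
    "Researcher, B.": ["Data curation", "Formal analysis", "Software"],
    "Researcher, C.": ["Supervision", "Funding acquisition", "Writing - review & editing"],
}

# Render a human-readable contribution statement for the published article.
for author, roles in contributions.items():
    print(f"{author}: {', '.join(roles)}")
```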

Open research badges on published articles send a signal regarding research integrity and reproducibility by acknowledging published work that contains pre-registered plans, shared data or materials [22]. They act as incentives for researchers to participate in open research initiatives. Currently, over 49 Wiley journals offer open research badges.

Promoting data and software citation will provide a reputational reward system for data sharing through independent citation. Hindawi and Wiley have both been involved in workshops to understand the technology required, e.g. via Scholix [23]. Although the technology is largely straightforward, key barriers to adopting the practice remain: there are few cultural norms around data and software citation, and researchers receive little reward or support for making datasets citable (e.g. by registering DOIs for datasets).
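For illustration, the sketch below shows roughly what an article-to-dataset link looks like when expressed along the lines of the Scholix link-information framework. All identifiers are placeholders; in practice such links are aggregated and exchanged via hubs such as Crossref, DataCite and OpenAIRE.

```python
import json

# An illustrative article-to-dataset link, loosely following the Scholix
# link-information schema (http://www.scholix.org/). Identifiers are placeholders.
scholix_link = {
    "LinkPublicationDate": "2021-09-01",
    "LinkProvider": [{"Name": "Example Publisher"}],
    "RelationshipType": {"Name": "References"},  # the article cites the dataset
    "Source": {
        "Identifier": {"ID": "10.5555/example-article", "IDScheme": "doi"},
        "Type": {"Name": "literature"},
    },
    "Target": {
        "Identifier": {"ID": "10.5555/example-dataset", "IDScheme": "doi"},
        "Type": {"Name": "dataset"},
    },
}

print(json.dumps(scholix_link, indent=2))
```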

 

C. Technology and Infrastructure

There is an increasing role for technology in helping publishers and researchers (whether as editors, peer reviewers or authors) to check and report research as transparently as possible. These tools are extending from routine checks for text similarity to machine learning and artificial intelligence software that identifies discrepancies in image or statistical data which may highlight a problem with reproducibility [24]. Although there will always be a need for human intervention in problematic cases, these tools and services are increasing in sophistication and are beginning to be applied at scale to the literature.
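As a toy illustration only, the Python sketch below flags near-duplicate images using a simple average hash; production tools of the kind referenced above are considerably more sophisticated. It assumes the Pillow imaging library, and the file names are placeholders.

```python
# A toy sketch of one kind of automated screening: flagging near-duplicate
# images via a simple average hash. Requires the Pillow library.

from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size greyscale, then hash each pixel against the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (pixel > mean)
    return bits

def hamming_distance(a: int, b: int) -> int:
    """Count the bit positions where two hashes differ."""
    return bin(a ^ b).count("1")

# Two figures whose hashes differ in only a few bits are candidates for
# manual review as possible duplicates (file names are placeholders).
if hamming_distance(average_hash("figure1.png"), average_hash("figure2.png")) <= 5:
    print("Possible duplicated image - flag for human review")
```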

Providing ethical and technical checks, implementing data sharing policies and helping to support the infrastructure to reward, verify and track a range of interconnected research outputs also involves changing workflows across multiple submission systems and other digital platforms and repositories. The services provided by publishers are currently highly variable, in part because there are no community norms and standards with which to compare services and publishers, but also because there are many bespoke workflows at a journal level that mean any policy and workflow change is challenging to implement and scale in a cost-effective manner.

The importance of having an appropriate and interoperable infrastructure to support these services cannot be overestimated. Crucial to this is a set of commonly agreed persistent identifiers (PIDs) for researchers and organisations, i.e. the Open Researcher and Contributor ID (ORCID) [25] and the Research Organization Registry (ROR) [26], and for different scholarly outputs (DOI or equivalent), alongside high quality and open metadata that provides relevant provenance (to enable attribution) and machine-readable descriptions of the outputs.

An agreed set of scholarly PIDs and open, non-proprietary metadata has many benefits and is key to reproducibility and research integrity. Machines can then connect, link and mine different research outputs and connect them to researchers and organisations, as well as to grants and potentially to different projects. This will enable verification, replication, discovery and the reporting and tracking of research outputs, people, organisations and projects. It will engender more trust in the system and enable new ways to reward outputs (e.g. through data citation) and more open research practices. It will also enable independent ‘research on research’ to provide a more evidence-informed approach to reproducibility and to the effectiveness of workflows and services supporting the scholarly communication landscape.
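To make this concrete, the sketch below queries the public Crossref REST API for one article’s open metadata and extracts the linked ORCID iDs and funder names. The DOI shown is a placeholder (and any real record may lack ORCID or funder data), and the helper function name is ours for illustration, not part of any API.

```python
# A minimal sketch of how open metadata and PIDs let machines connect outputs,
# people and funders, using the public Crossref REST API. Requires requests.

import requests

def linked_entities(doi: str) -> dict:
    """Fetch Crossref metadata for a DOI and pull out the linked PIDs."""
    response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    response.raise_for_status()
    work = response.json()["message"]
    return {
        "orcids": [a["ORCID"] for a in work.get("author", []) if "ORCID" in a],
        "funders": [f.get("name") for f in work.get("funder", [])],
        "references": work.get("reference-count", 0),
    }

if __name__ == "__main__":
    # Placeholder DOI: substitute a real one to run against the live API.
    print(linked_entities("10.5555/example-doi"))
```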

Hindawi is one of the world’s leading publishers in terms of open infrastructure. It is building an end-to-end open source publishing platform (Phenom) that can provide research integrity services at scale. It is also committed to open metadata and ranks among the top publishers in terms of the completeness and quality of the metadata it provides to Crossref [27].

STM’s Standards and Technology Committee has recently appointed a working group to address questions around tools for automatically detecting image alteration or duplication [28].

 

Conclusion and Recommendations

The power of open research initiatives to transform research integrity and reproducibility cannot be overestimated. As publishers, we proactively encourage our journals to adopt open research policies and implement open research practices. However, we still have some way to go. We also recognise the need for training and support for researchers in conducting responsible research. Funders and institutions have a vital role to play in recognising and rewarding researchers who adhere to open research practices that will make reproducible research the norm. It is encouraging to see that universities are beginning to reward openness: for example, UCL explicitly notes that open research practices are rewarded in promotion decisions [29].

Establishing a national committee on research integrity could facilitate stakeholders working together to improve reproducibility collectively by aligning priorities. However, the power of local networks in driving best practice should not be underestimated. For example, The Embassy of Good Science [30], the Center for Open Science [31] and the UK Reproducibility Network (UKRN) [32] connect researchers with various stakeholders to share guidance and change practice. Wiley recently partnered with UKRN to host a series of open research workshops across 10 UK institutions involving over 300 researchers [33].

Hindawi and Wiley are committed to facilitating reproducibility and to piloting a range of different approaches and services to support this. While one approach does not fit all, facilitating discussion, collaboration and co-creation between all stakeholders in the research lifecycle will be key to driving the collective action that is necessary to make research as reliable and trustworthy as it can be [34, 35]. Funders have a vital role to play in this. In summary, we recommend that UKRI work with publishers and other stakeholders to take the following actions:

  1. Commit to reforming research evaluation to reward open outputs and open research practice, including a commitment to work with scholarly societies, institutions and organisations undertaking research to align and implement policy change and report on the outcomes.
  2. Support the widespread adoption of community-based, community-governed persistent identifiers for scholarly communication and work with stakeholders to agree on metadata standards, including open metadata.
  3. Commit to enabling a more evidence-informed ‘research on research’ approach to policy change and services to support reproducibility and research integrity. There is insufficient data (quantitative and qualitative) available about such services to understand the longer-term consequences and to see what works and is cost-effective.

 

References

1.       Open Research isn’t just the future of research communications, it’s the here and now. https://authorservices.wiley.com/open-research/index.html

2.       Munafò, M.R. (2019). Raising research quality will require collective action. Nature 576, 183. https://doi.org/10.1038/d41586-019-03750-7

3.       Hindawi Publication Ethics https://www.hindawi.com/publish-research/authors/publication-ethics/

4.       Wiley Best Practice Guidelines on Research Integrity and Publishing Ethics https://authorservices.wiley.com/ethics-guidelines/index.html

5.       Baker, L. et al. (2020). Reproducibility of scientific results in the EU. https://op.europa.eu/en/publication-detail/-/publication/6bc538ad-344f-11eb-b27b-01aa75ed71a1

6.       Peels, R. (2019). Replicability and Replication in the Humanities. Research Integrity and Peer Review, 4(2) https://doi.org/10.1186/s41073-018-0060-4

7.       Markowetz, F. (2015). Five selfish reasons to work reproducibly. Genome Biology, 16(274) https://doi.org/10.1186/s13059-015-0850-7

8.       Munafò, M.R. et al. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(0021). https://doi.org/10.1038/s41562-016-0021

9.       Moher, D. et al. (2020). The Hong Kong Principles for assessing researchers: Fostering research integrity. PLoS Biology 18(7): e3000737 https://doi.org/10.1371/journal.pbio.3000737

10.   Nuffield Council on Bioethics. The findings of a series of engagement activities exploring the culture of scientific research in the UK. https://www.nuffieldbioethics.org/publications/the-culture-of-scientific-research

11.   Force11 Research Data Publishing Ethics. https://www.force11.org/group/research-data-publishing-ethics

12.   Graf, C. (2018). How and why we’re making research data more open. https://www.wiley.com/network/researchers/licensing-and-open-access/how-and-why-we-re-making-research-data-more-open

13.   Faust, T. (2017). Data availability at Hindawi. https://www.hindawi.com/post/data-availability-at-hindawi/

14.   Chambers, C. (2014). Registered Reports: A step change in scientific publishing. https://www.elsevier.com/connect/reviewers-update/registered-reports-a-step-change-in-scientific-publishing

15.   Publish a Registered Report for an early peer review of your proposed research. https://authorservices.wiley.com/author-resources/Journal-Authors/submission-peer-review/registered-reports.html

16.   Increasing Transparency in Peer Review. https://authorservices.wiley.com/Reviewers/journal-reviewers/what-is-peer-review/transparent-peer-review.html

17.   Fraser, N. et al. (2021). The evolving role of preprints in the dissemination of COVID-19 research and their impact on the science communication landscape. PLoS Biology 19(4): e3000959 https://doi.org/10.1371/journal.pbio.3000959

18.   Ravinetto, R. et al. (2021). Preprints in times of COVID19: the time is ripe for agreeing on terminology and good practices. BMC Medical Ethics 22:106 https://doi.org/10.1186/s12910-021-00667-7

19.   Rose, S. (2020). The future of responsibly evaluating research. https://www.hindawi.com/post/future-responsibly-evaluating-research/

20.   Declaration on Research Assessment. https://sfdora.org/

21.   CRediT Contributor Roles Taxonomy. https://credit.niso.org/

22.   Jones, J. (2018). Connecting research with results: Open Science Badges. https://www.wiley.com/network/researchers/licensing-and-open-access/connecting-research-with-results-open-science-badges 

23.   Scholix: A framework for SCHOlarly LInk eXchange http://www.scholix.org/

24.   Graf, C. (2020). Software to improve reliability of research image data: Wiley, Lumina and researchers at Harvard Medical School work together on solutions https://www.wiley.com/network/featured-content/software-to-improve-reliability-of-research-image-data-wiley-lumina-and-researchers-at-harvard-medical-school-work-together-on-solutions

25.   ORCID Connecting research and researchers https://orcid.org/

26.   Research Organization Registry https://ror.org/

27.   Crossref https://www.crossref.org/

28.   STM STC Working Group on Image Alterations and Duplications. https://www.stm-assoc.org/standards-technology/working-group-on-image-alterations-and-duplications/

29.   UCL Statement on transparency in research (2019) https://www.ucl.ac.uk/research/sites/research/files/ucl_statement_on_transparency_in_research_november_20191.pdf

30.   The Embassy of Good Science https://embassy.science/wiki/Main_Page

31.   Center for Open Science https://www.cos.io/

32.   The UK Reproducibility Network https://www.ukrn.org/

33.   Morris, E. et al. (2020). Are researchers missing out on the benefits that open research publishing can offer? https://www.wiley.com/network/researchers/open-access-week-2020-open-with-purpose/are-researchers-missing-out-on-the-benefits-that-open-research-publishing-can-offer

34.   Research integrity: a landscape study. https://www.ukri.org/wp-content/uploads/2020/10/UKRI-020920-ResearchIntegrityLandscapeStudy.pdf

35.   Mejlgaard N. et al. (2020). Research integrity: nine ways to move from talk to walk. Nature 586, 358-360 https://doi.org/10.1038/d41586-020-02847-8

 

(September 2021)