Written Evidence Submitted by Taylor and Francis Group




Taylor & Francis Group is an international academic publisher, publishing more than 2,700 journals and over 5,000 new books each year, with a books backlist in excess of 120,000 specialist titles.


Supporting and fostering research integrity is a priority for Taylor & Francis. Alongside other scholarly publishers, we are actively engaged in developing standards and best practice to support the responsible conduct of research and the publication of quality, reliable research outcomes across the disciplinary spectrum. We work with hundreds of UK-based learned societies and professional member associations to support the publication of trusted and high-quality research outputs which represent the best outcomes from their communities, drive consensus, and advance discourse and developments within their specialised fields.


We believe that research outcomes are most impactful when they are open and have resulted from a transparent research process, and we are committed to driving and supporting Open Research. In support of this, we offer a broad range of open access publishing options, from fully open access gateways (including the European Commission’s Open Research Europe platform, through our innovative imprint F1000) to a selection of journals and books across the disciplinary spectrum.


We hope that our inputs to this Call for Evidence will be useful and we would welcome the opportunity to further support the Committee on this issue by developing practical solutions to help resolve any challenges. 




  1. The breadth of the reproducibility crisis and what research areas it is most prevalent in

We understand reproducibility to cover a broad range of themes, including but not limited to computational reproducibility, replication, and the ability for researchers to build on previously published work with faith that it meets high ethical and experimental standards[1]. As such, we encourage the Committee to consider reproducibility in its broadest understanding, based on the foundational premise that all published research findings, irrespective of discipline, should be verifiable, replicable, and credible. Even in the Humanities, where replication may not be considered as valuable a concept, there should still be accountability for research design and potential value from increased trust and impact[2].


This broad scope poses some challenges, most notably the lack of any ‘one size fits all’ approach, given the breadth of research methodologies, study designs, formats, and definitions across disciplines. However, we believe there is much to be positive about. Firstly, actors from across the scholarly communications ecosystem are constantly evolving processes and practices which help to self-correct the record of science, supported by better technologies. Secondly, we are seeing a shift in values within the researcher community itself: to act with greater transparency and integrity, and to collaborate to find global solutions and standards that will foster efficiency and bring about change. We expand on some of these initiatives in our response.


  2. The issues in academia that have led to the reproducibility crisis

If we were to highlight only one cause of issues around reproducibility, it would be the research system itself, which is complex, fragmented, and not optimised to make research outcomes reusable and reproducible. To combat this, all stakeholders have a role to play in establishing a more holistic approach to the research and publishing process. We signpost some promising initiatives and suggest avenues for further development in our response, covering both values-based and operational initiatives.


Causes include but are not limited to:

  • There is much progress still to be made across the whole research system to ensure that research methods, data, and all outputs are shared, described, discoverable, and accessible in ways that maximise their potential to be used and useful. As a scholarly publisher, we are working hard to make the research we publish as FAIR (Findable, Accessible, Interoperable, and Reusable) as possible, but this requires research to be produced in formats, and with the necessary detail and descriptors, to make it possible. This in turn needs coordinated input and guidance from funders and research institutions, and support and training for researchers and authors.


  3. The role of the following in addressing the reproducibility crisis

As a scholarly publisher, we are involved in and/or supportive of a range of initiatives that have emerged to help address many of the challenges and opportunities that we now face in making research reproducible and assuring trust and integrity. We set out below, grouped by theme, illustrative examples of key current initiatives designed to address this issue, alongside suggested actions and the benefits they could bring, which we hope will be useful to the Committee.





Organisations & initiatives designed to deliver best practice / guidance

Detail / existing initiatives:

  • The EQUATOR Network, which brings together reporting guidelines for a range of study types, such as the ARRIVE guidelines (Animal Research: Reporting of In Vivo Experiments).
  • The FAIR data principles (ensuring that research data is Findable, Accessible, Interoperable, and Reusable).
  • The RISRS Project, which aims to stop the inadvertent spread of retracted research material, as well as improving retraction taxonomies and processes.

Suggested action: All stakeholders could endorse and support these and other initiatives (e.g. pre-registration could be a condition of publication in key fields and journals), and institutions may wish to agree a clearer, standardised governance framework for handling ethical issues.

Benefit: Global alignment and ways of working, reducing complexity for researchers.


Engagement with researchers, funders, and research integrity offices earlier in the research process

Detail / existing initiatives:

  • Pre-registration of research, including protocols, study design, and methods (see protocols.io).
  • Direct engagement with funders to support publication of all their research outputs (e.g. combining publication of peer-reviewed and non-peer-reviewed outputs, such as End of Grant Reports available within NIHR Open Research).
  • Raising the profile of studies on research culture and practice (e.g. the Research Gateway within F1000Research brings together cross-discipline scholarship on all aspects of the research ecosystem).
  • Partnership on training, for example supporting Data Stewards at institutions, as well as support for and publication of Data Management Plans which link to published research outputs.

Suggested action: Incentivise researchers to share all outcomes of their research process in real time.

Benefits: More visibility of null and negative results; reduced likelihood of intentional or unintentional manipulation of outcomes.


Alignment on global standards

Detail / existing initiatives:

  • Clearer lines of sight from origins to outcomes of research, supported by persistent identifiers (e.g. research funding source, institutional affiliation). Existing identifiers include but are not limited to:

Suggested actions: All stakeholders to align on and endorse common persistent identifiers. Funders could consider aligning on a common identifier for grant funding.

Benefit: Fosters trust in research through improved transparency and audit trail.


Innovative approaches to sharing/publishing research

Detail / existing initiatives:

  • Gateways on the F1000Research and F1000 platforms are one way the boundaries of journal publishing are shifting, opening up alternative venues for research with a specific organisational or mission-specific focus (e.g. the Tree of Life Gateway on Wellcome Open Research and the NC3Rs Gateway on F1000Research).
  • Provision of infrastructure for new formats beyond regular published research articles, such as:
  • Journals are increasingly requiring not only that data and code are made available (though there is great variation in what that means in practice) but also that full reproductions are conducted.
  • Some journals and platforms, including F1000, are moving away from using supplementary materials, instead encouraging the active sharing of the comprehensive research output, such as entire methodology articles dedicated to new and improved techniques within the experimental sciences.

Suggested actions: Expand research assessment criteria to capture non-conventional publication formats, and advocate for increased use of publication formats which improve research transparency (e.g. methodologies, study designs, lay summaries, video, and graphical abstracts). Increase signposting of research which has been peer reviewed versus that which has not, and conduct outreach to journalists reporting on the results of preprints, to solidify understanding. Build partnerships with repositories. Ensure clear and common standards for resources.

Benefit: Better archiving of science, which provides a much better way to evaluate submitted papers and enables better detection of fraud[6].


Experimentation & pilot studies

Detail / existing initiatives:

  • Some journals are investing in efforts to verify that the materials they publish are indeed reproducible (this includes human-based reproducibility checking in partnership with bodies such as CASCAD and the Odum Institute[7]).
  • Open peer review is a further method by which integrity in published research outputs can be established, since making the discussions within the peer review process available to readers can go some way to improving trust in the final published research output.
  • F1000 uses the CODECHECK, Sci Source and Ripeta software tools to automate some manuscript quality checks, informing the qualitative analysis of manuscripts undertaken by peer reviewers.

Suggested actions: Explore support for reproducibility checks (whether manual or automated), and include ‘Reproducibility’ as a category of assessment within future research assessment frameworks. Promote open peer review as one means to improve trust in published research.

Benefit: Verification of reproducibility, with greater oversight of the work that has gone into published research from submission to eventual publication.


Training (data skills)

Detail / existing initiatives:

  • Taylor & Francis provides several training modules to early career researchers, focusing on Open Research, Publishing Ethics, and Writing for Scholarly Audiences. We also employ a dedicated Publishing Ethics and Integrity team, serving as a centre of excellence within Taylor & Francis and providing expert advice and support on research integrity, publishing ethics, and policy matters.
  • Similar teams working across funding agencies and executive bodies (e.g. UKRI) could help promote research integrity and reproducibility best practice.
  • There is also scope for new roles on journal editorial boards and among publishers’ staff to advise on and specialise in reproducible and open science (e.g. dedicated Open Science and Research Integrity teams).

Suggested action: Establish subject-specific training programmes for researchers and ensure appropriate support functions are in place.

Benefit: Greater awareness of potential research integrity issues, including understanding of how to resolve potential conflicts of interest.


Improving diversity

Detail / existing initiatives:

  • Publishers are working to promote greater diversity across journal editorial boards and to introduce ways for researchers and communities from all walks of life to participate in research.
  • C4DISC is one of the most widely recognised initiatives aiming to promote diversity across the scholarly communications workforce, as is the Joint Commitment for Inclusion and Diversity in Publishing.

Suggested action: Ensure broad representation in decision-making bodies (including steering committees, promotion panels, and editorial boards).

Benefit: Reduction in unconscious bias and blind spots.


Incentives and rewards

Detail / existing initiatives:

  • The San Francisco Declaration on Research Assessment (DORA)[8] has as its mission moving to a system in which good research conduct and practice are as valued as groundbreaking research[9].
  • The Hong Kong Principles, a series of indicators developed “with a specific focus on the need to drive research improvement through ensuring that researchers are explicitly recognized and rewarded for behavio[u]rs that strengthen research integrity.”

Suggested action: Consider supporting and aligning with existing initiatives and principles.

Benefits: Makes it more acceptable to admit mistakes in research and to retract work; incentivises responsible research conduct, leading to improved integrity.



  4. What policies or schemes could have a positive impact on academia’s approach to reproducible research

Researchers globally look to the UK for leadership on all areas of research and development, recognising the quality of our published research outputs, and the stringent frameworks already in place for ensuring the accuracy and integrity of our research practices. We believe that the adoption and endorsement of additional programmes and initiatives by the UK Government (about which more below) offers the opportunity for the UK to expand its reputation and leadership around research excellence. We have identified four key areas where policy intervention could positively influence and improve the reproducibility of research. These are open research (science), training, rewards and incentives, and global standards. We offer some suggestions below around specific areas for policy input.


Open research

Key recommendation: investing in appropriate and interoperable infrastructure.

There is a global movement towards more open and collaborative ways of working across science and research more broadly (‘open research’), the aim of which is to democratise access to knowledge and enable collaboration in ways that accelerate the potential for research to have impact. Open research encourages researchers to share the products and outputs of research throughout the research lifecycle (e.g. study design and methodology, research data and software), in addition to negative and null findings, supporting robustness and transparency in the way research is conducted and thereby enabling trust in research. We note that UKRI has already made significant strides in promoting Open Research, and the Committee would be strongly placed to use its influence to further support and incentivise Open Research activities as a means to improve the discoverability, usability, reproducibility, and integrity of research, specifically by ensuring funding is available to introduce and develop supporting infrastructure (e.g. persistent identifiers).



Training

Key recommendation: targeted investment in training around data science skills, including analysis, curation, and formatting, to enhance the reproducibility of research outcomes across the disciplinary spectrum.

Many researchers are willing to share their data, but lack confidence in data skills, especially in ensuring that their data is FAIR (Findable, Accessible, Interoperable, and Reusable). Research Offices at institutions could be provided with dedicated funding for training around responsible research conduct and data stewardship. Support could also be directed to community-led groups (for example the Research Data Alliance and its Interest Group on Data Stewardship).


Alignment with global standards and best practice

Key recommendation: evaluate and align with existing initiatives; formally adopt or endorse principles.

The global scholarly communication and research communities have developed a significant body of guidelines and best practices that apply to general research practice and communication, as well as to specific fields (many of these are outlined in question 3 above). These resources have been created to foster quality, reliability, and rigour in research and the outcomes of the research process. Many of the guidelines developed have gained significant support within the community as they outline common and interoperable standards, promote transparency, outline common methodologies and checklists, and provide tried and tested examples of best practice that are valued by researchers. These have typically been developed with the input of key stakeholders including researchers, learned societies, funders, institutions, and publishers.


Rewards and incentives

Key recommendation: revise rewards and incentives structures in the next REF to reward those who support and further research integrity.

Research integrity often relies upon invisible work carried out by many researchers in the service of their communities – for example peer review, mentoring, training colleagues, and taking up advocacy roles within their institutions. We applaud UKRI for its focus on this area, including advocacy around the narrative CV. Good progress is already being made, but there is an opportunity to go further, faster.


The upcoming evaluation of the Research Excellence Framework (REF) is an ideal opportunity to revise incentives structures to reward research integrity and the responsible conduct of research. The Hidden REF may be a useful source of inspiration.


  5. How establishing a national committee on research integrity under UKRI could impact the reproducibility crisis


Key recommendation: ensure that any committee includes broad representation.

We suggest that any national committee on research integrity aims to align with global initiatives, to work with all stakeholders, and to adhere to standards that have been developed or are evolving as the result of cross-stakeholder inputs (see response to the question above for some examples). The national committee may find that it shares many aims with the UK Reproducibility Network (UKRN).


For the committee to be able to make evidence-based interventions that benefit all, we suggest that it has broad representation to ensure that its direction can be informed by a range of views. This diverse representation should cover academia (including cross-disciplinary, early career, minority ethnic, disabled, and other marginalised groups), as well as including other stakeholders in the research sector (institutions, funders, industry, publishers, and service providers).


In our experience, policy measures that incentivise and reward researchers are more effective in the longer term than prescriptive or punitive measures (such as sanctions). We have found that policy interventions also have more success when they are supported by appropriate resourcing, be that funding, training, or guidance. For these reasons, we believe any national committee i) needs to have broad representation, ii) should cover the disciplinary spectrum, and iii) should ensure that any policy direction is supported by appropriate resources and funding. We also advise that the committee iv) reviews existing best practice and standards, with an aim to align with these wherever possible to facilitate improvements to global research culture.



(September 2021)


[1] Rik Peels & Lex Bouter (2021): Replication and trustworthiness, Accountability in Research, DOI: 10.1080/08989621.2021.1963708

[2] https://www.cwts.nl/blog?article=n-r2v2a4&title=the-humanities-do-not-need-a-replication-drive

[3] ‘Precarity means top students quitting academia, warns OECD expert’, Times Higher Education, 8 June 2021. Accessed 18 August 2021.

[4] Chalmers, I. et al. (2009) Avoidable waste in the production and reporting of research evidence. The Lancet, Volume 374, Issue 9683, 86 – 89.

[5] An example reported in this article: Jan C. Frich, Kirsti Malterud & Per Fugelli (2006) Women at risk of coronary heart disease experience barriers to diagnosis and treatment: A qualitative interview study, Scandinavian Journal of Primary Health Care, 24:1, 38-43, DOI: 10.1080/02813430500504305

[6] Guédon, J.-C. (2014). Sustaining the ‘Great Conversation’: The future of scholarly and scientific journals. In B. Cope & A. Phillips (Eds.), The future of the academic journal. (2nd ed., pp. 85–112). Oxford, England: Chandos Publishing. https://doi.org/10.1533/9781780634647.85

[7] Outcomes from the ACM workshop reported at https://reproducibility.acm.org/2021/06/29/reproducibility-publishing-taking-the-pulse/

[8] https://sfdora.org/

[9] http://www.leidenmanifesto.org/