Written Evidence Submitted by protocols.io



Emma Ganley, Director of Strategic Initiatives, protocols.io 

Lenny Teytelman, CEO, protocols.io



We are responding to this inquiry into reproducibility and research integrity because this is a core element of the mission of protocols.io. We provide a tool and platform for researchers to develop, share, collaborate on, optimize, and publish their research methods and protocols. All public content on protocols.io is free to read and share, and all published content is available open access under a CC-BY license. Private space on the platform is limited, so our business model allows individual researchers, universities, or funders to purchase a license granting unlimited private use for their researchers.

In terms of moving towards more reproducible research of trusted integrity, much effort to date has focussed on achieving more open access publications and on improving data and code sharing practices. However, these efforts have often overlooked the need for method details to be available alongside the data and the article. For data and analyses to be completely understood, and for research to be truly reproducible, comprehensive method details should be shared as a standard part of the research process.




Awareness of issues relating to research reproducibility has grown in recent years. Well-known examples like the Cancer Research Reproducibility Initiative have shone a light on the extent of the problem. That project sought to replicate the experiments from fifty published cancer research articles. Ultimately, the effort was halted after replication of just eighteen studies had been attempted, largely because method details were missing from the published articles. The scientists involved called the effort an arduous slog: the time and money spent contacting authors and trying to gain sufficient insight into the methods exceeded what was reasonable, and still experiments could not be replicated. This inability to access the required information highlights the need for different approaches to ensuring research reproducibility. Similar efforts have investigated data sharing practices; while these seem to be slowly improving thanks to more policies from funders and journals, they are still far from where they should be. Other efforts to examine reproducibility across research sectors have shown that failure to record or convey method details is not a concern in just one discipline but a systemic concern for research as a whole.




A multitude of factors within academia have led to these research reproducibility issues: pressure on researchers to publish; publication bias; concern about scooping, leading to reservations about sharing data or approaches; poor infrastructure; lack of funds to support the adoption of tools or training; and insufficient policies or requirements from institutions, funders, and publishers to ensure reproducibility. Even where policies exist, insufficient consequences or a lack of monitoring can render them ineffectual. Across the board, researchers need training in how to adopt best practices for reproducibility, and support must be provided so that feasible mechanisms and know-how exist to achieve this. Budget structures within universities can also make it hard to provide some tools to researchers. There are clear mechanisms for paying open access fees at journals, but a tool like protocols.io, if made available to all researchers in a university, could be transformative; yet there often appears to be no budget within the university to cover such tools, even though they are cheap compared with open access charges and would allow both record keeping during the research process and open publication of all methods by all of the university's researchers.




All stakeholders can help address this lack of reproducibility: 

(1) Funders - must implement policies requiring data, code, and methods sharing (these must be supported financially, monitored, and have consequences for failure to comply). Funders can also help with support for the technical infrastructure elements, tools, and platforms that are needed to achieve reproducible research.

(2) Journals - Reviewers can be asked to assess reproducibility (availability of data, analysis, materials, methods, etc.), and editors, journals, and publishers should also play a part by implementing policies that set the expectation that all of these elements be made available as comprehensively as possible. Journals must also explore how to align with more modular approaches, with (i) methods existing in separate repositories suited to capturing a step-by-step protocol alongside equipment and materials, like protocols.io; (ii) data being available in an external repository suited to that datatype, such as Dryad or Zenodo; and (iii) code being available in relevant repositories, whether GitHub, Zenodo, or Code Ocean.

(3) Institutions - must ensure that researchers have access to the tools and infrastructure needed for all research efforts and outputs to be captured digitally over time. Institutions should also set an expectation that their researchers make their research as reproducible as possible, making use of the various mechanisms available (data, methods, and code sharing, etc.). Alongside tools and infrastructure, institutions need to ensure that teaching and training in best practices for reproducible research are available, and that researchers are expected to complete some minimum amount of this training (in the same way that ethics, health and safety, security, and diversity, equity, and inclusion trainings are provided and expected to be attended). Finally, institutions must implement career assessment criteria that reward the generation of reproducible research outputs: they can factor in outputs such as the availability of datasets and published protocols, as well as research articles and the journals in which they are published.

(4) Governments - must align policies such that all stakeholders are compelled to implement and adhere to these policies and expectations, and should commit to development, implementation, and support for core infrastructures that will facilitate these efforts.

(5) Other organisations, such as tool and infrastructure providers, should work with the community to ensure that access is available and that any developing norms for integrations are kept in mind. They should also work with funders, institutions, publishers, and individual researchers to support training efforts, so that the most is made of the technological advances available to all.

(6) Last, but most certainly not least, researchers must adopt better general research practices and approaches that will ensure the reproducibility of their efforts (both by themselves and by others).







Bringing together stakeholder representatives to align policies and carry out feasibility studies would allow a progressive, staged approach to be agreed. Not everything needs to happen at once, but if all stakeholders could agree on a staggered roll-out of policies, expectations, and consequences, this would be more effective than scattershot independent approaches taken by separate stakeholders on their own timeframes. We are confident that, over time, alignment of efforts would be far more powerful and effective in improving the quality and reproducibility of research. It would also result in economic gains, both in the value of the research being funded and in the reduction of redundant effort across the various stakeholders.


(September 2021)