Written Evidence Submitted by the British Neuroscience Association
(RRE0051)
Introduction
The British Neuroscience Association (BNA) is the largest UK organisation representing and promoting neuroscience and neuroscientists. We have over 2400 members, whose interests cover the whole range of neuroscience, from ion channels to whole animal behaviour to real-life applications in the clinic and beyond.
In 2019, the BNA launched a programme of work aimed at ensuring that neuroscience research is as robust, reliable, replicable, and reproducible as possible – efforts all aimed at strengthening credibility in neuroscience. This was in response to concerns about the rise of irreproducible research across science as a whole, and a strong desire to support neuroscientists in tackling this challenge head-on. As representatives of the neuroscience community, we have a duty to strive for science that is reliable and sustainable, and that will make a difference to our future.
The Committee’s inquiry provides a much-needed opportunity to examine the issue of reproducibility in research more closely, and we welcome the opportunity to respond to it through this written evidence. We have consulted with members of our Credibility Advisory Board in the course of preparing this response. The BNA is also an affiliate stakeholder of the UK Reproducibility Network, with whom we have liaised on this response.
Summary:
The breadth of the reproducibility challenge
Concerns about the reproducibility of research are not limited to any specific biomedical discipline, or indeed to biomedical science itself, but span a wide range of scientific research.[1] However, particular attention has been given to its prevalence within the psychological sciences and neuroscience.[2],[3] This is partly because psychology was one of the first fields to identify the problem, but also because the field itself studies some of the human biases that result in questionable research practices. Within neuroscience, much of the work to identify and tackle issues of reproducibility has been within neuroimaging, though through the BNA’s own survey work we have seen concerns about reproducibility expressed by researchers working across the full range of neuroscience fields: of the 570 neuroscience researchers surveyed, 39 per cent were unsatisfied with reproducibility within neuroscience, with a further 8 per cent very unsatisfied.[4]
Neuroscience research essentially follows the same path as other science-based disciplines within academia, involving experiments to test or to generate hypotheses, collecting and analysing data, and then disseminating the findings via publication in a scientific journal. Progress of research tends to be cumulative, combining findings from many studies over many years.
Increasingly, however, research culture has created a ‘publish or perish’ mentality, in which, in the hope of funding and career advancement, researchers are driven to publish as many papers as possible, as quickly as possible, in journals considered high-impact, with incentives to publish only surprising and novel results.[5] This has in turn meant that replication studies and inconclusive findings have struggled to be valued within the system, contributing to the reproducibility challenge.[6],[7] To combat this, advocates of ‘slow science’ have emerged, arguing that a move towards producing fewer, more reproducible studies may ultimately result in faster progress through a reduction in research waste.[8]
Within the ‘publish or perish’ culture, questionable research practices have emerged that undermine reproducibility. For example: HARKing (hypothesising after results are known)[9]; P-hacking (analysing data in multiple ways to reach significance)[10]; failing to share research data needed for reanalysis.[11] These practices render science vulnerable to biases that skew scientific understanding, contribute to hyped expectations, and jeopardise the translation of research to real-world applications.[12]
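As an illustrative sketch of why these practices matter (the subset-reanalysis strategy, sample sizes and other parameters below are assumptions chosen for illustration, not taken from any study cited here), the following simulation shows how P-hacking inflates false positives: both groups are drawn from the same population, so the true effect is zero, yet trying several arbitrary analyses and reporting only the smallest p-value pushes the apparent discovery rate well above the nominal 5 per cent.

```python
# Illustrative simulation of p-hacking: the two groups come from the SAME population,
# so every "significant" result is a false positive. Re-analysing arbitrary subsets and
# keeping only the best p-value inflates the false-positive rate far beyond 5%.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def hacked_experiment(n=30, n_analyses=10):
    """One null experiment: try several post-hoc subsets, report only the smallest p-value."""
    a = rng.normal(size=n)
    b = rng.normal(size=n)  # same distribution as a: the true group difference is zero
    p_values = []
    for _ in range(n_analyses):
        # an arbitrary "flexible" analysis choice: re-test a random 80% subset of the data
        idx = rng.choice(n, size=int(n * 0.8), replace=False)
        p_values.append(stats.ttest_ind(a[idx], b[idx]).pvalue)
    return min(p_values)

n_experiments = 2000
false_positives = sum(hacked_experiment() < 0.05 for _ in range(n_experiments))
print(f"False-positive rate with selective reporting: {false_positives / n_experiments:.0%}")
# A single preregistered analysis of the full data keeps this rate near the nominal 5%.
```

Preregistration, discussed later in this submission, is precisely the safeguard that removes this flexibility by fixing the analysis plan before the data are seen.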
The overall extent of this within neuroscience, and its resulting impact on research, is difficult to quantify precisely. There is no doubt, though, that elements of poor study design, analysis and data availability combine to undermine reproducibility within neuroscience research, including: studies with low statistical power owing to small sample sizes, which undermines the reliability of findings[13]; flexibility in analysis, with different teams reaching different conclusions from the same neuroimaging dataset[14]; and limited availability of the data, materials and analysis code needed to verify published findings.[15]
Within the US, the Stanford Center for Reproducible Neuroscience was set up in 2015 to help tackle some of the issues above and has focused its expertise on neuroimaging.[16]
Key players in tackling the reproducibility challenge
Research funders, including public funding bodies
In the BNA’s survey of neuroscientists, over half of respondents identified a lack of dedicated funding and a lack of positive incentives as barriers to doing credible, open and reproducible research.[17] Some 49 per cent identified a lack of training, and 40 per cent reported a lack of direction from funders, institutions and regulators.
Research funders can, through what they fund, their conditions of funding, and the training they support, have a major impact in shifting UK research culture towards one that has credibility at its heart. Funders should use their influence over whom and how they fund to explore new ways to better incentivise reproducible research, including by making more funding available for replication studies and by funding researchers with a track record of reproducible research.
Research institutions and groups
“The 'publish or perish' mind-set causes people to take shortcuts. The idea of slowing down science is an easy one to posit but, as an early career researcher, when you know you will not get the next fellowship or post-doc position without a paper in a so-called 'high impact journal' it's difficult to slow down. Better job security would help with this.” – Early Career Researcher in cellular & molecular neuroscience
The above statement from the BNA’s survey illustrates some of the career pressures that limit researchers’ ability to tackle reproducibility issues individually.[18] We need to improve researcher well-being and move away from measuring the value of research and researchers by their publishability rather than their credibility.
Research institutions have a role in ensuring that sufficient training is in place for researchers to make their work reproducible, and in ensuring that their hiring and promotion criteria reward efforts on reproducibility. Research institutions also have a role to play in overturning the typical narrative of how research outcomes are reported, and in developing new narratives that value uncertainties, probabilities and caveats.[19],[20]
Individual researchers
Whilst accepting that individual researchers cannot alone tackle all of the issues undermining reproducibility, we believe that there are a number of actions that individuals can and should take to strengthen the credibility of their work. Within neuroscience, we have encouraged researchers, regardless of career stage, to take simple steps towards this and so help shift research culture within neuroscience.[21]
The BNA has also championed the ‘preregistration poster’ model of presenting research at our past two Festivals of Neuroscience, giving researchers the opportunity to present their plans for research and to receive feedback on them (and so strengthen their work) before collecting data.[22] These posters offer a useful additional step that researchers can take prior to (or alongside) placing a research plan in a registry, helping to counter publication bias and non-reproducibility. Presenters at the BNA’s 2019 Festival found them a useful tool for promoting academic discussion of planned and ongoing research, encouraging open science, and benefiting early career researchers.[23]
There are now more groups and organisations than ever before that individual researchers can engage with to support these efforts, from grassroots initiatives such as the ReproducibiliTea journal clubs[24] through to organisations such as the UK Reproducibility Network.
Publishers
The role of traditional publishers is changing. Preprint servers such as bioRxiv have led researchers to increasingly look at alternatives to traditional publishers, with a steady rise in preprint publishing in the past year in both Covid-19 and non-Covid-19 articles.[25] Other publishing platforms, such as F1000 Research, are seeking to build on this and make preprints part of a broader publishing system with formal, invited, and transparent post-publication peer review – moving towards a model focused on quality standards and fairer article-based metrics rather than the stature of journals and their Journal Impact Factor (JIF).[26]
Moving the publishing sector away from JIF plays an important role in helping to facilitate reproducibility in research. The BNA is a signatory of the San Francisco Declaration on Research Assessment (DORA), which discourages the use of journal-based metrics, such as JIF, as a surrogate measure of the quality of individual research articles, as a means of assessing an individual scientist’s contributions, or in hiring, promotion, or funding decisions. Some publishers actively try to increase their JIF as part of their publishing strategy, ‘gaming’ the system by, for example, avoiding low-citation topics. This practice contributes to the low perceived value of outputs that strengthen credibility and reproducibility, such as replication studies and studies with null results.
Publishers also have a role to play in tackling the resistance within peer review culture that undermines efforts to strengthen reproducibility. One recent assessment of peer review reports from the journal PLOS ONE found that, despite the journal’s efforts to focus comments on the technical soundness of papers, reviewers often continued to comment on novelty rather than on reproducibility.[27]
Governments and the need for a unilateral response / action
Governments have a key role to play in helping to set the agenda towards tackling the systemic issues that act as a barrier to reproducibility within the research environment, and to help support the existing efforts to tackle them.
The UK Government has sought to respond to broad concerns around research culture through its new R&D People and Culture Strategy, issuing a call to action inviting the research sector to work with it on its vision for people, culture and talent in the future. Within the strategy, there are some encouraging proposals, including support for post-graduate research students, though it is noticeably thin on actions to tackle reproducibility specifically.
Whilst funding new research projects, new infrastructure and training for researchers all carry costs, there is a strong fiscal argument that such investment will benefit governments in the long term. Governments have a responsibility to spend public funding responsibly and to achieve value for it, a responsibility that is undermined when research is not reproducible. Research in neuroscience, for example, is essential to better our understanding of the human brain and its diseases, which remains one of the greatest scientific challenges. Reproducibility is essential to ensure that the investment leading to these new insights and understanding is not wasted.
Key actions to improve reproducibility of research in academia
Preregistration of research
Preregistering studies in an independent registry provides a clear, time-stamped account of the experimental rationale, hypotheses, methods, sample sizes and intended statistical analyses, and we encourage researchers to do this via the Open Science Framework.[28] Preregistration provides a key route to guard against practices such as HARKing by creating a transparent record of the research plan prior to data collection, and it allows opportunities for feedback on that plan. This is of clear benefit in removing biases from hypothesis-testing research, and there is also evidence that it is entirely compatible with exploratory research.[29]
Recommendation 1: Preregistration must become the starting point for tackling reproducibility in research and should be required for all hypothesis-testing studies. Funders should add preregistration to their terms and conditions of grant funding.
New publishing models to drive change in academia
The traditional means of publishing research is no longer fit for purpose. Publishers must switch the emphasis away from novel results and towards complete reporting free from biases. Changes to publishing models can reshape how researchers report their work in ways that strengthen reproducibility. One such model is Registered Reports.[30] This involves a two-step publication process: first, the study protocol (rationale, hypotheses, methods and analysis plan) is peer reviewed before any data are collected, with sound protocols given in-principle acceptance; second, once the research is complete, the full paper is reviewed against the accepted protocol and published regardless of whether the results are positive, negative or inconclusive.
This benefits individual researchers, and research as a whole, by ensuring open data and materials and by ensuring that negative or inconclusive findings are published. A recent study found that reviewers rated Registered Reports as more rigorous, and their methods as higher in quality, than similar papers published in the standard publishing format, without compromising on novelty.[31]
UKRI has also recently announced funding for Octopus, which has been designed to replace journals and papers as the primary research record.[32] This is a potentially transformative approach, which would allow researchers to publish Research Problems, Hypotheses, Protocols, Data, Analyses, Interpretations, Translations and Reviews, with each new type of publication having to be linked to an existing one.[33]
Recommendation 2: Publishers need to commit to switching their emphasis away from novel results and more towards complete reporting free from biases. UK governments should explore alternative models for researchers to publish their work.
Transforming incentive structures in research
We need to move away from the ‘publish or perish’ culture and incentivise researchers to make their work as reproducible as possible. This should be reinforced through changes to the Research Excellence Framework (REF). REF 2021 made improvements on open access by requiring that research outputs meet a set of minimum requirements to be eligible for assessment – including that they be deposited in an open access repository within three months of acceptance.
A natural next step for REF would be to expand this set of minimum requirements for eligibility to include preregistration and open data and materials. The Future Research Assessment Programme presents an opportunity for reproducibility to be given a core role in future assessment.
The Hong Kong Principles for assessing researchers, developed by the World Conferences on Research Integrity, also offer one route for institutions to explicitly commit to recognising and rewarding researchers for behaviour that leads to trustworthy research by avoiding questionable research practices.[34]
Recommendation 3: Incentive structures in research need to be rethought to shift emphasis away from researchers’ publications and towards rewarding other contributions to research that contribute to reproducibility – such as publishing alternative research outputs, reviewing, training, and Team Science. The future REF should add preregistration and open data/materials to its requirements for eligibility of research outputs.
Supporting people to support reproducibility
Increasingly, the research sector has produced training materials to help provide researchers with the knowledge they need, e.g. Reproducibility for Everyone[35], and the Framework for Open and Reproducible Research Training.[36] However, these materials currently remain underused, and often rely on individuals devoting their own time and resources.[37]
Two-fifths of researchers who responded to a recent UKRI survey reported a negative impact on research integrity from JIF and other metrics, owing to their influence on funding, hiring and promotion decisions.[38] We believe this helps explain why JIF appears to remain important to a substantial majority of the neuroscience researchers the BNA surveyed.[39] Hiring policies are needed that support the credibility of research and that reject the use of JIF as a direct proxy for research quality and a researcher’s abilities. The University of Bristol, for example, includes ‘producing open research outputs’ among its examples of research output in its academic promotions framework.[40]
Linked to the shift in incentive structures, the research sector also needs to create career pathways that provide the core expertise needed for research groups to manage and make available the vast volume of alternative research outputs, such as methods and data, that they produce.[41],[42] Simply making information freely available does little to enable reproducibility if outputs are not curated in a way that enables their reuse.
Recommendation 4: Institutions should be encouraged to introduce hiring and promotion policies that value reproducibility, Open Science, and other factors which support credibility of research.
Recommendation 5: Institutions should also ensure that researchers can be trained in methods to strengthen reproducibility of their work. New career pathways need to be created alongside this to help provide support for managing and curating the research outputs produced.
Funding the change needed to strengthen reproducibility
The UK Government needs to build on its recent R&D People and Culture Strategy with specific actions on reproducibility, with the upcoming Spending Review providing an opportunity to consider how to sufficiently fund these efforts and incentivise positive change within research. Funding is needed, for example, for replication studies, for training researchers in reproducible methods, and for infrastructure, including the support staff needed to curate and share research outputs transparently.
The UK Government should also look towards new opportunities to help strengthen the credibility of research. It has committed £800m of funding towards the Advanced Research and Invention Agency (ARIA), modelled on the US Defense Advanced Research Projects Agency (DARPA). In 2016, between 3 and 8 per cent of DARPA’s Biological Control programme funding supported shadow teams of scientists who conducted independent validation and verification of the research groups’ work.[43] This provided substantial support for reproducibility and is a model that ARIA could adopt.
Recommendation 6: Funding is needed to support replication studies, training of researchers, and infrastructure including support staff that supports transparency in research. The UK Government should use the Spending Review as an opportunity to kickstart funding to support reproducibility.
Whilst it is too early to speculate on the likely impact of the UK Committee on Research Integrity (UK CORI), early indications, based on the information currently available, are that it will focus largely on issues related to misconduct in research.[44] Some of these will naturally overlap with areas of reproducibility; however, we agree with the Committee’s implication at the launch of this inquiry that there is a danger that reproducibility is largely overlooked within the much broader integrity topic. It is unclear how UK CORI will seek to involve itself in issues of irreproducibility that do not constitute misconduct. If a separate committee focused on reproducibility is deemed unfeasible, we suggest that a sub-committee reporting to UK CORI be tasked with reproducibility in research.
Conclusion
Without challenging the damaging trend of recent decades towards a ‘publish or perish’ culture, the progress of research as a whole risks being subverted – to the detriment of both science and society. The BNA’s remit is neuroscience, and we are therefore committed to strengthening reproducibility in neuroscience; but neuroscience does not exist in a bubble, and the same principles apply across all scientific fields. Whilst individual researchers play a role in efforts to address reproducibility, they need to be supported by funders, institutions, publishers and UK governments if the changes needed to research culture and the broader research environment are to be fully realised.
(September 2021)
[1] Academy of Medical Sciences, the Biotechnology and Biological Sciences Research Council, the Medical Research Council, Wellcome Trust. Reproducibility and reliability of biomedical research: improving research practice; 2015. https://acmedsci.ac.uk/file-download/38189-56531416e2949.pdf
[2] Open Science Collaboration. Estimating the reproducibility of psychological science. Science. 2015 Aug 28;349(6251):aac4716. doi: 10.1126/science.aac4716. PMID: 26315443.
[3] Szucs D, Ioannidis JP. Empirical assessment of published effect sizes and power in the recent cognitive neuroscience and psychology literature [published correction appears in PLoS Biol. 2021 Mar 5;19(3):e3001151]. PLoS Biol. 2017;15(3):e2000797. Published 2017 Mar 2. doi:10.1371/journal.pbio.2000797
[4] British Neuroscience Association. Surveying the neuroscience community on open and reproducible practices; 2020. osf.io/y2t97.
[5] van Dijk D, Manor O, Carey LB. Publication metrics and success on the academic job market. Curr Biol. 2014 Jun 2;24(11):R516-7. doi: 10.1016/j.cub.2014.04.039. PMID: 24892909.
[6] Fanelli D. "Positive" results increase down the Hierarchy of the Sciences. PLoS One. 2010 Apr 7;5(4):e10068. doi: 10.1371/journal.pone.0010068. PMID: 20383332; PMCID: PMC2850928.
[7] Fanelli D. Negative results are disappearing from most disciplines and countries. Scientometrics. 2012;90(3):891-904. https://doi.org/10.1007/s11192-011-0494-7
[8] Frith U. Fast lane to slow science. Trends Cogn Sci. 2020;24(1):1-2. doi: 10.1016/j.tics.2019.10.007.
[9] Kerr NL. HARKing: hypothesizing after the results are known. Pers Soc Psychol Rev. 1998;2(3):196-217. doi: 10.1207/s15327957pspr0203_4. PMID: 15647155.
[10] Head ML, Holman L, Lanfear R, Kahn AT, Jennions MD. The extent and consequences of p-hacking in science. PLoS Biol. 2015 Mar 13;13(3):e1002106. doi: 10.1371/journal.pbio.1002106. PMID: 25768323; PMCID: PMC4359000.
[11] Tedersoo L, Küngas R, Oras E, Köster K, Eenmaa H, Leijen Ä, Pedaste M, Raju M, Astapova A, Lukner H, Kogermann K, Sepp T. Data sharing practices and data availability upon request differ across scientific disciplines. Sci Data. 2021 Jul 27;8(1):192. doi: 10.1038/s41597-021-00981-0. PMID: 34315906.
[12] Munafò M, Nosek B, Bishop D et al. A manifesto for reproducible science. Nat Hum Behav 1, 0021 (2017). https://doi.org/10.1038/s41562-016-0021
[13] Button, K., Ioannidis, J., Mokrysz, C. et al. Power failure: why small sample size undermines the reliability of neuroscience. Nat Rev Neurosci 14, 365–376 (2013). https://doi.org/10.1038/nrn3475
[14] Botvinik-Nezer, R., Holzmeister, F., Camerer, C.F. et al. Variability in the analysis of a single neuroimaging dataset by many teams. Nature 582, 84–88 (2020). https://doi.org/10.1038/s41586-020-2314-9
[15] Rauh, S., Torgerson, T., Johnson, A.L. et al. Reproducible and transparent research practices in published neurology research. Res Integr Peer Rev 5, 5 (2020). https://doi.org/10.1186/s41073-020-0091-5
[16] https://reproducibility.stanford.edu/
[17] British Neuroscience Association. Surveying the neuroscience community on open and reproducible practices; 2020. osf.io/y2t97.
[18] British Neuroscience Association. Surveying the neuroscience community on open and reproducible practices; 2020. osf.io/y2t97.
[19] BMC Med. 2019: Claims of causality in health news: a randomised trial. https://doi.org/10.1186/s12916-019-1324-7
[20] Academy of Medical Sciences. Enhancing the use of scientific evidence to judge the potential benefits and harms of medicines; 2017.
[21] https://bnacredibility.org.uk/academia
[22] https://bnacredibility.org.uk/preregposters
[23] Brouwers K, Cooke A, Chambers CD. et al. Evidence for prereg posters as a platform for preregistration. Nat Hum Behav 4, 884–886 (2020). doi.org/10.1101/833640
[24] https://reproducibilitea.org/
[25] Callaway E. Will the pandemic permanently alter scientific publishing? Nature. 2020 Jun;582(7811):167-168. doi: 10.1038/d41586-020-01520-4. PMID: 32504015.
[26] Tracz V, Lawrence R. Towards an open science publishing platform. F1000Res. 2016 Feb 3;5:130. doi: 10.12688/f1000research.7968.1. PMID: 26962436; PMCID: PMC4768651.
[27] https://blogs.lse.ac.uk/impactofsocialsciences/2021/03/31/reading-peer-review-what-a-dataset-of-peer-review-reports-can-teach-us-about-changing-research-culture/
[28] https://osf.io/
[29] Nosek BA, Ebersole CR, DeHaven AC, Mellor DT. The preregistration revolution. Proc Natl Acad Sci U S A. 2018 Mar 13;115(11):2600-2606. doi: 10.1073/pnas.1708274114. PMID: 29531091; PMCID: PMC5856500.
[30] https://cos.io/rr/
[31] Soderberg CK, Errington TM, Schiavone SR, Bottesini J, Thorn FS, Vazire S, Esterling KM, Nosek BA. Initial evidence of research quality of registered reports compared with the standard publishing model. Nat Hum Behav. 2021 Aug;5(8):990-997. doi: 10.1038/s41562-021-01142-4. Epub 2021 Jun 24. PMID: 34168323.
[32] https://www.ukri.org/news/funding-agreed-for-a-platform-that-will-change-research-culture/
[33] https://science-octopus.org/
[34] https://wcrif.org/guidance/hong-kong-principles
[35] https://www.repro4everyone.org/
[36] https://forrt.org/
[37] Auer S, Haeltermann NA, Weissberger TL, et al. A community-led initiative for training in reproducible research. Elife. 2021;10:e64719. Published 2021 Jun 21. doi:10.7554/eLife.64719
[38] Vitae, UK Research Integrity Office, UK Reproducibility Network. Research integrity: A landscape study; 2020. https://www.vitae.ac.uk/vitae-publications/reports/research-integrity-a-landscape-study
[39] Clift J, Cooke A, Isles AR, Dalley JW, Henson RN. Lifting the lid on impact and peer review. Brain Neurosci Adv. 2021;5:23982128211006574. Published 2021 Apr 11. doi:10.1177/23982128211006574
[40] University of Bristol. Academic Promotions Framework: Version for 2020/21 promotion round; 2020. https://www.bristol.ac.uk/media-library/sites/hr/documents/academic-promotion/framework.pdf
[41] Verheul I, Imming M, Ringerma J, Mordant A, Ploeg J, Pronk M. Data Stewardship on the Map: A Study of Tasks and Roles in Dutch Research Institutes. 2019; DOI: https://doi.org/10.5281/zenodo.2669150
[42] Mons B. Invest 5% of research funds in ensuring data are reusable. Nature. 2020 Feb;578(7796):491. doi: 10.1038/d41586-020-00505-7. PMID: 32099131.
[43] Raphael MP, Sheehan PE, Vora GJ. A controlled trial for reproducibility. Nature. 2020 Mar;579(7798):190-192. doi: 10.1038/d41586-020-00672-7. PMID: 32157231.
[44] https://www.ukri.org/news/promoting-research-integrity-across-the-uk/