Written Evidence Submitted by MQ: Mental Health Research

(RRE0063)

 

MQ: Mental Health Research is a UK-based international funder. MQ’s vision is to create a world where mental illnesses are understood, effectively treated and one day preventable. MQ does this through directly funding research, identifying priorities and enabling others to effectively support mental health research. MQ convenes a variety of stakeholders, enabling the sector to unite around the common goal of addressing the pressing problems of mental health.

 

Reproducibility and research integrity are of prime importance to medical research charities such as MQ for three reasons. Firstly, our purpose is to fund research that will change the lives of people affected by mental illness. MQ’s research involves any and all disciplines, and can be at any point on the translational pathway. However, to receive a grant, projects must be reasonably expected to change people’s lives.

 

Secondly, research projects do not take place in isolation: they stand on the shoulders of important preceding work and in turn lead to important further investigations. It is therefore of prime importance that research data and findings are robust and verifiable, providing a firm foundation for subsequent work. Funders such as MQ, which are intimately invested in outcomes for people affected by mental illness, are consequently intimately invested in ensuring that the research they fund is robust and replicable, and that it both builds on and leads to robust and replicable research.

 

Thirdly, the time lag for research innovations to reach the people affected is too long. This lag arises from inefficiencies in the system, compounded by the scandal of waste in biomedical research, which is substantially related to irreproducible research (Chalmers & Glasziou, 2009; Munafò et al., 2017). Funders have an obligation to those who finance them to ensure that money is well spent. Moreover, funders such as MQ have an obligation to donors to invest for impact and to ensure that innovations reach people as rapidly as possible. It is unacceptable for people to remain in distress and despair because solutions remain within the confines of wasteful research practices.

 

Consequently, MQ is responding to the Science and Technology Committee’s call for evidence for its Reproducibility and Research Integrity inquiry. In particular, this submission addresses topics 1, 3 and 4: the breadth of the crisis, the roles of different agents in the field, and policies and schemes that could have a positive impact.

 

Firstly, the breadth of the crisis. Fundamentally, the reproducibility crisis jeopardises the entire life sciences sector and threatens the UK’s position as a global leader in science. The sector as a whole has an obligation to its funders, be they government agencies or members of the public donating hard-earned money to organisations such as MQ, to reduce waste and maximise the outcomes and impact of research.

 

This is of growing importance in a context where research assessment of universities and government-funded research is increasingly focused on impact. To put this in context, around one-third of all research in the UK is funded by members of the Association of Medical Research Charities, most of whom fundraise directly to do this and so report back to donors. Irreproducible research makes this model unsustainable.

 

Robust and reproducible research is essential in order to translate research findings into meaningful change for individuals, communities, and societies. Hence it is a requisite for continued funding to the sector, whether from government, industry, or charitable donation.

 

Secondly, different agents in the sector have equivalent responsibility, although different roles, in solving this crisis. Implementing the following policies and practices requires an integrated response, with each agent taking a role that is a critical piece of the jigsaw puzzle.

 

The majority of people who enter science do so motivated by curiosity, the pursuit of new knowledge, and the benefit of mankind and/or the planet. Bad practice, where it occurs, is often the result of poorly constructed incentive systems, pressure from other sources, or simply lack of knowledge. It is of prime importance that recommendations and changes are made in recognition that the vast majority of people have admirable motivations.

 

The key headline is that there is a need for a reproducibility governance framework analogous to that for ethics. The research community identified ethics as a priority, and so instituted the system of institutional and regional ethics committees, which works. The same is needed for reproducibility. Indeed, is it ethical to conduct research that is not robust and reproducible?

 

Importantly, prime responsibility lies with institutions for ensuring the good practice of those on their payroll and in their buildings. Robust practice needs to be fundamental to career progression and to obtaining the permissions requisite for undertaking projects. Importantly, this moves decisions about robust research practices outside the remit of a sole lab head, and increases the responsibility of peers and senior leaders.

 

The motivations of most early-career researchers, of scientific curiosity and a desire to conduct good science, need to be fuelled and enhanced by their team leaders. Putting responsibility with universities, line managers, lab heads, and leaders is essential in a context where much of the UK’s research is funded through centre-based, institutional funding.

 

In keeping with the UK government’s contemporaneous review of research bureaucracy, chaired by Adam Tickell, this must not add workload or bureaucracy to an overburdened system. What is needed is a reproducibility governance framework that does not invoke additional bureaucracy.

 

In reality, transparency is one of the greatest tools of the reproducibility agenda. This is because post-hoc hypothesising is one of the major threats to the subsequent impact of a project and its findings. Transparency provides an a priori check of hypotheses, and so removes post-hoc analytic biases. Moreover, simply knowing that others can scrutinise the detail of one’s work leads the overwhelming majority of people to improve their own practices towards recognised good practice (for more on what good practice looks like, see Munafò et al., 2017).

 

One such solution, requiring no additional bureaucracy and based on transparency, is to publish submissions to ethics committees, in an approach similar to that of registered reports.

 

The sector has responded to the open science agenda by assigning “badges” denoting the extent of open science practices, such as the availability of research outputs including data, code and materials (for more, see https://www.cos.io/initiatives/badges). The same is needed for reproducibility, to be used across funding applications and publications.

 

Journals have a principal role: editors have a responsibility to ensure that reviewers are not allowed to make requests that violate reproducibility principles, e.g. post-hoc hypotheses or unplanned comparisons (unless clearly labelled as exploratory). The field is littered with anecdotal stories of reviewers requesting such changes, and of researchers being forced to make them in order to see the work published. (Due to the lack of transparency of the review process, these are only rarely published.) Reviewers should receive clear guidelines about the nature of requests they can and cannot make of authors, with enforcement responsibility lying with the editors who pass on requests, and with the chief editors who supervise journal editors.

 

Journals and funders have a key role in aligning assessment processes and, where possible, using single-step assessment. A separate, additional quality check prior to publication promotes bad science, adds bureaucracy, and places an extra load on people’s time. This model of aligned assessment has been trialled with great success. Importantly, exploratory findings are in no way inhibited from publication, but are transparently labelled as such and as requiring replication under hypothesis-testing conditions. This requires funding schemes and journals to partner from the point at which funding rounds are conceptualised. Similarly, journals could be required to accept UKRI-, NIHR- or AMRC-accredited research assessment and study registration as sufficient evidence of research quality; journals’ quality decisions would then be based on whether the research has been conducted according to the approved plan. Decisions on topic fit would naturally still sit with the journal.

 

Funders have a role in incorporating the assessment of reproducibility alongside other aspects of scientific merit and practices such as Patient and Public Involvement and Engagement (PPIE). It should be a standard part of assessing the calibre of the science. Funders have an obligation to follow best practice in research quality review, and members of the Association of Medical Research Charities are required to do so. Similarly, funders have a responsibility to incorporate journals into the assessment process wherever possible in order to streamline it.

 

The sector as a whole needs to focus more on finding the right answers to important questions. Too often the focus is on novelty, excitement, and discovery, resulting in a culture that undervalues finding out what does not work and what can be discounted.

 

In sum, the majority of people in the life sciences sector have admirable motivations; harnessing their vision and goodwill is a key tool in resolving this crisis. Prime responsibility lies with institutions and employers to ensure that researchers conduct research that is robust, reproducible and of integrity. A reproducibility governance framework analogous to that for ethics is required, for example through publication of ethics submissions. Journals must ensure that editors do not pass on requests from reviewers that violate core principles of reproducible research. Journals and funders must align to deliver single-step assessment of projects for funding and publication.

 


Chalmers, I., & Glasziou, P. (2009). Avoidable waste in the production and reporting of research evidence. The Lancet, 374(9683), 86–89. https://doi.org/10.1016/S0140-6736(09)60329-9

Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 0021. https://doi.org/10.1038/s41562-016-0021

(September 2021)