Written Evidence Submitted by The British Psychological Society (BPS)
(RRE0052)
The British Psychological Society (BPS), incorporated by Royal Charter, is the learned and professional body for psychologists in the United Kingdom. We are a registered charity with a total membership of just over 60,000.
Under its Royal Charter, the objective of the British Psychological Society is "to promote the advancement and diffusion of the knowledge of psychology pure and applied and especially to promote the efficiency and usefulness of members by setting up a high standard of professional education and knowledge". We are committed to providing and disseminating evidence-based expertise and advice, engaging with policy and decision makers, and promoting the highest standards in learning and teaching, professional practice and research.
The British Psychological Society is an examining body granting certificates and diplomas in specialist areas of professional applied psychology.
The Research Board of the BPS welcomes this opportunity to contribute to the inquiry and has provided responses in relation to the following areas:
We are particularly concerned that the culture of “publish or perish” that pervades academia and career progression has created perverse incentives for rapid research outputs and, in some cases, even extensive fraudulent data. We are also concerned that the reliance on metrics, and anticipated metrics, to evaluate research outputs presents considerable barriers to publication for early career researchers and for researchers carrying out replication work.
There is also a great deal of distortion in citations, with null results receiving few citations despite their importance. This contributes to significant file drawer problems, with erroneous findings in the mainstream literature taking longer to be debunked. There is also evidence that manuscripts submitted to journals are being pre-screened through the lens of the citations they might attract, and that articles judged likely to attract fewer citations are being summarily desk rejected regardless of their actual scientific rigour. On a more positive note, however, journals willing to publish null results – and in some cases even actively soliciting them – have gradually become more prevalent, assisted by open access publishing systems. More needs to be done to encourage these.
- research funders, including public funding bodies;
We welcome the development of overarching policies by UKRI[1] clearly detailing the requirements for funded researchers to engage in open research practices. However, it is also important to support researchers with resources and guidance on how to adopt these practices. Pervasive myths persist regarding open science and what adherence to its underpinning principles requires of researchers.
For instance, Houtkoop et al. (2018)[2] surveyed authors of published psychological research and found that public data sharing remains relatively uncommon. Reasons cited for this include the perception that sharing is not common practice, that it requires extra work, that data should ideally be shared only on request, and a lack of awareness of how to share data. Greater incentives and support from HEIs, journals and funders, together with training and resources, were advocated as effective means of addressing these concerns. Moreover, whilst respondents recognised the desirability of data sharing for the discipline as a whole, this was not reflected in attitudes toward their own research.
There is also a need for research funders to recognise that rigorous research takes more time and more resources, and to ensure that grants and timescales for completion actively promote rigorous and transparent research. Funding schemes aimed at supporting replications are also required.
Many methodological reforms also need to be supported by increased investment and infrastructure (data repositories and administrative support for data stewardship), promoted by funding bodies, and rewarded by universities.
However, the scientific community has responded actively to the replication crisis (e.g., via the open science and meta-science movements), and there is increased collaboration across scholarly societies (e.g., between the BPS and the APA on pre-registration).
Psychological science has been a trailblazer, both in the amount of self-reflection and scrutiny of research practices within the discipline but also in the adoption of practices aimed at rectifying these. This includes the move toward pre-registration, data sharing and the increased importance of this within publishing, as well as journals that accept registered reports. However, there is much more to be done, as some sub-fields of psychology are more advanced in relation to open research practices than other areas.
The BPS has produced numerous briefings and guidance for researchers in psychology to engage in open research, and particularly open data. We recognise that in order to encourage the adoption of such practices, support and resources are needed, especially for areas in research where openly sharing data is not straightforward.
Recognising the importance of “as open as possible; as closed as necessary” has underpinned our position and guidance to ensure that we are inclusive in our approach across all research subfields and methodologies.
It is our view that a truly effective response means increasing research integrity by instilling the importance and value of scientific rigour at all levels of psychological qualification, and by increasing replications in the field. This can range from students understanding the importance of pre-registration and the principles surrounding reproducibility, through to practical engagement with these in their own research undertaken as part of their course. For example, Dr Katherine Button’s (University of Bath) project has third-year psychology students collaborate on a replication study for their final-year dissertation. Button recognised that having undergraduate students collaborate on replicating an established research finding, pre-registering the study’s methods and proposed analyses, would not only give students the best start in terms of methodological training, but would also add an invaluable replication attempt to the literature. We are now rolling this out through inclusion in the requirements for our accreditation of undergraduate programmes.
However, some responses have inadvertently led to additional issues, such as a rise in predatory open access journals and a reliance on third-party websites to compensate for a lack of infrastructure and funding at university level.
There are also issues with media reporting that inflates the importance of findings and fails to distinguish carefully between peer-reviewed and non-peer-reviewed publications, often picking up and trumpeting results from pre-print articles.
Government is largely remote from the issues considered here, but it does have a role to play in recognising the resource implications of reproducibility and integrity, and supporting UKRI funding accordingly. It also has a role to play in openly and consistently supporting the value of the research process, the evidence it produces, and evidence-based decision-making – rather than undermining this by decrying ‘experts’ and engaging in highly selective use of convenient results. It would also do well to recognise that its policies on university funding create or at least maintain some of the perverse incentives outlined above.
Departments should be encouraged to develop an Open Research strategy across research, teaching and practice that aims to embed greater scientific rigour. This will also require departmental or university-level infrastructure for the storage of data and materials (where appropriate) and university-wide workshops for doctoral researchers on Open Research practices.
Within academia more widely, there needs to be a shift away from incentive structures that focus on publication metrics (impact factors, volume of publications) and toward rewarding studies with greater rigour. There may be a need to actively reward those who carry out reproducible research and adopt open science practices, or to reward research where the authors are committed to being “as open as possible and as closed as necessary” (for example, through internal funding streams).
More broadly, the move to an Open Research culture is aligned with a move toward a more positive research culture, and there have been various reports on how to achieve this (see the Wellcome Trust report).
Incentive structures need to change, moving the focus away from journal impact factors, potentially by rewarding those who carry out reproducible research and adopt open science practices. Doing this must involve the publishing houses, as there is clear evidence of an increase in editorial desk rejections based on the citations that submitted articles are perceived likely to attract.
It is not clear how this would differ from the excellent work currently carried out by the UK Research Integrity Office (https://ukrio.org/). Any such development should consider the important role that UKRIO has performed, the gaps in support and guidance that it has filled, and how the formation of a national committee would provide a substantially different function.
(September 2021)
[1] https://www.ukri.org/publications/ukri-open-access-policy/
[2] Houtkoop, B. L., Chambers, C., Macleod, M., Bishop, D. V. M., Nichols, T. E., & Wagenmakers, E.-J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science, 1(1), 70–85. doi:10.1177/2515245917751886
[3] https://www.psychologicalscience.org/observer/a-call-to-change-sciences-culture-of-shaming