Written Evidence Submitted by the Association of Research Managers and Administrators



About ARMA

ARMA (UK) is the professional association for research management in the UK. We represent research leaders, managers and administrators, offering professional development and opportunities to build networks, knowledge and skills.

With over 2,800 members, we work with UK-wide and international bodies to influence and understand the changing research management agenda, translating the impacts of that change for our members. We work with others to promote public trust in research, communicating its benefits and value. Most of all, we work to enhance research management as a professional partner in the UK research environment.

We provide a wide range of membership benefits and development opportunities, including a comprehensive training and development programme, a 3-day annual conference, ATHE-accredited qualifications and a significant online resources library.


Our Response



What do you consider is the breadth of the reproducibility crisis and in what research areas is it most prevalent?


ARMA members felt that the crisis is broad, spanning the full range of disciplines, although particular focus was given to research involving human interaction, the Biosciences, Social Sciences and the Arts, as well as emerging interdisciplinary areas such as AI for data interrogation and analysis, and machine learning.



What are the issues in academia that have led to the reproducibility crisis, i.e. its causes?


Research culture: institutional research culture, including a lack of training at all levels in robust research design and methodology; a lack of understanding of open data and open science; and colleagues unwilling to share data or collaborate with others.

Bias: the development of biases in certain disciplines, especially in relation to past studies and working practices.

Funding pressure: for example, external funding of research projects with short durations (rarely more than five years, with reporting often required too early in the project's life).

Publication pressure: it was noted that there is little reward for publishing confirmatory studies compared with novel studies, including limited venues where reproduction studies could be published. Too much weight is also placed on publication, leading to early publication, "smallest publishable unit" papers and an excessive volume of papers that makes it hard to keep current, and thus to less testing of others' work.

Resources: a lack of resources and time was indicated as a factor, in particular having time to scrutinise data.

Promotion / tenure: the linking of research successes with promotion/tenure and higher salaries demands quick results. There is constant pressure to do more with less – more publications are needed to establish an academic career at a time when there is less funding available. This creates a need to publish at all costs and as quickly as possible. Academics don't exist in a perfect bubble: they have all the usual issues to deal with, such as mortgages to pay, but their income is linked to their ability to continually publish 'unique' findings. The same researchers are also often bearing heavier administrative and teaching burdens than might have been the case in previous generations. We are now expecting them to work openly – sharing other outputs from their research to aid reproduction and reuse – without reducing the other burdens on them.

Lack of infrastructure: there has also been (until very recently) lack of accessible infrastructure to support the other actions needed to make research more reproducible – repositories for data, platforms for sharing methods and techniques, platforms for the sharing of preregistration documentation.

There are very limited stock and sample centres available for the preservation and sharing of the materials needed to reproduce some types of studies.

Only a few fields of research have standardised metadata standards for documenting research processes. In fields where these standards don’t exist, researchers each document their research in their own, idiosyncratic way leading to differences in interpretation about procedures and processes.

There is still no obligation across all fields and funders to share the data which underpin research. Without this it is almost impossible to determine if the findings of a study are reproducible.

REF pressure: pressure to publish and produce impact rather than to focus on the integrity of the dataset.




What would you say is the role of the following in addressing the reproducibility crisis: research funders, including public funding bodies; research institutions and groups; individual researchers; publishers; and Governments, and the need for a unilateral response / action?


Research Funders: Clearer funder requirements in relation to reproducibility, with an increased focus on Responsible Research and Innovation (RRI).

Providing funding for reproducibility studies and for RRI training.

It was noted that funders often take a somewhat scatter-gun approach with their schemes (and this is understandable), covering early-career, short-term funding in large numbers (e.g. Leverhulme/BA Small Grants), alongside longer-term, larger research grants for experienced researchers, which require ambitious programmes and appear to be "result-seeking". Funders also have stakeholders to convince that funding is being invested appropriately and skilfully.

Funders should ensure adequate funding for time, resources and equipment, and commercial funders should focus on the science rather than profit margins.

Funders could indicate that a certain percentage of research funds are expected to be spent on data management and open research processes. This would reassure researchers who worry that ‘expensive’ processes which are not directly related to the acquisition and analysis of new data would be judged harshly by grant review committees.

Government should understand the implications of inadequate funding for research councils, institutions and researchers.

Research Institutions:

Improving research culture, including providing training for academic and professional support staff.

Peer review – internal / external (including funders and publishers) scrutinising datasets as well as the publication (although this would be resource intensive and potentially costly).

Research institutions could move away from rewarding researchers based on the number of publications, and ensure that responsible-metrics policies and processes are embedded in the research culture.

One respondent compared institutions / groups to premier league football teams, especially when the REF is 2 or 3 years away, in contracting well-known high achieving researchers to work for them. These new assets are treated with special consideration but expected to "make a difference" early on in their appointments.

Institutions should have robust research ethics processes and should require researchers to share data, software and equipment.

Research institutions should invest in infrastructure (with support from funders / publishers) which makes the sharing of all research outputs easier for researchers. Funders could also help with this through block grants, in a similar way to current arrangements for open access.


Individual Researchers:

It was felt that researchers alone would not have enough support to change the culture of an organisation. In a culture that does not fully support reproducibility as a key standard, researchers may risk their careers.

There needs to be greater transparency of methodology, biases, negative results and the reporting of findings; this is contradicted by a system that requires successful results within a short timeframe.

Whilst an extreme example and not a perfect fit, the current Theranos court case is a very good demonstration of why reproducibility and scientific integrity need to be actively practised.

Individual researchers do bear a responsibility to work as openly as possible and to make their research process as transparent as possible, but the reality is that if they are not supported to do this, and rewarded when they do, they will not change their ways of working. The pressures they experience need to be changed by the other stakeholders in the process.


Publishers:

Maintain excellent peer-review processes and identify processes for managing and shutting down predatory journals. There should be clear criteria for pre-prints, taking into consideration the reproducibility of data and allowing for provisional peer review.

Publishers to tighten up their policies on data, code and sample sharing and act when these policies are flouted. Ideally articles would be checked for reproducibility (at least at the data level) during the review process, but this will place a much greater demand on reviewers. More publishers could employ statisticians to check data analysis in articles, but this would increase publication costs.

There is also a need for better metadata standards to be developed in some fields. This is a multi-stakeholder initiative, and although it would be best developed by researchers, without support (and possibly a lead from larger bodies) it will be difficult to get going.

Everyone should have a role, with different approaches to emphasise quality over quantity. Review papers, which have dwindled because they are not REFable, should be re-emphasised (these should be in addition to high-quality outputs, not in place of results).


What policies or schemes could have a positive impact on academia’s approach to reproducible research?


ARMA members identified several initiatives that could have a positive impact:

Review the existing Open Access Concordat, bringing it into line with other recently revised concordats to ensure consistency and accountability.

A sector best practice document would be ideal. Much like the ARMA/UKRIO Ethics Guidance. This is perhaps something ARMA and UKRN could work together on?

Institutions could undertake in-house research audits (sampling) of research projects by a small team, including an external/lay person, with specific questions about methodology, research results so far, and an assessment of the project, including specific questions about how the study might be reproduced. Alternatively, a group of HEIs could form a joint team and sample from each of the HEIs. Audit results would be examined by the internal ethics and research management committees and issues of interest identified.

Better understanding of the history of discovery and new 'story arcs' disseminated and supported by funders and HEIs.

Funders should invest more in longer-term "grand challenges" research, where project terms could be 10 or more years; currently this type of funding seems to appear only in Centre funding grants. Funders should also provide funding for reproduction studies, to show that this work is valued, and funding that allows for testing of prior results in novel contexts or comparisons.

Allowing researchers to make a career in this area – there are many skilled scientists who don’t necessarily want to run their own group, but who would like to continue in the lab, and would be well-suited to this work. Currently these researchers are lost from research once they become too expensive to support on grant funding as members of other groups.

Improved organisational management and line management, with a focus on research integrity and reproducibility.

Institutions, funders and researchers embedding Open Science and Open Access into everyday practices; Responsible Research and Innovation training to be implemented and embedded into the research culture (this applies to both funders and research institutions/groups).

It was suggested that the Octopus publishing platform could be a welcome step to ensuring wider scrutiny from across the academic community on research and critiquing research design, methodology and analysis.

If the processes needed to enable reproducibility were required of all researchers, there would not be the perception that some researchers were being held to different standards (with the associated added time and resource requirements).

It was felt that commissioned review papers should be REFable.

Ultimately, most solutions will come down to funding, time and recognition. The whole of academic research needs slack in the system so that researchers can give proper time and consideration to activities beyond constantly pursuing their next novel publication, without harming their careers.






How could establishing a national committee on research integrity under UKRI impact the reproducibility crisis?

It was felt that the breadth of influence the national committee may have is unclear. Research is an international endeavour, and action taken at a national level will only achieve so much.

The committee could have an impact if it is active in developing the research integrity agenda, with reproducibility as a key initiative, enacted through significant engagement with institutions to understand the problem and create incremental resolutions. The committee needs to be active at ground level, not solely at the strategic level.

Members asked how the national committee would work with the UKRN.

A clear remit for the committee is needed. Academic colleagues also need to be made aware that such a committee exists and how it can support them.

It was felt the national committee could (with appropriate links in place) help the UK Government understand the issues, as the Government ultimately decides upon the level of funding.

It was also noted that integrity is a large, complex and scary subject; people who believe they are already acting with integrity do not engage, and so well-intentioned but insufficiently thorough work goes ahead. It was suggested that it would work better for funders to push for (a) output quality and (b) publishers allowing longer, more in-depth papers.


(September 2021)