Written evidence from The Health Statistics User Group (DTA 33)

Public Administration and Constitutional Affairs Committee

Data Transparency and Accountability: Covid 19

 

Introduction

The Health Statistics User Group (HSUG) was established to represent all users of health and health services statistics and to bring together users and producers of statistics. Our activities are aimed at maintaining and improving data quality, data access, and the use of health, health services and social care statistics. The group is independent, but, along with other groups representing users of statistics, we are affiliated to the Statistics User Forum. Our membership includes professionals working in a wide range of organisations including NHS organisations, central and local government, universities and non-governmental organisations.

This response has been prepared in collaboration with the Faculty of Public Health.

At the start of the Covid-19 pandemic, providers of official data and statistics faced multiple challenges. There was unprecedented demand for rapid and detailed information to monitor the pandemic and inform policy decisions. At the same time, statistics providers also had to continue their usual role of monitoring the economy and society in the face of new barriers to collecting the data needed, and in many cases rapidly provide the infrastructure needed to enable staff to work from home. Our response will focus on data relating to health and to health and social care.

Devolution is a key feature of data about health and social care in the UK as England, Wales and Scotland have had separate health administrations since 1919, followed by Northern Ireland when it was established in the 1920s. Data collection in each nation is split between government, the NHS, civil registration and a public health agency, making collaboration and communication crucial. During the pandemic, existing joint working between nations was further developed to bring together data and analyses to monitor the pandemic.

Response to questions

Question 1. Did Government have good enough data to make decisions in response to Coronavirus, and how quickly were Government able to gather new data?

 

1.1              Like those in most countries, the UK's governments had to adapt existing systems and introduce new ones to make decisions in response to coronavirus. Where data and systems already existed, new data collections and analyses were mostly developed quickly. ONS developed new analyses of mortality data, including linkage to the census to investigate ethnic and other differences.[1] It adapted its processes to provide new feeds of mortality data, in some cases daily, to government agencies dealing with national and local pandemic planning. It quickly added new questions to existing surveys[2] and set up its infection survey in record time, in collaboration with the University of Oxford, the University of Manchester, Public Health England and the Wellcome Trust.[3] Although this survey initially had a small sample, it has now expanded to become the most useful source of data about the prevalence of coronavirus in the population. Once the extent of deaths in care homes was recognised, ONS worked with the Care Quality Commission to improve and publish statistics based on CQC’s regulatory function.[4]

1.2              ONS and PHE have published regular data on COVID-19 cases and deaths by socio-economic group, geography and ethnicity during the course of the pandemic. Whilst these data have given a high-level overview of the emerging inequalities that compound existing inequalities, more depth is needed. For example, more information is needed on how delays in seeking and receiving health care have affected inequalities in excess mortality. Data on inequalities by ethnic group have become highly politicised, and polarisation of the debate impedes reaching consensus on meaningful action.

1.3              While the focus has been on inequalities in socio-economic status and ethnicity, there is a lack of data on other groups, such as LGBTQ+ people, people with justice sector involvement, people with serious mental health problems, Gypsy, Roma and Traveller communities, sex workers and people who sleep rough. These characteristics are not readily recorded in official statistics, and joined-up, creative approaches are needed to develop new research.

1.4              A key data gap has been the impact of COVID-19 on health care inequalities. There are very few publicly available data showing the impact of COVID-19 on inequalities in access to screening, elective care or primary care consultations. In the absence of such data, there is a risk that the NHS could inadvertently have increased inequalities during the response.

 

1.5              There are areas where existing data collection systems had been allowed to fall into decay, as with communicable disease notification. The systems which developed from the late nineteenth to mid-twentieth centuries were locally based; when the role of medical officer of health was abolished and communicable disease surveillance was moved into the NHS in the 1974 reorganisation, a top-down system evolved. Although public health was moved back into local government under the Health and Social Care Act 2012, it was with a much-reduced role which suffered further as a result of cuts in local authority expenditure.

 

1.6              Early monitoring of community outbreaks of COVID-19 was undermined by government guidance which, by not following the statutory notification systems, impeded the notification of suspected cases.[5] General practitioners, along with all other registered medical practitioners, have a duty to notify suspected cases of communicable diseases in their patients, but were in practice no longer able to do so when the coronavirus pandemic broke out, because people were told not to consult them if they had symptoms of COVID-19 and to contact NHS 111 instead.

 

1.7              In other areas, the need to create ad hoc data collection systems highlighted existing weaknesses in health data, particularly the lack of regular reporting from Trusts in anything approaching real time. Because ad hoc systems had to be created in the NHS, there were many data quality problems, especially in the earlier months. This not only made monitoring the situation difficult, but also contributed to a lack of trust in the data, especially as definitions and inclusion criteria changed over time, calling into question the validity of time trends. There have been multiple revisions to the separate methods used by NHS England and Public Health England for counting COVID-19 deaths, as well as revisions to the time series because of delays in data flows.

1.8              The country which has experienced the most problems is England, where the NHS and its data collection functions have become fragmented across multiple agencies. This was recognised in 2015 in a systemic review by the UK Statistics Authority, which pointed out that no single individual or organisation had clear leadership responsibility.[6] This review led to a number of meetings and initiatives, including a website to help users locate sources of data about health and health care.[7]

1.9              One issue this process did not address was the shortage of statistical capacity in the Department of Health and Social Care. This may have reflected both downsizing through voluntary redundancies and early retirement, and the movement of analysts into the new organisations (such as NHS England and Public Health England) formed as a result of fragmentation over several years. It reduced the Department's capacity to monitor the pandemic.

1.10              It will be important that the creation of the new National Institute for Health Protection does not result in a further loss of analytical expertise or capacity as the new organisation is established and staff move around. This is particularly important given that the latest case trends indicate that a high level of expertise will be needed for the foreseeable future. It will also be vital that the change does not diminish capacity for health promotion work.

1.11              Data about testing for COVID-19 have expanded as the testing regimes and the scope of testing have grown.[8] It has sometimes been difficult to discern how the data have been collected and what impact changes in methods over time have had, notably whether data were compiled according to when the swab was taken, when the result was reported or when the test kit was posted to the person being tested. A key problem is the failure to differentiate and publish testing statistics by purpose. Crucially, data about people tested because they have symptoms should be separated from data about screening and testing of healthy, symptomless people.[9] Testing standards vary by laboratory and over time, and it is not known whether government guidance is consistently applied.

1.12              Although the DHSC has improved its presentation of data about testing and tracing in England, other problems remain, for example changes over time in the criteria for including positive tests in the counts.[10] Other key data items, such as the reasons for the tests or whether the people tested showed symptoms of COVID-19, are not presented.[11],[12],[13] England is not currently providing data on the outcomes of advice to cases and contacts to self-isolate. Given the wide variation in testing by region and over time, according to perceived prevalence, it is essential to publish positive and negative results, for both tests and people, by age, by area and over time.

1.13              As the government’s unusual approach to service procurement has continued, testing and tracing activities have become fragmented across a number of public and private sector organisations, some of which are likely to lack the input of statisticians, epidemiologists and other key experts. Data from these activities are limited, inconsistent and of unknown quality.

Question 2. Was data for decision making sufficiently joined up across Departments?

2.1              Great efforts were made to share data and join up the use of evidence across government. The Secretary of State for Health’s ‘COPI direction’, which gave NHS bodies time-limited exceptional authorisation for data sharing under Section 251 of the NHS Act 2006,[14] was a positive factor in facilitating data sharing between departments and enabling external researchers to be involved in analyses. In many cases, creative but still appropriate use was made of legal gateways and procedures to enable data sharing that went beyond previous practice. However, some NHS bodies failed to operate their information governance processes with the urgency the situation demanded. Overall, while very positive results have been possible, the obstacles that had to be overcome highlight the need for a single, consistent and efficient framework for information governance across the health sector.

 

Question 3. Was relevant data disseminated to key decision-makers in: Central and Local Government; other public services (like schools); businesses; and interested members of the public?

 

3.1              During the first wave, details of people testing positive in their areas were not shared with local government public health teams. For example, it was not until Leicester went into local lockdown that these teams were given access to postcode data for positive cases, and it was some time later still that individual-level data were shared. This was a major failing in data sharing and severely undermined local teams’ ability to reduce transmission. As time has progressed, large improvements in data sharing have been observed.

 

3.2              The flow of data to local authorities has not been accompanied by an increase in public health analytical capacity, which had already become limited in many areas following cuts in local government funding. This meant that some local teams received increasing amounts of information but did not have the staffing or technical expertise to make optimum use of it. The gradual breakdown of public health analysis networks over the past decade and confusion over the future of PHE’s Local Knowledge and Intelligence Teams mean that best practice is not being shared and local authorities are having to make sense of a mass of new datasets by themselves with minimal support. PHE is willing but not always able to help because of the pressures it is under.

 

3.3              Extensive amounts of government guidance for these sectors have been published, but there have been some major challenges with the guidance. Most significantly, different pieces of guidance have often contradicted one another, and it has been unclear to what extent they have been informed by data-based evidence. In some ways this seems inevitable when so much guidance is issued from so many departments so quickly. Local government teams have received many queries from businesses, schools and members of the public asking exactly which part of ’the science’ supports the latest rule, as people struggle to apply population-level interventions to their individual lives. Lengthy guidance has on occasion been released with little notice, notably the guidance on the re-opening of schools, which was published on the Friday evening ahead of implementation the following week.

 

3.4              As demands have risen and agencies have become more stretched, some local teams have found that their ability to respond to individual members of the public has been lost. 

 

3.5              Finally, there have been challenges in feeding problems back up the chain from local to central government. It was not until later in the pandemic response that DHSC regional representatives were available to communicate problems effectively and to effect positive change in the national response. This was notably problematic during centrally determined organisational changes, such as the establishment of the Joint Biosecurity Centre. There was a lack of consultation about the need for a new body, and little communication or explanation of its role and composition, leaving local teams in the dark in a way that threatened to undermine their interactions with local stakeholders.

 

Question 4. Were key decisions (such as the “lock downs”) underpinned by good data and was data-led decision-making timely, clear and transparently presented to the public?

 

4.1              It is unclear how far decision-making was led by the data and science, as opposed to political considerations. Government has seemed slow and inconsistent in the way it has responded to scientific advice. Presentation of data to the public has changed constantly and, to many people, gives an impression of being politically influenced.

 

4.2              Data have been published at many different levels of disaggregation ranging from large areas such as regions and counties down to small local areas below ward level. These give different messages. Data for small local areas show wide differences but are unstable over time while data for large areas mask local differences and may lead to restrictions in areas with low levels of infection.

 

4.3              Initially, neither the identities of the members of SAGE nor details of the ‘science’ on which its recommendations were based were publicly available. This was not good for either scientific debate or public trust.

 

Question 5. Was data shared across the devolved administrations and local authorities to enable mutually beneficial decision making?

 

5.1              Good efforts were made to share data across the four UK nations at national level, for example in reporting death numbers. Sharing of information from national to local level was slow, however, as noted above.

 

5.2              There has also been some tension between local authorities and city regions in terms of agreeing whose ‘version of the truth’ to accept. There has been a lack of clear interpretation that can be consistently agreed across all levels of government.

 

Question 6. Is the public able to comprehend the data published during the pandemic? Is there sufficient understanding among journalists and parliamentarians to enable them to present and interpret data accurately, and ask informed questions of Government? What could be done to improve understanding and who could take responsibility for this?

 

6.1              Health data are often complex and nuanced and can be difficult to communicate accurately in a crowded information arena. The situation has been greatly complicated by the unfortunate growth of myths and conspiracy theories, fuelling mistrust in scientific and government messages. This trend includes competing interpretations of the data and its meaning, based on ideological viewpoints and wider beliefs, which make understanding even more difficult. Public trust and understanding can be maximised by clear, transparent and impartial communication, such as from ONS, but this is not enough in itself to overcome the wider crisis of partisanship and distrust in public discourse.

 

6.2              Publication of data at the Downing Street daily press conferences did not follow Government Statistical Service (GSS) practice, in which press conferences are led by statisticians whose role is to explain the data; these may be followed by pronouncements from officials and ministers about how they will respond. Journalists at the press conferences were allowed only one question each, which meant that the data were not sufficiently probed. While the circumstances are of course exceptional, communication could probably have been handled more effectively, and more care could have been taken to follow the principles of the Code of Practice for Statistics.

 

6.3              The government should have clear, publicly available plans covering both the day-to-day 'operational data', including how they are produced and put into the public domain, and the subsequent analysis and research that helps us understand the underlying issues associated with COVID-19; these could be presented in a forward plan. It is strongly recommended that the Government Statistical Service is involved in setting this out, accepting that there needs to be flexibility to cope with changing circumstances.

 

6.4              Some specific areas of communication have led to bad practice and loss of trust. For example, confusion over the counting of tests led to a letter from Sir David Norgrove, chair of the UK Statistics Authority, to the Health Secretary in June, in which he said: “The aim seems to be to show the largest possible number of tests, even at the expense of understanding.”[15]

Question 7. Does the Government have a good enough understanding of data security, and do the public have confidence in the Government’s data handling?

 

7.1              There has been a great effort by responsible bodies to ensure appropriate information governance, and we are not aware of any inappropriate data sharing or breaches of confidentiality. However, the government’s approach to procurement during the pandemic has been opaque and not conducive to public trust and transparency regarding contractors, and this increases the risk that government data handling will be perceived as insecure or risky. Government bodies and contractors have collected exceptionally large volumes of data, including sensitive health data and other big data, for purposes such as modelling infection-related behaviour, and active steps are needed to reassure the public that this body of data will continue to be handled sensitively and will not be used for commercial or party political gain.

 

7.2              The ‘COPI direction’ has been widely welcomed as necessary and proportionate. It appears that good practice in agreeing and monitoring data sharing has generally been followed, but there may be challenges in bringing the special arrangements to an end and ensuring that all ‘legacy’ data sharing and holdings from the period remain lawful.

 

Question 8. How will the change in responsibility for Government data impact future decision making?

 

8.1              It is not clear how changes in government responsibility for data will play out either practically or in terms of public perception. There is a risk that in the current political climate, the move from DCMS to Cabinet Office could raise public worries of greater political control, and even be seen to reduce the likelihood that data will be used to hold the government to account. Lack of transparency around the use of contractors and similar issues may also raise fears of the misuse of data by government or commercial interests.

 

8.2              On the other hand, greater cross-government focus on the importance of data to inform policy is welcome. To add real value, there must be emphasis on practical improvements which a change in departmental roles will not bring about in itself. These include a strong focus on common data standards and interoperability across the public sector, and adoption of efficient and easily scalable technologies for data processing and sharing. However, even more important is a coherent policy approach and governance framework for data sharing across the public sector, including the NHS and local government, which may require legislation. This is the only way to overcome a long history of difficulties and delays to beneficial data use caused by differing governance regimes and constant re-interpretation of legal and policy barriers on access to data.

 

8.3              The Faculty and HSUG would be delighted to be further involved in these discussions and to offer positive input based on the collective experience and subject expertise of our members.

 

 

November 2020

 

 


[1] Office for National Statistics. Updating ethnic contrasts in deaths involving coronavirus COVID-19: England and Wales, deaths occurring 2 March to 28 July 2020. https://www.ons.gov.uk/releases/explainingethnicbackgroundcontrastsindeathsinvolvingcovid19england2ndmarchto3rdjuly2020

[2] Office for National Statistics. Coronavirus and the social impacts on Great Britain: 30 October 2020. https://www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/healthandwellbeing/bulletins/coronavirusandthesocialimpactsongreatbritain/latest

[3] Office for National Statistics. Coronavirus (COVID-19) Infection Survey, UK: 30 October 2020. https://www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/conditionsanddiseases/bulletins/coronaviruscovid19infectionsurveypilot/latest (Accessed 30 October 2020)

[4] Office for National Statistics. Impact of coronavirus in care homes in England: 26 May to 19 June 2020. https://www.ons.gov.uk/peoplepopulationandcommunity/healthandsocialcare/conditionsanddiseases/articles/impactofcoronavirusincarehomesinenglandvivaldi/26mayto19june2020

[5] Roderick P, Macfarlane A, Pollock AM. Analysis: Getting back on track: control of covid-19 outbreaks in the community. BMJ 2020; 369: m2484. https://www.bmj.com/content/bmj/369/bmj.m2484.full.pdf

[6] UK Statistics Authority. Health and Social Care Statistics in England: update on Systemic Review. https://uksa.statisticsauthority.gov.uk/publication/health-and-social-care-statistics-in-england-update-on-systemic-review/

[7] Office for Statistics Regulation. A healthy improvement: helping users find the stats they need on health. https://osr.statisticsauthority.gov.uk/a-healthy-improvement-helping-users-find-the-stats-they-need-on-health/

[8] Initially, the data reported applied to tests carried out in NHS or Public Health England laboratories and submitted to PHE through its Second Generation Surveillance System (SGSS), described as Pillar 1. In England, the numbers of tests were subsequently expanded to include Pillar 2: tests commissioned from a plethora of contractors and undertaken by commercial and other partners, rather than expanding the existing system. More generally, systems for coronavirus testing and tracing have been set up in diverse ways at national and local levels.

[9] Deeks JJ, Brookes AJ, Pollock AM. Operation Moonshot proposals are scientifically unsound. https://www.bmj.com/content/370/bmj.m3699

[10] Initially all positive tests were counted, then people with previous positive tests were excluded. More recently, positive tests were included if the person’s previous positive test had been more than a week earlier.

[11] Raffle AR, Pollock AM, Harding-Edgar L. Covid-19 mass testing programmes. BMJ 2020; 370 doi: https://doi.org/10.1136/bmj.m3262 (Published 20 August 2020)

[12] Harding-Edgar L, McCartney M, Pollock AM. Test and trace strategy has overlooked importance of clinical input, clinical oversight and integration. Journal of the Royal Society of Medicine 28 October 2020

[13] In the absence of data on the purpose of testing, cycle thresholds, confirmatory testing and the exclusion of previous positive tests, testing on the current scale of capacity of 300,000 tests a day will overload and undermine the efficiency and effectiveness of the contact tracing system. Testing on this scale will generate false positive tests and false positive cases among people who are symptomatic, presymptomatic or asymptomatic and among people with previous infection who are no longer infectious. This will result in cases and contacts isolating unnecessarily when the result is a false positive, and in people with symptoms not isolating when the result is a false negative. Moreover, the failure to record the purpose of testing and link it to studies of contact tracing and GP records is a missed opportunity to answer questions about people who are asymptomatic and their risks of transmission.
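For illustration only, and using an assumed specificity figure rather than one drawn from this submission: if test specificity were 99.5% and most of the 300,000 people tested on a given day were not infected, the expected number of false positive results would be roughly

300,000 × (1 − 0.995) = 1,500 per day,

so even a small relative error rate translates into a substantial absolute number of people, and their contacts, being asked to isolate unnecessarily.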

[14] Coronavirus (COVID-19): notification to organisations to share information. https://www.gov.uk/government/publications/coronavirus-covid-19-notification-of-data-controllers-to-share-information

[15] Norgrove D. Response to Matt Hancock https://uksa.statisticsauthority.gov.uk/correspondence/sir-david-norgrove-response-to-matt-hancock-regarding-the-governments-covid-19-testing-data/