Royal Statistical Society – written evidence (DAD0084)



1.1              The Royal Statistical Society (RSS) is a professional body for statisticians and data scientists with over 10,000 members around the world. Our strategic and charitable goals focus on the use of statistics and data in the public interest, improving statistical literacy, developing professional skills, and strengthening statistics and data in academia and research.

1.2              The RSS has answered five of the fourteen questions detailed in the Select Committee’s call for evidence, due to their alignment with the expertise and interests of our members. Our answers cover four of the six areas outlined by the Committee: transparency in political discourse; the effects of digital technologies on public discourse; misinformation; and how technology can facilitate democracy.

How have the design of algorithms used by social media platforms shaped democratic debate? To what extent should there be greater accountability for the design of these algorithms?

2.1              The RSS will answer the second part of this question, owing to our expertise in algorithmic accountability, as set out in our submission earlier this year to the Centre for Data Ethics and Innovation's call for evidence on bias in algorithmic decision-making, and in other previous submissions. Algorithms are designed to discriminate, in the sense of distinguishing between cases; the difficulty is that bias is not well defined. There is a potential role for counterfactuals[1] in defining and mitigating algorithmic bias. This would entail using simulation to test whether inputs that differ only in a sensitive attribute produce a different outcome; if they do, this points to an inherent bias within the algorithm. However, there are unlikely to be comprehensive solutions. For example, oversight, and therefore governance, of algorithms that use the outputs from other algorithms as their inputs, and which could generate damaging feedback loops, is a challenge. Facebook and other social media companies are now conveyors of information (and misinformation) that affect our democracies. They need to ensure that their algorithms do not promote misinformation as clickbait. It should be in the interests of social media platforms to do more to ensure that reliable content fuels their users' interactions. Much good could be done if social media platforms were obliged to put just 1% of their profits into an independent trust to fund quality media, especially local media, and fact-checkers.
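The counterfactual test described above can be sketched in code. The following is an illustrative toy only: the scoring rule, feature names and threshold are hypothetical assumptions for the sake of the example, not a method the RSS prescribes.

```python
# A minimal sketch of a counterfactual bias check. All names, the toy
# scoring rule, and the threshold are illustrative assumptions.

def score(applicant):
    """A toy decision algorithm that approves on income but also,
    improperly, penalises one postcode area (a proxy for a protected
    group) - the hidden bias we want to detect."""
    points = applicant["income"] / 1000
    if applicant["postcode_area"] == "A":
        points -= 5
    return points >= 30


def counterfactual_flip_rate(applicants, attribute, values):
    """Swap the sensitive attribute to its counterfactual value and
    count how often the decision changes; any change shows that the
    attribute itself is influencing the outcome."""
    a_val, b_val = values
    flips = 0
    for person in applicants:
        counterfactual = dict(person)  # shallow copy, then flip one field
        counterfactual[attribute] = b_val if person[attribute] == a_val else a_val
        if score(person) != score(counterfactual):
            flips += 1
    return flips / len(applicants)


applicants = [
    {"income": 32000, "postcode_area": "A"},
    {"income": 36000, "postcode_area": "A"},
    {"income": 31000, "postcode_area": "B"},
    {"income": 40000, "postcode_area": "B"},
]
rate = counterfactual_flip_rate(applicants, "postcode_area", ("A", "B"))
print(rate)  # 0.5: half the decisions change, signalling inherent bias
```

The design choice here is that the algorithm under test is treated as a black box: the check needs only the ability to re-run it on perturbed inputs, which is why counterfactual auditing is attractive for external oversight of platforms that will not publish their algorithms.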

2.2              Every part of the process, from the data used to train the algorithm, to the design of the algorithm, to the way humans respond to the algorithmic output, should be tested thoroughly. This would ensure greater accountability of algorithms and protect the interests of the public and of democratic debate. A highly explainable system with poor accuracy is at least as capable of creating an ethical breach as one with poor explainability but high accuracy. Both Google and the Department for Digital, Culture, Media and Sport (with a contribution from the Government Digital Service) pointed out, in their written evidence to the 2017 UK parliamentary inquiry into algorithms in decision-making, that bias exists in many non-algorithmic decision-making processes, and that applying algorithms to these processes can reveal and reduce bias.

2.3              While Facebook posts have been shown to increase voter turnout (see Question 3), the algorithms surrounding its 'like' buttons also influence democratic debate. Facebook has to comply with the EU's General Data Protection Regulation (GDPR); however, it tracks internet users, even those who are not Facebook users, via the widespread use of 'like' buttons. 'Like' buttons are social sharing buttons installed on many internet pages to allow easy sharing of content. These algorithms can therefore amplify debate-influencing material preferentially, according to the number of likes a post receives. Through their use, Facebook gains data on users who may never have given consent. This requires investigation in the public interest.

2.4              The credit-scoring industry, for example, has become increasingly regulated to reduce potential harm, whereas other industries, including online social media platforms, are much more lightly regulated and might remain so. We recommend a review of the credit-scoring industry, as it is relatively mature in its use of algorithms and as credit-scorers appear to be able to explain their algorithms to their regulators. Such a review could consider what lessons can be drawn for other fields, including social media platforms.

2.5              GDPR, introduced in 2018, has given the Information Commissioner's Office (ICO) stronger powers to deal with data breaches. However, regulators such as the ICO still need strengthening. To create and implement the best policies, regulators must be able to pay competitive salaries to recruit technical talent, or risk losing it to the very social media giants that need regulating. Examination of stronger regulatory approaches should lead to some broadly transferable lessons.

What role should every stage of education play in helping to create a healthy, active, digitally literate democracy?

3.1              Our democracy relies on the quality and trustworthiness of data in the public domain. As the new RSS 'Data Manifesto' states, to strengthen democracy and trustworthiness we need to involve the public in shaping the conversation about how data is used. The critical question is therefore how to improve data skills across the UK: data literacy forms the basis of a digitally literate democracy, and the most significant lever for it is the education system. The manifesto views data as a critical driver of prosperity. As part of this vision, to ensure that everybody can participate effectively in an increasingly data-driven society, everyone needs the skills to engage in an 'informed data conversation'. The RSS calls for basic data handling and quantitative skills to be integral parts of the taught curriculum, and recommends that the Smith Review of Post-16 Mathematics in England be implemented in full by the Department for Education.

3.2              Following on from the Smith Review, the RSS encourages recognition of the need for 'A People Pipeline', as identified in The Mathematical Sciences People Pipeline report from the Council for the Mathematical Sciences. In the RSS's view, developing this pipeline requires an increasing focus on statistics and data within teaching practice and assessment across the curriculum. This is essential to meet the needs of employers, the economy and our democracy. It is imperative that we develop and train the next generation of qualified senior scientists who, as argued above, should hold algorithms accountable across every digital and data-driven platform if democracy is to prosper. It also needs to be recognised that the education system is a continuous pipeline, from the early years to postgraduate training, which leads to life-long development and to more of the data scientists we need.

3.3              Data science is a relatively young profession, with few professional standards. Such standards need to be developed through strong ethical training embedded in data science courses, so that issues such as bias in the data used to train algorithms can be pre-empted and avoided. Professional bodies should also take a lead role in developing standards and, indeed, the RSS's Data Science Section has recently published A Guide for Ethical Data Science in partnership with the Institute and Faculty of Actuaries. Created with practitioners in mind, it aims to provide practical support to data scientists on ethical practice. Structured around five core ethical themes, the guide provides examples of common ethical challenges in the field and how they can be addressed. The RSS believes this guide should form the basis of professional standards, so that data scientists can apply and maintain professional competence and thereby preserve or increase the trustworthiness of how the public's data is used. The guide ends with a detailed implementation checklist setting out the steps required to implement safeguards that would ensure ethical data practice. Alongside the creation of professional standards for data scientists, such a checklist should be adopted to ensure good practice among those working within social media platforms. The RSS would welcome the opportunity to elaborate on this and to answer any questions the Committee may have.

3.4              One way in which data and digital skills can be taught to those outside the current education system is through employer training of non-specialists. This would enhance employees' capabilities in a workforce which, in turn, will have to adapt to meet the needs of an increasingly data-driven economy and democracy. At the RSS, we have developed experience of working with employers and, more specifically, with many not-for-profit organisations. Through our 'Statisticians for Society' pro bono programme we have linked over 50 non-profit organisations to statisticians who volunteer their time free of charge. The programme has made us particularly aware of the generally low levels of data skills in the not-for-profit sector. Given the influence this sector has on policy-making, it is crucial that those in such organisations receive appropriate education and are equipped with the data literacy necessary for a data-driven democracy. The RSS recognises the importance of such training for all and delivers public training courses to equip non-specialists with foundation skills in topics such as data visualisation, presenting data, and understanding and analysing data.

What effect does online targeted advertising have on the political process, and what effects could it have in the future? Should there be additional regulation of political advertising?

4.1              Online targeted advertising can have a strong influence on behaviour. For example, Lewis and Reiley reported a 5% increase in purchases from a randomised controlled trial of online targeted advertising involving 1.6 million Yahoo users. Online targeting benefits advertisers, and therefore also political parties, which will align themselves with the same tactics. This study highlights how targeted advertising can heavily influence the public, with the effect growing as more people engage with the digital ecosystem.

4.2              The effects of targeted advertising outlined above strengthen the evidence that advertisements can harm the outcome of democratic processes through avenues outside election rules. A Facebook study reported that a single day of advertising increased voting participation by 0.39%. This implies that targeted political advertising, so-called 'dark adverts', could have a real impact on political outcomes. The RSS believes that democracy should not be done in the dark, and we would therefore want to see transparency of political advertising on social media and internet platforms.

4.3              To develop a better understanding of the real and potential pros and cons of online targeting, and to work towards effective governance, we need to move beyond thinking about single organisations of one kind or another to a whole-system perspective on online targeting. Within the EU, an organisation must appoint a Data Protection Officer (DPO) if it is engaged in activities that “require regular and systematic monitoring of data subjects on a large scale” (GDPR, Article 37.1(b)). This includes online targeting, but only if it is “systematic” and on a “large scale”: for example, a company buying and selling data for online targeting. An organisation that simply uses a demand-side platform to perform its targeting would not have to appoint a DPO and would instead be likely to rely on ad hoc procedures to monitor its online targeting practice. To prevent possible negative effects from 'dark adverts', the RSS believes that regulation should require all social media platforms and digital companies to employ a DPO alongside a senior data scientist. This would make the data collected, and the algorithms used, more transparent and would facilitate external inspection if needed.



What might be the best ways of reducing the effects of misinformation on social media platforms?

5.1              Currently, 'fake news' dominates the data debate, as independent public bodies and institutions react to the credible threat it poses to the integrity of our democracy. Realistically, misinformation, which has been around for as long as we have had data, cannot be eliminated; instead, we should focus on building resilience against it. The RSS argues for three urgent actions to protect the integrity of our democracy and to mitigate the effects of misinformation on social media platforms:

How could the Government better support the positive work of civil society organisations using technology to facilitate engagement with democratic processes?

6.1              The Government needs to get better at sharing administrative data with researchers through initiatives such as Administrative Data Research UK (ADR UK). As outlined in the RSS 'Data Manifesto', civil society organisations would benefit if the Government ensured that researchers have the data needed to inform their policy-influencing work. One of the most significant blockages within Government departments is currently around data-sharing and, in particular, the ability to share data with academia. Despite the passage of the Digital Economy Act in 2017, several issues surrounding the cooperative sharing and use of data remain, as previously identified in the 2016 Independent Review of UK Economic Statistics by Professor Sir Charles Bean. Section 4.38 of his review criticised the behaviours and capabilities of the Office for National Statistics (ONS), stating that too many of those within the ONS were 'operating in silos'. To achieve alignment in Government around data, the RSS would like to see the full implementation of the relevant recommendations in this report. An update should then be provided to all those concerned (e.g. the RSS and other bodies), confirming progress on the issues identified in the report and the steps still required to address them fully.

6.2              Leadership is required within government departments to see the value of data linkage and to invest in it. Proportionate and duly justified access to a wide range of administrative data is imperative for producing conclusive statistics, which can in turn support reasoned policy-making and, to cite a topical example, fact-based manifesto-writing. Opening up channels of data would therefore improve the work of civil society organisations and could better inform the Government and Opposition parties, giving policy choices a firmer evidential foundation.


Bean, C. (2016) ‘Independent Review of UK Economic Statistics’. London: HM Treasury. Available at: [Accessed: August 2019].

Bond, R., Fariss, C., Jones, J., Kramer, A., Marlow, C., Settle, J. and Fowler, J. (2012). ‘A 61-Million-Person Experiment in Social Influence and Political Mobilization’. London: Nature. Available at: [Accessed: September 2019].

Cabinet Office. (2019). ‘Protecting the Debate: Intimidation, Influence and Information’. London: HM Government. Available at: [Accessed: September 2019].

Department for Digital, Culture, Media and Sport. (2017). ‘Supplementary written evidence submitted by the Department for Digital, Culture, Media and Sport (ADM0015)’. London: House of Commons. Available at: [Accessed: September 2019].

Full Fact. (2018). ‘Tackling misinformation in an open society’. London: Full Fact. Available at: [Accessed: September 2019].

Google. (2017). ‘Written evidence submitted by Google (ADM0016)’. London: House of Commons. Available at: [Accessed: September 2019].

Lewis, R. and Reiley, D. (2014). ‘Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!’. Quantitative Marketing and Economics (QME). New York: Springer US. Available at: [Accessed: September 2019].

Royal Statistical Society. (2019). ‘A Guide for Ethical Data Science’. London: Royal Statistical Society. Available at: [Accessed: October 2019].

Royal Statistical Society. (2019). ‘The Data Manifesto’. London: Royal Statistical Society. Available at: [Accessed: September 2019].

Royal Statistical Society and Institute and Faculty of Actuaries. (2019). ‘The Centre for Data Ethics and Innovation calls for evidence on online targeting and bias in algorithmic decision making’. London: Royal Statistical Society. Available at: [Accessed: September 2019].

Royal Statistical Society. (2017). ‘The use of algorithms in decision making: RSS evidence to the House of Commons Science and Technology Select Committee inquiry’. London: Royal Statistical Society. Available at: [Accessed: September 2019].

Smith, A. (2017). ‘Report of Professor Sir Adrian Smith’s review of post-16 mathematics’. London: Department for Education. Available at: [Accessed: August 2019].

The Council for the Mathematical Sciences. (2015). ‘The Mathematical Sciences People Pipeline’. Newcastle upon Tyne: The Council for the Mathematical Sciences. Available at: [Accessed: September 2019].


[1] Counterfactuals are defined as ‘the quality of decisions that would have been taken if the algorithmic technology were not in place’. See RSS evidence to 2017 Science and Technology inquiry on the use of algorithms in decision-making.