Supplementary written evidence submitted by Elizabeth Denham, Information Commissioner

 

The future of Internet Harms Regulation

Paper for the Digital, Culture, Media and Sport Committee

 

I welcome the opportunity to submit my overarching views on the future of internet regulation to the Digital, Culture, Media and Sport Committee (the Committee). This is an important and complex debate, and the Committee’s inquiry into Disinformation and Fake News is a key contribution to it. As a regulator already engaged in the area of internet regulation - with significant experience of regulating the tech giants, of conducting standards and accountability regulation, and of exercising extra-territorial powers - I am keen to ensure the Information Commissioner’s Office (ICO) plays its part in this debate. I have sought to do this through engagement with government and civil society and through oral and written evidence to this Committee, the Science and Technology Committee, and the Lords Communications Committee.

It is important to acknowledge the depth and breadth of public concern around these issues. Our Information Rights Strategic Plan makes our first Strategic Goal “To increase the public's trust and confidence in how data is used and made available.” Currently, people want to use the internet - and some feel they have no choice but to do so - but their acquiescence doesn’t represent trust. Our most recent Annual Track survey showed only 15% of UK adults have a high level of trust and confidence in how social media platforms store and use their personal data. The more trust is eroded, the greater the anxiety people will feel about going online, resulting in detriment to those individuals and a gradual disengagement from the benefits the internet has to offer.

 

What are the main activities in relation to content and interaction online and how are they currently regulated?

It is ultimately a matter for government and Parliament to decide whether there is a need for additional regulation in the area of internet harms, whether this requires a new regulator to be established and, if so, what its shape and remit should be. But before doing this it is important to determine what the existing harms are, what activities on the internet are already regulated, and therefore where the regulatory gaps lie.

The various harms lie along a very wide spectrum, from material that might cause upset or offence to an individual - through child development and protection issues or misinformation/disinformation - to extremist or illegal content. Different experts and interest groups prioritise some of these harms over others, depending on their own experience and priorities. Joint research carried out by Ofcom and the ICO suggested that the main areas of concern for the public when online were:

Dealing with these in turn:

 

A range of organisations – including tech companies, the voluntary sector, academic institutions and think tanks – are considering different aspects of this debate, which government and Parliament will want to consider in the round when developing policy in this area.

At this point it is helpful to outline some of the areas where there is already a ‘clear’ legal regulatory scope:

In some cases, the public is not aware of what is (and is not) currently regulated. Ofcom has highlighted that some people mistakenly assume it already regulates all online ‘broadcasts’ via platforms such as YouTube. Conversely, the Law Commission notes that when threats are made online they will often be covered by existing criminal law. Our enforcement activities resulting from the investigations covered in our Democracy Disrupted report, and our subsequent update to you, addressed several situations where personal data collected online for one purpose was then misused for another. This was a breach of the Data Protection Act 1998 and would remain a breach under the Data Protection Act 2018.

There appears to be consensus that any co-ordinated approach to internet harms must include activities to improve digital literacy, and this should include making the public more aware of their existing rights and protections. Improved digital literacy will go some way to addressing public anxiety without the need for additional regulation or regulators. There are already a number of initiatives in this area, including at curriculum level in schools, but they are not yet coordinated or joined up. There is also an important role for internet companies to play here.

However, there are still a number of internet harms which do not appear to be fully addressed by existing law and regulation. These include:

 

The ICO’s experience of regulating activities online

The ICO has responsibility for regulating the use of personal data. This horizontal regulatory remit means we already regulate areas of activity across the internet. Where the regulation of the internet involves personal data, we have a role to play. We have well-established engagement with tech giants such as Facebook (including WhatsApp and Instagram), Google and Twitter, and have taken action against Facebook, WhatsApp and Google in recent years. The new enhanced powers and sanctions in the GDPR and the Data Protection Act 2018, including extra-territorial powers, mean that we will be able to continue to hold the tech firms to account for how they handle citizens’ data online.

We also have relevant experience in relation to Google (and other search engines) and the ‘Right to be Forgotten’ principle. Under this mechanism, individuals make the initial request to Google for search engine results to be delisted. If the individual is not satisfied with the outcome of this request, they can complain to the ICO for adjudication, which involves a careful balance between the rights to freedom of expression and privacy. We have ruled on 1,676 cases since 2014. The ICO was also key in leading work to develop and agree detailed criteria and guidelines for search engine delisting at an EU level.

Other areas of relevance include the requirement under the Data Protection Act 2018 for us to produce, and have oversight of, a statutory age-appropriate design code of practice for the protection of children online. The code, which is understood to be a world first, is currently out for consultation. In addition, we are developing a framework for explainable artificial intelligence and building an approach to auditing algorithms.

Of course, our work in adjacent online activities already brings us into contact with other regulators and government bodies, and I am pleased with our level of co-ordination. We already work very closely with Ofcom, as well as organisations including the British Board of Film Classification (BBFC), the Electoral Commission, the ASA and the National Cyber Security Centre (NCSC), whenever the scope of our work has overlapped.

 

 

Principles of future internet regulation

The breadth and complexity of the challenges facing us means that any solution will be multifaceted, with different approaches best suited to addressing different harms. In developing an approach, I believe it would be helpful to draft a series of guiding principles that can support the design of any additional law and regulation. These principles could include agreeing that any future regime should:

 

Of course, this list cannot be seen as complete or conclusive at this time, but it may serve as a contribution to the current debate, and perhaps help identify commonality between the different viewpoints and propositions being discussed.

I hope these comments are useful to the Committee. I remain at the Committee’s disposal should you have further questions regarding this important matter.

Elizabeth Denham

Information Commissioner

November 2018