
Make tech companies liable for use of "harmful and misleading material" on their sites

29 July 2018

In its first interim report in the Disinformation and 'fake news' inquiry, published today, the Digital, Culture, Media and Sport Committee warns that we are facing a democratic crisis founded on the manipulation of personal data and the targeting of pernicious views at users, particularly during elections and referenda. The Committee outlines a series of recommendations to tackle the problem of disinformation and fake news facing the whole world.

Damian Collins MP, Chair of the Committee, said:

"We are facing nothing less than a crisis in our democracy – based on the systematic manipulation of data to support the relentless targeting of citizens, without their consent, by campaigns of disinformation and messages of hate.

In this inquiry we have pulled back the curtain on the secretive world of the tech giants, which have acted irresponsibly with the vast quantities of data they collect from their users. Despite concerns being raised, companies like Facebook made it easy for developers to scrape user data and to deploy it in other campaigns without their knowledge or consent. Throughout our inquiry these companies have tried to frustrate scrutiny and obfuscated in their answers. The light of transparency must be allowed to shine on their operations and they must be made responsible, and liable, for the way in which harmful and misleading content is shared on their sites.

We heard evidence of coordinated campaigns by Russian agencies to influence how people vote in elections around the world. This includes running adverts through Facebook during elections in other countries, in breach of those countries' laws. Facebook failed to spot this at the time, and it was only discovered after repeated requests were made for them to look for evidence of this activity. Users were unaware that they were being targeted by political adverts from Russia, because the adverts were made to look like they came from the users' own country, and there was no information available at all about the true identity of the advertiser. I believe what we have discovered so far is the tip of the iceberg. There needs to be far greater analysis done to expose the way advertising and fake accounts are being used on social media to target people with disinformation during election periods. The ever-increasing sophistication of these campaigns, which will soon be helped by developments in augmented reality technology, makes this an urgent necessity.

Data crimes are real crimes, with real victims. This is a watershed moment in terms of people realising they themselves are the product, not just the user of a free service. Their rights over their data must be protected.

The first steps in tackling disinformation and fake news are to identify the scale of the problem and the areas where immediate action is required. In this interim report we have set out a series of recommendations, including: making the tech companies take greater responsibility for misleading and harmful content on their sites; providing greater transparency for users on the origin of content that has been presented to them; raising funding from the tech sector to provide more media literacy training in schools; and calling for an international coalition to act against campaigns of disinformation from Russian agencies and their networks, whose purpose is to disrupt our democracy."

Interim recommendations

The Committee sets out a series of major reforms to begin to tackle this issue, and invites responses to its proposals (a complete list of interim recommendations is available in the report):

Make tech companies responsible and liable

We recommend that a new category of tech company be formulated, which tightens tech companies' liabilities, and which is not necessarily either a 'platform' or a 'publisher'. We anticipate that the Government will put forward these proposals in a White Paper later this year.

This process should establish clear legal liability for the tech companies to act against harmful and illegal content on their platforms. Tech companies are not passive platforms on which users input content; they reward what is most engaging, because engagement is part of their business model and their growth strategy. They have profited greatly by using this model. This manipulation of the sites by tech companies must be made more transparent.

Just as the finances of companies are audited and scrutinised, the same type of auditing and scrutinising should be carried out on the non-financial aspects of technology companies, including their security mechanisms and algorithms, to ensure they are operating responsibly.

Impose a levy on tech companies to fund education and the Information Commissioner's Office (ICO)

We recommend that the Government put forward proposals in its forthcoming White Paper for an educational levy to be raised by social media companies, to finance a comprehensive media educational framework.

We suggest there could be a levy on tech companies operating in the UK, to help pay for the expanded work of the ICO, similar to the way in which the banking sector pays for the upkeep of the Financial Conduct Authority.

Change the rules on political campaigning

There should be a public register for political advertising, requiring all political advertising work to be listed for public display so that, even if that work does not require regulation, it is accountable and transparent for all to see. There should be a ban on micro-targeted political advertising to Facebook 'lookalike audiences' where users have requested not to receive political adverts.

The Electoral Commission should come forward with proposals for more stringent requirements for major donors to demonstrate the source of their donations. We support the Electoral Commission's suggestion that all electronic campaigning should have easily accessible digital imprint requirements, including information on the publishing organisation and who is legally responsible for the spending, so that it is obvious at a glance who has sponsored that campaigning material. This would bring all online adverts and messages into line with physically published leaflets, circulars and advertisements.

The Electoral Commission should also establish a code for advertising through social media during election periods, giving consideration to whether such activity should be restricted during the regulated period, to political organisations or campaigns that have registered with the Commission.

The Government should investigate ways in which to enforce transparency requirements on tech companies, to ensure that paid-for political advertising data on social media platforms are publicly accessible, clear and easily searchable, and identify the source: who uploaded it, who sponsored it, and its country of origin.

Tech companies must also address the issue of shell corporations and other professional attempts to hide identity in advert purchasing, especially around election advertising. There should be full disclosure of targeting used as part of advert transparency.

Competition and Markets Authority (CMA) audit of fake accounts

If companies like Facebook and Twitter fail to act against fake accounts, and properly account for the estimated total of fake accounts on their sites at any one time, this could not only damage the user experience, but potentially defraud advertisers who could be buying target audiences on the basis that the user profiles are connected to real people. We ask the Competition and Markets Authority to consider conducting an audit of the operation of the advertising market on social media.

Digital Atlantic Charter

The UK Government should consider establishing a digital Atlantic Charter as a new mechanism to reassure users that their digital rights are guaranteed. This innovation would demonstrate the UK's commitment to protecting and supporting users, and establish a formal basis for collaboration with the US on this issue. The Charter would be voluntary, but would be underpinned by a framework setting out clearly the respective legal obligations in signatory countries. This would help ensure alignment, if not in law, then in what users can expect in terms of liability and protections.

Other 'malign actors'

The interim report outlines disturbing evidence of the activities undertaken by companies in various political campaigns dating from around 2010, including the use of hacking, of disinformation, and of voter suppression through alleged violence and intimidation.

One company, SCL, used behavioural micro-targeting to support their campaign messages ahead of the US mid-term elections in 2014, later claiming that in just one of their campaigns the 1.5 million advertising impressions they generated created a 30% uplift in voter turnout, against the predicted turnout, for the targeted groups.

The Committee found evidence that AIQ used tools that "scrape" user profile data from LinkedIn.

The tool acts similarly to online human behaviour, searching LinkedIn user profiles, scraping their contacts, and all accompanying information such as users' place of work, location and job title.

The report begins to attempt to expose the shady, secretive world of these tech companies, and the high level international links between companies, their subsidiaries, and individuals. In one example, the inquiry heard of the links between SCL and Christian Kalin of Henley Partners, and their involvement in election campaigns in which Mr Kalin ran "citizenship-by-investment" programmes involving the selling of certain states' passports to investors, usually from countries that face extensive travel restrictions.

The Committee's final report, which will also include further conclusions based on the interrogation of data and other evidence, is expected before the end of the year.
