The BBC is responding to the House of Lords Select Committee on Democracy and Digital Technologies’ call for evidence. The BBC is submitting evidence as an organisation that delivers world-class, impartial and accurate news and as a major provider of internet services.
The internet has transformed the media landscape in the UK and how the BBC delivers its mission and services. The exponential growth in digital technologies enables the BBC to share impartial and accurate news with larger audiences, inform the democratic debate and offer richer, more targeted experiences for audiences in the UK and across the world.
Nevertheless, there are well-documented concerns about how digital technologies disrupt the democratic debate, whether through misinformation, abuse of those in public life or polarisation of the online debate.
The BBC, as the leading public service broadcaster (PSB) with a mission to inform, educate and entertain, can continue to play a vital role in delivering accurate and impartial news, informing the debate, bringing partners together and leading responsible technological development, with stable funding and support for its position as a PSB.
Impact of digital technologies on democracy
Misinformation and disinformation
The BBC is concerned about the impact of misinformation on the democratic process. Despite some reforms, misinformation still circulates extensively online, both on public channels such as Twitter and encrypted messaging services such as WhatsApp.
This includes disinformation from those acting for commercial gain, the amplification of unintentionally false information, and the spread of state-backed disinformation from state-funded broadcasters and inauthentic organised behaviour from hidden operatives who seek to manipulate national conversations.
In addition, technology can increasingly be used to create, as well as spread, misinformation. Advances in Artificial Intelligence (AI) have made it possible to create profoundly misleading content including fake audio and video, which can then be distributed rapidly through social media. For example, an AI-generated or edited video created to mislead viewers might go viral on a social network and there is a risk that newsrooms could unwittingly use synthetic audio as legitimate evidence in reporting.
Misinformation poses a two-fold threat to democracy. Firstly, it leaves voters and politicians ill-informed. For example, the BBC reported on the extensive misinformation circulated online during the 2019 Indian Election.
Secondly, it undermines trust in and engagement with accurate, impartial news. Public concern about misinformation in the UK rose to 70% in 2019 and people increasingly say they actively avoid news. Given their extensive use of social media for news, the impact on young audiences is of particular concern. The Commission on Fake News and the Teaching of Critical Literacy Skills in Schools found that fake news was “driving a culture of fear and uncertainty among young people”.
Algorithms and online targeting
Algorithms are increasingly responsible for what news audiences view online. The Reuters Institute Digital News Report (2019) found that a majority (55%) of users polled prefer to get news digitally through a ‘side door’ – search engines, social media or news aggregators – interfaces which use ranking algorithms to select stories rather than editors.
Inquiries including the Government’s White Paper on Online Harms have raised concerns about how algorithms create echo chambers and filter bubbles. While there is some debate around the extent of their impact, there is sufficient evidence that algorithms on social media platforms do optimise for engagement, amplifying emotional, ‘clickbait’-style content and shaping what information is seen in opaque ways. We are concerned about the consequences for universally accessible and reliable information online.
Recent media investigations have also highlighted the radicalising effects of algorithms. For example, YouTube was found to be recommending increasingly radical political content to users who had started watching ‘mainstream’ political videos. Opaque profiling and targeting of users also has political implications, for example, micro-targeting of political adverts during election campaigns.
The BBC is concerned about how accurate, impartial news from PSBs will be accessed in a world increasingly curated by algorithms. While some platforms are taking action to address particular harms, for example the surfacing of content related to self-harm or vaccinations, there is little focus on how algorithms can proactively surface high-quality and impartial news on news aggregators or social media newsfeeds.
The BBC is engaging with the Centre for Data Ethics and Innovation’s review of online targeting, which is also considering the trustworthiness of news, and looks forward to its recommendations later this year.
Anonymity, privacy and encryption
Online anonymity has fuelled an increasingly toxic online debate, in which views are polarised and those in public life face unacceptable levels of abuse. As highlighted by the BBC’s Chairman, Sir David Clementi, some journalists – and particularly female journalists – are subject to abuse on an almost daily basis. An international survey of female journalists found two thirds (64%) had experienced online abuse – death or rape threats, sexist comments, cyberstalking, account impersonation and obscene messages. Almost half (47%) did not report the abuse they had received, and two fifths (38%) admitted to self-censorship in the face of this abuse.
The use of private and encrypted methods of communication to share news also threatens people’s access to accurate information. This is particularly prevalent in some non-Western countries. For instance, WhatsApp has become a primary network for sharing news in Brazil (53%), Malaysia (50%), and South Africa (49%) and people are increasingly part of large WhatsApp groups with people they don’t know.
Such groups enable misinformation to be shared at scale and, due to the private and encrypted nature of the communication, often beyond the sight of the platform. For example, the rapid spread of misinformation on WhatsApp in India in 2017 was linked to mob-related killings. In contrast to misinformation shared visibly – which platforms can take down or label, and journalists or fact-checkers can rebut – misinformation shared privately is extremely difficult to identify and tackle.
The BBC World Service continues to find false, BBC-branded stories circulating on encrypted platforms, including false reports of BBC election polls during the 2019 Indian Election. This has significant implications for the BBC’s reputation as a trusted news provider globally.
In the UK, users are far more likely to turn to Facebook than WhatsApp for news. However, with Facebook increasingly repositioning itself as a ‘privacy-focused’ service, the same issues around private and encrypted news-sharing are likely to come to the fore.
BBC role and response
The BBC, as the most trusted source of news in the UK and a major provider of online services, is well placed to respond to some of the challenges posed to democracy by digital technologies through the provision of accurate and impartial news, tackling misinformation, promoting media literacy and working with partners on responsible technological development.
Providing impartial news
The BBC’s most important contribution to tackling misinformation and supporting democracy is the provision of accurate, impartial news. As the BBC’s Director General said “If fake news is the poison, those who stand up for integrity and impartiality in news must be the antidote.”
The BBC has a unique role as the most trusted and popular source of news in the UK. BBC News services are used by eight out of ten UK adults each week and 51% of UK adults turn to the BBC for the news they trust the most – significantly more than any other news provider.
BBC News is the most trusted news provider in the world – reaching a record 394 million people in 2019. BBC World Service is available in 42 languages with correspondents on location in more countries than any other broadcaster. It is a digital innovator, pioneering new ideas, such as delivery of news on chat apps.
This trust comes from the strength of our journalism and commitments enshrined in the BBC’s Charter and the BBC’s Editorial Guidelines. The BBC’s Public Purposes include the duty “To provide impartial news and information to help people understand and engage with the world around them”. In accordance with the Editorial Guidelines, we are committed to achieving the highest standards of accuracy and impartiality and checking the veracity of every story.
The value of accurate, impartial news has grown in the digital age. As concerns about disinformation online increase, people need somewhere to find news they trust. The BBC was one of the first news providers online, and the BBC’s digital services play a vital role in providing news, including to an average of 33 million UK browsers a week through BBC News Online.
As trusted news providers and partners, the BBC and BBC World Service also play important roles in proactively tackling misinformation.
Following the Trusted News Summit hosted by the BBC and attended by partners including the European Broadcasting Union (EBU), Facebook, Financial Times, First Draft, Google, The Hindu, and The Wall Street Journal, the BBC and partners have launched a major industry collaboration to tackle dangerous misinformation. This will include:
Early Warning System: creating a system so organisations can alert each other rapidly when they discover disinformation which threatens human life or disrupts democracy during elections. The emphasis will be on moving quickly and collectively to undermine disinformation before it can take hold.
Media Education: a joint online campaign to support and promote media education messages.
Voter Information: co-operation on civic information around elections, so there is a common way to explain how and where to vote.
Shared learning: particularly around high-profile elections.
The BBC is also part of the Trust Project, a news industry initiative supported by over 70 international news organisations and Google and Facebook. This aims to distinguish quality news from ‘noise’ online via a system of trust indicators (e.g. Ethics Policy, Corrections Policy, content category labels), which provide audiences and platforms with evidence of reliable news.
Reality Check is the BBC’s flagship fact-checking service dedicated to creating content which examines the facts and claims behind a story, across TV, radio and online, including social media. BBC World Service and BBC Monitoring are also at the forefront of delivering practical solutions to tackle the spread of disinformation. The World Service has taken a lead on addressing the issue of misinformation through its “Beyond Fake News” project. This included concrete plans to tackle disinformation via a season of programming, a focus on global media literacy and outreach events in India, Nigeria, Kenya, Ethiopia, Brazil and Belgrade. This was supported by in-depth research commissioned by the BBC into online behaviours and their impact around the world.
Examples of content include:
Global Reality Check, launched in 2018, brings the expertise of the BBC’s UK Reality Check teams to a global audience.
BBC Africa TV content that gives audiences the tools to tackle disinformation. These include What’s New? for 13-15 year olds, Factfinder which analyses disinformation and Africa Eye which has raised the bar for investigative journalism in Africa, with the episode on killings in Cameroon winning the 2019 RTS Award for News Technology.
BBC Monitoring has a dedicated disinformation team, who provide fact-checking and analysis of the trends and tactics adopted by disinformation actors.
The most important counter to misinformation and its threat to the democratic process is the provision of accurate, impartial and universally accessible news. Nevertheless, media literacy can help audiences to discern reliable news sources and filter out misinformation.
The BBC is part of a number of initiatives to promote this:
BBC Young Reporter empowers 11-18 year olds to navigate, understand and create news. It offers mentoring to up to 1,000 schools from respected and well-known BBC journalists. BBC iReporter, an online interactive game to help young people in the UK identify ‘fake news’ has been played by over 115,000 people.
Similar initiatives have been launched in India, Kenya and Brazil by the World Service.
Radio 4’s Moral Maze and the University of Dundee developed a programme to help students learn to ask critical questions and locate key elements within news articles.
My World – a TV programme jointly produced by the World Service and Angelina Jolie in partnership with Microsoft, aimed at giving teenagers around the world the skills to distinguish fact from fiction in today’s news stories – launches in late 2019.
The BBC is a UK-leader in providing guidance to children and parents on how to navigate online. This includes the BBC Own It app, which uses machine-learning technology to develop children’s digital literacy.
As the most trusted source of news in the UK, the BBC also increases audiences’ understanding of advances in digital technologies. For example, we are committed to informing audiences about developments in AI, as a subject that will have far-reaching implications for society.
The BBC works with partners across technology, media, civil society, and academia to promote the responsible development of technologies and co-ordinate practical, technological responses to issues such as ‘deep fakes' in news.
Algorithms and Public Service Curation
Algorithms can play a beneficial role in how they surface content for users and the BBC is at the forefront of responsible technological development in this area. Drawing on our unparalleled expertise in content curation, the BBC is bringing together data scientists and editorial leads to develop our recommendation and personalisation algorithms. We see these as an extension to the BBC’s approach to public service curation.
Personalised recommendations allow the BBC to better connect with audiences and showcase great content which our users would not otherwise find. Public service curation in this context is different to some commercial models of personalisation, as we are seeking to broaden our audience’s horizons, and will not optimise for attention and engagement at any cost. We are looking at many different measures of success – not just hit rate, but also relevance, diversity, recency, impartiality and editorial priority.
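A multi-measure ranking of this kind can be illustrated as a weighted combination of signals rather than a single engagement metric. The sketch below is purely illustrative – the signal names, weights and example stories are assumptions for this example, not the BBC’s actual model:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    title: str
    relevance: float   # predicted match to the user's interests (0-1)
    diversity: float   # how much it broadens what the user has seen (0-1)
    recency: float     # freshness of the story (0-1)
    editorial: float   # editorial priority set by curators (0-1)

# Illustrative weights: engagement-style relevance is only one signal among several.
WEIGHTS = {"relevance": 0.4, "diversity": 0.25, "recency": 0.2, "editorial": 0.15}

def public_service_score(c: Candidate) -> float:
    """Combine the signals into a single ranking score."""
    return (WEIGHTS["relevance"] * c.relevance
            + WEIGHTS["diversity"] * c.diversity
            + WEIGHTS["recency"] * c.recency
            + WEIGHTS["editorial"] * c.editorial)

stories = [
    Candidate("Local election explainer", 0.6, 0.9, 0.8, 0.9),
    Candidate("Celebrity gossip", 0.9, 0.1, 0.9, 0.1),
]
ranked = sorted(stories, key=public_service_score, reverse=True)
print([s.title for s in ranked])  # the explainer outranks the higher-engagement story
```

Under these weights, a high-priority explainer with strong diversity value outranks a story that would win on predicted engagement alone – the essence of not optimising for attention at any cost.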
It is also important to note that algorithms form just part of BBC content discovery in a digital space. Due to the risks described in the section above, BBC algorithmic curation sits alongside human curation so that we can continue to serve our audiences responsibly.
The BBC and partners are also exploring how machine learning can be used to help detect and flag misinformation. For example, the BBC uses machine learning to check quotes in news. The project uses natural language processing techniques to verify quotations that are automatically extracted from news articles. It has been applied to 8 years of BBC News to create a searchable database.
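The quote-checking idea can be sketched in miniature. The BBC’s actual system uses natural language processing; the version below substitutes simple regex extraction and normalised exact matching against an invented database, so every name and data item here is an illustrative assumption:

```python
import re

# Toy database of verified quotations (invented for illustration).
VERIFIED_QUOTES = {
    "if fake news is the poison, those who stand up for integrity "
    "and impartiality in news must be the antidote.",
}

def extract_quotes(text: str) -> list[str]:
    """Pull out passages between double quotation marks."""
    return re.findall(r'"([^"]+)"', text)

def normalise(q: str) -> str:
    """Lowercase and collapse whitespace for matching."""
    return " ".join(q.lower().split())

def check_quotes(article: str) -> dict[str, bool]:
    """Flag each extracted quote as verified (True) or unverified (False)."""
    return {q: normalise(q) in VERIFIED_QUOTES for q in extract_quotes(article)}

article = ('The Director-General said "If fake news is the poison, those who '
           'stand up for integrity and impartiality in news must be the antidote." '
           'Another post claimed "the BBC predicted a landslide".')
print(check_quotes(article))  # first quote verified, second flagged as unverified
```

A production system would replace exact matching with semantic similarity over the full archive, but the pipeline shape – extract, normalise, look up – is the same.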
However, outsourcing the problem to algorithms is not a sufficient solution in itself, in part because cases of misinformation are not often clear cut and rely on careful interpretation.
Over a decade ago, the BBC established a User Generated Content (UGC) Hub as the first dedicated eyewitness verification unit in a newsroom. Over time, we have adopted a number of techniques for scrutinising contributions from members of the public (pictures, videos and audio). In recent years, the team’s focus has shifted towards contributions and stories on social media – these stories are carefully scrutinised for their reliability, with staff making direct contact with individuals involved to verify their stories.
As part of the BBC’s partnership with the Canadian PSB CBC, we are working on technological solutions for demonstrating the provenance of verified news sources, e.g. through watermarking.
The BBC is a member of the Partnership on AI and in May co-hosted a workshop with the Partnership and WITNESS, the global human rights organisation, on how the media can prepare for the increasing impact of AI, with participants including Google and Facebook.
The workshop explored:
Authentication and Provenance: Technological solutions to checking the provenance of and signalling the authenticity of news sources online e.g. How can users on social media know an article purporting to be from the BBC is valid? What common tools can media organisations and platforms agree to use to signal to audiences that content is valid e.g. a watermark. How can this be achieved without brand misappropriation?
Co-ordination: The establishment of an alliance against misinformation, in which newsrooms globally can co-ordinate with each other to ensure misinformation is not spread, particularly in the wake of an international disaster/terrorist attack when misinformation is often more prevalent.
Synthetic Media Detection: How to respond to synthetic media, including deep and shallow fakes e.g. Are there sufficiently robust mechanisms between organisations to deal with deep fakes? Can we use technology to identify them?
Alerting the Wider World: e.g. How do we talk to audiences about harmful and truthful content and share information about new forms of audio-visual manipulation?
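The authentication and provenance question above – how a reader can know an article really comes from a named publisher – can be sketched with standard cryptographic signing. The example uses an HMAC shared secret from Python’s standard library purely for illustration; a real provenance scheme would use public-key signatures so that anyone can verify without holding the signing key, and the key and article text here are invented:

```python
import hashlib
import hmac

# Illustrative signing key; a deployed scheme would use asymmetric keys.
PUBLISHER_KEY = b"example-publisher-signing-key"

def sign_article(body: str) -> str:
    """Produce a provenance tag the publisher attaches to the article."""
    return hmac.new(PUBLISHER_KEY, body.encode(), hashlib.sha256).hexdigest()

def verify_article(body: str, tag: str) -> bool:
    """Check that the article body matches the publisher's tag."""
    return hmac.compare_digest(sign_article(body), tag)

body = "Election results: counting continues in key constituencies."
tag = sign_article(body)
assert verify_article(body, tag)                    # authentic copy verifies
assert not verify_article(body + " (edited)", tag)  # any tampering breaks the tag
```

The point the workshop raised still stands: the technology is the easy part, while agreeing a common tagging format across media organisations and platforms – and preventing brand misappropriation – is the harder coordination problem.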
Role of Government
The BBC and other PSBs can play a vital role in delivering accurate and impartial news, tackling misinformation, and promoting media literacy and responsible technological development. However, PSBs are making difficult financial choices in the face of an increasingly competitive media market and declining real-terms revenues.
By 2018/19, the BBC’s funding for UK services was 24% lower than it would have been had the licence fee risen with inflation since 2010/11.
For the BBC to continue to deliver impartial, accurate news to the benefit of democracy, it needs stable finances and a fair licence fee settlement. Future licence fee settlements need to be more open and evidence based, ensuring the BBC has resources to deliver its remit, including in news.
The BBC World Service’s important role in tackling misinformation globally is partly supported by UK Government funding. Government funding of additional services under the World 2020 scheme has allowed the World Service to build new audiences through 12 new language services and enhancement of existing services, invest extensively in digital innovation to keep pace with changing audience behaviours and pioneer important investigative journalism like BBC Africa Eye.
The BBC welcomed the news in the Government’s Spending Review in September that existing funding levels for the World 2020 programme would be rolled over for the full year 2020/21. Discussions with the Government about a further uplift in funding beyond 2021 to increase reach, strengthen output and boost editorial priorities like investigative journalism are ongoing. This would allow the BBC to better compete against the growing challenge from international broadcasters and other global players who are increasingly funded by state sponsors with no commitment to impartiality.
There is a role for regulation in protecting democracy from the risks caused by digital technology.
The BBC supports the Government’s aim to introduce further regulation to address the worst excesses of the internet, such as misinformation and online abuse. As outlined in the Online Harms White Paper, this includes the introduction of independent regulatory oversight of the internet platforms and codes of conduct enforced by the regulator.
We agree with the Government that it is not reasonable to expect the public to navigate all online harms themselves. Nor is it viable to expect social media to make all the judgements on what is acceptable without independent oversight. An independent regulator and code of conduct would help to tackle issues such as the spread of misinformation and online abuse. In relation to this inquiry, we particularly support the proposal in the White Paper draft code of conduct for disinformation to promote authoritative news providers.
The BBC responded to the Government’s consultation on the Online Harms White Paper in support of its central proposals and awaits the Government’s next steps.
Prominence and discoverability of BBC content
As outlined, algorithms are increasingly responsible for what content audiences view online, whether through social media platforms or news aggregator apps. This has implications for the discoverability of accurate and impartial news from the BBC and other PSBs. A market driven by global commercial principles does not have the right incentives to make PSB content easy to find and could fail to deliver the accurate and impartial news that democracy benefits from. The BBC is working with platforms to ensure that trusted news stories can be found.
We welcome Ofcom’s recommendations to give PSB content prominence in the era of on-demand and internet TV viewing. These recommendations would ensure viewers can easily find PSB content across a range of devices including smart TVs, set-top boxes and streaming sticks. Given the rapid pace of technological change, we also warmly welcome the flexible framework Ofcom recommends, which would allow the new rules to be adapted quickly to changes in technology and viewer behaviour. We ask Parliamentarians to support the legislation needed to underpin these changes.
We believe that discoverability of trusted news content merits further policy and regulatory consideration. For example, the advent of voice-activated search for news has created a new challenge. When consumers use voice to search for, e.g., “today’s headlines”, it is the platform owner who determines what product or news provider(s) are presented. There are clear issues around user choice and transparency given the binary way search results are presented.
Data offers huge potential and is central to the future of online services. But the volume of user data tech giants are building up could reinforce market-distorting monopolies and prevent others from innovating. For the BBC, this data is crucial for audience insight and our ability to develop and improve our own personalisation services.
Data regulation to-date has been primarily focussed on giving consumers greater transparency and control over their personal data. New EU regulation will be adopted in the UK and extend data transparency to business users of platforms from May 2020 onwards. The BBC is campaigning for future data regulation to give content providers (and other business users) greater rights to access data generated around the consumption of their content and services on third party platforms and to require our authorisation for its use by the platform (directly or by making it available to other business users).
 https://reutersinstitute.politics.ox.ac.uk/sites/default/files/inline-files/DNR_2019_FINAL_27_08_2019.pdf. The proportion of users saying they actively avoid the news has risen from 29% in 2017 to 32% in 2019.
 WhatsApp has since restricted the forwarding of messages to a maximum of five people