Written evidence submitted by DMG Media (OSB0133)

 

This submission is made on behalf of DMG Media, publishers of the Daily Mail, Mail on Sunday, Metro and i newspapers; MailOnline, metro.co.uk and inews websites; and New Scientist magazine.  It concentrates on issues of which we have direct experience as a leading British news publisher with extensive digital publishing operations in the USA and Australia as well as the UK.

 

       Introduction and Executive Summary

 

1.                  We have no issue with the overall objectives of the Draft Online Safety Bill, which we strongly support as far as they relate to indisputable harms such as child sexual exploitation and terrorism. However, we are deeply concerned that the lack of clear definitions of ‘legal but harmful content’, and the lack of a clear and concrete exemption for legitimate journalism produced by recognised news publishers, mean that the Bill as drafted poses a very serious threat to freedom of expression. It must be amended to remove this risk.

 

2.                  A free and independent news media is a cornerstone of democracy, and the news media cannot be free and independent if the news it publishes is subject to interference and control by the Government, or other third parties such as external commercial interests. Given that broadcasters in the UK operate under Government licence and regulation (through Ofcom), Britain’s claim to have a free news media rests on the press, and the newspapers and news websites they publish.

 

3.                  This was why the newspaper industry rejected Lord Justice Leveson’s proposals for press recognition and regulation by a body ultimately established and sanctioned by Parliament, and instead submitted to voluntary self-regulation, principally through the Editors’ Code of Practice and the Independent Press Standards Organisation, of which all our titles except the New Scientist are members.

 

4.                  The draft Online Safety legislation threatens the freedom and independence of news websites in two main ways:

 

5.                  Between the publication of the Online Harms White Paper and the Online Safety Bill we had extensive discussions with the Department for Digital, Culture, Media and Sport, during which we argued that content produced by legitimate news publishers should be completely exempted from the proposed legislation. We argued that news publishers were not responsible for any of the harms the legislation was intended to address. On the contrary, if the government was concerned about harmful content and misinformation, it should be encouraging publishers of responsible, independently regulated news.

 

6.                  As far as direct traffic is concerned, we were assured that our own websites would be out of scope of the legislation, and therefore untouched by it. We believe the draft legislation is intended to achieve this aim, but the way it has been drafted is unclear and appears to impose restrictions. We examine this further in Part A below, and make recommendations on how the legislation needs to be amended to address these issues, which are further set out in Appendix 1.

 

7.                  Indirect traffic referred by social media and search engines was more problematic. We were assured that the duty of care imposed on digital platforms would not oblige them to apply online safety codes of conduct to our content when they distribute it. However, nor would the Bill as currently drafted prevent the platforms from applying the codes of conduct if they choose to do so.

 

8.                  Acting as a content moderator is something Facebook, in particular, is already more than willing to do. Not only does it make arbitrary decisions to block or restrict content, without reference to the news publisher concerned, it resists any attempt to challenge or overturn those decisions. Defects in its rules can be challenged through its Oversight Board, but this takes months. We examine this problem in more detail in Part B below.

 

9.                  Under the Draft Bill, instead of a complete exemption, if our content is blocked or restricted by a social media platform we would have to rely on ‘Journalistic Protections’. This would be a system of appeal, set up and administered by the very commercial monopolies (online platforms) and state regulator (Ofcom), which make online safety legislation incompatible with a free and independent news media in the first place.

 

10.              This would mean that when the public buy our newspapers in print form, or visit our websites directly, they would enjoy a genuinely free press, untouched by any restrictions or censorship imposed by the state or external commercial interests. When they try to access the same journalistic content via social media or search, they may find it blocked by commercial interests operating under rules ultimately set by the state.

 

11.              We believe it is therefore incompatible with freedom of expression and democracy to apply the online safety duty of care to legitimate news publishers. Not only must news publishers have a full and complete exemption from the duty of care - as promised by the DCMS in public comments but still lacking from the actual text of the draft Bill – but the duty of care on the platforms must also include a prohibition, backed by penalties, on interfering with news publisher content.

 

12.            We examine this, and evidence of how online platforms are already turning themselves into quasi-regulators, in Parts A, B and C, while Part D looks at the issue of payment for content.  We have put forward some suggested amendments to the Bill to address these concerns, as further detailed in the Appendices.

 

  Part A: The exemption for news publishers’ own websites, when visited directly by the public

 

13.              As far as news publishers’ own websites are concerned, the Government’s response to the Online Harms White Paper[1] was very clear on how journalistic freedom would be protected:

 

1.10 Freedom of expression is at the heart of the regulatory framework and there will be strong safeguards to ensure that media freedom is upheld. Content and articles produced and published by news services on their own sites do not constitute user-generated content and so are out of scope. The government recognises the importance of below-the-line comments for enabling reader engagement with the news. User comments below articles on news publishers’ sites will be explicitly exempted from scope. This will be achieved via the low-risk functionality exemption (see above).

 

14.              We at DMG Media took this to mean that the news content on our websites would be unaffected by the Bill because it is not user-generated and therefore out of scope. Readers’ comments, which are user-generated, would be protected by a specific exemption based on their limited functionality.

 

15.              However the fact that news websites include readers’ comments has led some to believe that paragraph 3 (7) of the draft Bill, which states:

 

A user-to-user service or a search service is exempt if it is a service of a description that is exempt as provided for by Schedule 1.

 

means that the overall exemption for news publishers’ websites rests on the continued limited functionality of readers’ comments, as defined in Schedule 1, Clause 5 of the Bill, as follows:

 

A user-to-user service is exempt if the functionalities of the service are limited, such that users are able to communicate by means of the service only in the following ways—

(a) posting comments or reviews relating to content produced and published by the provider of the service (or by a person acting on behalf of the provider of the service);

(b) sharing such comments or reviews on a different internet service;

(c) expressing a view on such comments or reviews, or on content mentioned in sub-paragraph (a), by means of—

(i) applying a “like” or “dislike” button or other button of that nature,

(ii) applying an emoji or symbol of any kind,

(iii) engaging in yes/no voting, or

(iv) rating or scoring the content (or the comments or reviews) in any way (including giving star or numerical ratings)

 

 

16.              If this interpretation is correct, it is very unsatisfactory, for a number of reasons:

 

17.              We at DMG Media do not agree with this interpretation. Our reading of paragraph 1.10 is that it is describing two exemptions:

 

18.              However, unlike the Government Response to the White Paper, the Draft Bill makes no reference to news websites being out of scope, presumably on the basis of the legal principle that any digital service which is not specifically included in the Bill is therefore excluded.

 

19.              Yet if that is the case, it is less than clear from the current draft Bill. This is novel legislation, as yet untried anywhere else in the world. Press freedom is under constant threat, and we have to be eternally vigilant to protect it. If the Government intends news publishers’ websites to be out of the scope of the Bill, as it has said it does, then the Bill should be amended to make that expressly clear.

 

20.              The Bill contains a very good definition of a news publisher in Clause 40, and that should be the basis of a complete and watertight exemption.

 

21.              This exemption should protect not only news publishers’ own websites, but also their content when it is distributed by online platforms through social media and search, to which we turn in Part B.

 

22.              The only circumstances in which platforms should be free to take down or restrict access to news publisher content are when they are required to do so for legal or technical reasons other than compliance with the Online Safety Bill (e.g. the GDPR right to be forgotten).

 

23.              We believe this complete exemption would give the Bill the clarity it currently lacks with regard to news publisher content, and can be achieved with a limited number of amendments:

 

24.              Together with the News Media Association - which represents national, regional and local news publishers across the UK, of all political viewpoints – and Andrew Caldecott QC, we have drafted amendments to the Draft Bill which we believe would achieve this purpose. They are attached to this submission as Appendix 1.

 

25.              Clearly, for such an exemption to function efficiently and seamlessly the algorithms used by in-scope platforms would need to be able to identify news publisher content electronically whenever they encounter it. Together with the News Media Association we have developed a detailed proposal for a digital kite-marking scheme which would deliver this. It is attached to this submission as Appendix 2.

 

 

  Part B: Will ‘Journalistic Protections’ in the Bill prevent news publisher content being blocked or taken down when distributed on social media?

 

26.              The position of news publishers’ own websites may be less than clear in the Bill as drafted, but we are ready to believe the Government did not intend them to be within its scope. The position of news publisher content when it is distributed by social media and search is far less satisfactory.

 

27.              We and other news publishers have argued from the beginning that the only way to guarantee press freedom and prevent digital news content becoming subject to a parallel system of regulation, sanctioned by the state and administered by commercial monopolies, is a complete and watertight exemption from the Bill.

 

28.              For reasons that have never properly been explained, the Government has seemed very reluctant to provide this. This is what the Government Response to the White Paper said:

 

1.11 Journalistic content is shared across the internet, on social media, forums and other websites. Journalists use social media services to report directly to their audiences. This content is subject to in-scope services’ existing content moderation processes. This can result in journalistic content being removed for vague reasons, with limited opportunities for appeal. Media stakeholders have raised concerns that regulation may result in increased takedowns of journalistic content.

 

1.12 In order to protect media freedom, legislation will include robust protections for journalistic content shared on in-scope services. The government will continue to engage with a wide range of stakeholders to develop proposals that protect the invaluable role of a free media and ensure that the UK is the safest place in the world to be online.

 

29.              The ‘robust protection for journalistic content’ in the Bill begins with a partial exemption. It is clear from Clause 39 (2) of the Bill that news publisher content is not classed as ‘regulated content’ and is therefore not subject to the duty of care under which the platforms can be fined up to 10pc of global revenue if they fail to provide and enforce codes of conduct. This is supported by a strong definition of a ‘recognised news publisher’ in Clause 40.

 

30.              However, while the draft Bill imposes no obligation on social media companies to ‘moderate’ (i.e. block or take down) content from recognised news publishers, neither does it place them under any obligation NOT to do so. Instead it relies on a duty to ‘protect journalistic content’, which is set out in Clause 14:

 

A duty to operate a service using systems and processes designed to ensure that the importance of the free expression of journalistic content is taken into account when making decisions about… how to treat such content (especially decisions about whether to take it down or restrict users’ access to it)

 

31.              This is very weak indeed – all a social media company will have to do to justify blocking or taking down news publishers’ content (which, far from being exempt, is specifically included in Clause 14 (8)) is to show that it gave consideration to freedom of expression, but decided it was secondary to other issues. We very much doubt that would be anything more than an exercise in box-ticking.

 

32.              The fact that the authors of the Bill themselves expect that social media companies WILL block or take down legitimate content is demonstrated by the inclusion of an appeals process (Clause 14 (3)):

 

A duty, in relation to a decision by a provider to take down content or to restrict access to it, to make a dedicated and expedited complaints procedure available to a person who considers the content to be journalistic content and who is—

(a) the user who generated, uploaded or shared the content on the service, or

(b) the creator of the content (see subsection (11)).

 

33.              This appeals process is also very weak. News is a perishable commodity. If Facebook blocks a story it does not like, for whatever reason, there will be no value at all in having it restored after an appeals process which may take weeks or even months – particularly if stories from rival publishers on the same topic are not blocked. It will also impose an extra burden on publishers, which will have to employ staff to monitor whether content is being blocked and to take cases through a lengthy appeals process. Doubtless high profile cases will demand the use of lawyers, which will multiply the cost.

 

34.              A further problem with the ‘journalistic protections’ is that the scope of the Bill is extraordinarily wide. No one would dispute that its primary targets, child sexual exploitation and terrorism, are evils which should be expunged from the internet. However the Bill does not stop at these obvious and incontestable aims. It also requires social media companies, under threat of draconian penalties, to establish and enforce codes of conduct to prevent the sharing of content which is ‘legal but harmful’.

 

 

35.              The definition in the Bill of ‘legal but harmful’ is vague in the extreme:

 

Content is within this subsection if the provider of the service has reasonable grounds to believe that the nature of the content is such that there is a material risk of the content having, or indirectly having, a significant adverse physical or psychological impact on an adult of ordinary sensibilities (“A”).

 

For the purposes of subsection (3), in the case of content which may reasonably be assumed to particularly affect people with a certain characteristic (or combination of characteristics), or to particularly affect a certain group of people, the provider is to assume that A possesses that characteristic (or combination of characteristics), or is a member of that group (as the case may be).[2]

 

36.              Doubtless the intention is to protect individuals from trolling. But this is also precisely the argument – that no opinion can be expressed if it might cause psychological harm (i.e. offence) to any person or group of people – which has been used to justify the no-platforming of speakers in universities. Allowing such an approach to be extended to the news publishers would be a very serious development, since tech platforms would then be able to prevent access to content they do not like by arguing that, under the terms of the Bill as drafted, it could upset “an adult of ordinary sensibilities”.

 

37.              We also believe the very concept of content that is legal but harmful is deeply flawed. If content is genuinely harmful it should be illegal – indeed, as the House of Lords Communications Committee report on Freedom of Expression[3] noted, there is no shortage of legislation prohibiting harmful content; the problem is that it is not enforced. Parliament cannot just pass the buck by allowing the introduction of the highly contentious concept of legal but harmful content, with the widest and vaguest definition possible, then leaving its practical application to Ofcom and Facebook.

 

38.              There is a very real danger – indeed almost a certainty – that every group with a grievance will claim that any news coverage that does not conform to their agenda will cause their members psychological harm. Members of Parliament should not imagine this will only affect one side of any particular argument: it will be easy enough for groups of all kinds of viewpoints, from across the political spectrum, to make such an argument. Unless news publishers are protected by a complete exemption, coverage of any contentious subject, from transgender rights to conflict in the Middle East, will become impossible.

 

39.              The threat to freedom of expression is exacerbated by the fact that the Draft Bill also gives the Secretary of State for Digital, Culture, Media and Sport arbitrary and undefined powers to designate ‘priority content that is harmful to adults’.[4] In the hands of a ruthless and authoritarian government this could rapidly become a tool to suppress political opposition.

 

40.              We, and the rest of the news publishing industry, have argued ever since the original Online Harms White Paper was published that the combination of a vague, all-encompassing definition of ‘legal but harmful’ and draconian penalties for code breaches would cause social media companies to set their algorithms to block any content which might present a risk, wherever it came from and however legitimate it may be as a contribution to public debate.

 

The risk of creating a parallel system of press regulation

 

41.              This is not our only concern. We also fear one of the consequences of the Bill will be the development of a parallel system of press regulation. Our titles (with the exception of the New Scientist) are already regulated by IPSO, which has a well-established Code of Practice, complaints procedures, and rulings made by an independent committee with lay and industry members.

 

42.              What will Facebook’s code of conduct look like? We do not know. Will it define key concepts like accuracy, privacy and the public interest in the same way as the Editors’ Code we currently follow, or differently? Is it compatible with freedom of expression and democracy in the UK for an American-owned commercial monopoly to be deciding what journalism the British public are entitled to read and what they are not? And is Ofcom the right body to oversee this, given that it is a state institution regulating broadcasters which operate under licence, according to a code of practice which demands impartiality and prohibits causing offence, both of which are issues expressly excluded from the Editors’ Code?

 

43.              We ask these questions not only because they are very important points of principle but because our experience in the USA, where we are also one of the largest digital news publishers, shows that Facebook is rapidly turning into a quasi-regulator of news, and a very unsatisfactory one at that. This was acknowledged in the Government Response to the Online Harms White Paper, which described how online platforms’ existing content moderation results in ‘journalistic content being removed for vague reasons, with limited opportunities for appeal’[5].

 

44.              For a long time Facebook was the great champion of the First Amendment and was notoriously prepared to allow just about any content which was not clearly illegal, and sometimes even content that arguably crossed that line. It and other platforms also relied on s.230 of the United States Communications Decency Act, which gave them immunity from libel law so long as they did not themselves create or edit the content they hosted.

 

45.              Then last year Facebook suffered a major advertiser boycott[6], following which it and Twitter banned Donald Trump. This is not the place to argue the rights and wrongs of that decision. However the company also started blocking a lot of other content, some of it at the behest of fact-checkers which in some cases we strongly suspect are lobby groups under another guise.

 

46.              To cite a well-known example, one major issue was the possibility of Covid being the result of a lab escape at the Wuhan Institute of Virology, seen by some as a conspiracy theory associated with Donald Trump. A decision by Facebook to block such content has now been reversed following a change of view by the Biden administration.[7] Determining the origin of Covid is central to finding ways to prevent it, and it is deeply worrying if Facebook is suppressing or permitting debate to keep itself in step with what it perceives to be the political orthodoxy of the day.

 

47.              Another notorious example of platforms allowing political expediency to interfere with legitimate journalism was during last year’s presidential election campaign, when Twitter prevented users from tweeting New York Post stories about emails related to Joe Biden’s foreign business deals, found on his son Hunter’s abandoned laptop – and locked the newspaper’s Twitter account. Two weeks later, after an outcry over its interference in freedom of expression, Twitter reversed its decision.[8]

 

48.              There have been other instances involving MailOnline. When President Biden announced a very ambitious climate change programme, criticised by some for lack of detail on how the targets were to be achieved, our US website published a speculative story looking at the sort of measures which might be required to meet those targets. One suggestion was restrictions on meat consumption.

 

49.              A Facebook-funded single-issue fact-checker called Climate Feedback read the story as though it was a factual news report and said it was inaccurate on the basis that Biden’s plan did not explicitly include restrictions on meat consumption. As a result Facebook blocked access to the story.

 

50.              Unlike IPSO here in the UK, neither Climate Feedback nor Facebook itself gives news publishers any opportunity to defend their journalism before issuing what they call ‘strikes’ against stories. We complained to Climate Feedback, explaining how it had misread the story. It partially corrected its strike – so it is now contradictory – yet Facebook’s ban is still in place.

 

51.              We have no issue with Climate Feedback disagreeing with our story, even if it failed to understand that speculation is a perfectly acceptable journalistic device to demonstrate the lack of substance in a politician’s promise. What is not acceptable is for Facebook to take an arbitrary decision to censor a story at the behest of one body with an agenda, and without any process.

 

52.              Facebook has also blocked stories about the four homes accumulated by one of Black Lives Matter’s founders, who has described herself as a ‘trained Marxist’. Details of the properties, which did not include street addresses, first appeared on a property website. When the New York Post and DailyMail.com, our US website, published versions of the story, the BLM leader concerned complained to Facebook, which blocked the stories, again without reference to us.

 

53.              Facebook’s initial strike was not communicated to us, and when we published another article about developments in the story, we were warned that a second strike might result in all our Facebook content being demonetised.

 

54.              The story was blocked under a Facebook Community Standards rule under which it may restrict access to images of private residences. The rule was no doubt introduced to prevent private individuals harassing other private individuals by posting pictures of their homes with malevolent comments.

 

55.              However it makes no provision for journalism, unlike the Editors’ Code of Practice, which protects the individual’s reasonable expectation of privacy in their home, but also allows exceptions where the public interest is engaged. One prominent example of this was coverage of Dominic Cummings’s now notorious trip to Barnard Castle, which involved publication in almost all news outlets of numerous pictures of his and his family’s homes.

 

56.              The BLM case has clearly concerned Facebook, which has referred it to its Oversight Board, to whom we have made a submission, suggesting that if Facebook is going to block journalistic content in such circumstances, then at the very least it needs to amend its Community Standards to allow editors to defend their journalism, particularly where the public interest is involved.

 

57.              But this brings its own risks. It is quite clear that through the operation of its Community Standards and its Oversight Board, Facebook is turning itself into a quasi-regulator. We are very concerned that if this is given the sanction of official approval by Parliament through the Online Safety Bill, with Ofcom as the oversight regulator, Facebook will start receiving complaints from individuals seeking to chill legitimate journalistic inquiry into their activities, and lobby groups seeking to censor journalism which does not fit their agendas.

 

58.              Facebook began blocking journalistic content in the USA following an advertising boycott organised by the campaign #StopHateForProfit. Although it is not linked, the UK organisation Stop Funding Hate has been trying for some years to censor news coverage it disagrees with by persuading advertisers to boycott publications. Its most recent campaign was against broadcaster GB News, where it tried to organise a boycott even before the station launched.

 

59.              We are concerned that Clauses 12 and 23 of the Bill, which set out regulated services’ duties concerning rights to freedom of expression and privacy, also make no reference to journalism or the public interest. This could mean that an important investigative story about a powerful individual, which was fully justified under the law and the Editors’ Code, could still be banned on Facebook.

 

60.              We believe it is incompatible with freedom of expression and media plurality for legitimate, responsible news content to be subject to blocking and take-down by a commercial organisation which is open to business pressures such as advertising boycotts, operates without due process, and has no authority to make judgments about the value of journalism.

 

61.              Attempting to remedy this by making Facebook effectively part of a state-sanctioned system of regulation would make matters worse, not better, as it would create a competing, parallel system of regulation for the press, administered by a company which is bound to put its own commercial interest in maximising profits and maintaining market dominance before other considerations, including freedom of expression.

 

The risk in relying on artificial intelligence, algorithms and ‘safety by design’

 

62.              A further flaw in the Draft Bill, and cause for concern among news publishers, is its reliance on the concept of ‘safety by design’. This appears to be a belief that if the right systems and processes are in place, the online platforms will be able to block harmful content from the internet automatically, through the operation of their algorithms. This faith in the effectiveness of artificial intelligence is seriously misplaced.

 

63.              Algorithms and artificial intelligence are only as good as the human beings who create them and, in truth, as news publishers know to their cost, they are highly unreliable. MailOnline was the victim of an egregious example of this unreliability very recently.

 

64.              We discovered, via an article in the New York Times[9], prompted by a tweet from a former Facebook staffer, that Facebook’s artificial intelligence had placed an automated prompt saying ‘keep seeing videos about Primates’ on a MailOnline news video dated June 27, 2020, which showed a white man appearing to harass a black man and call the police to get him arrested.  The MailOnline video had no connection to monkeys or primates.

 

65.              Associating black people with primates is of course a highly offensive racist trope, and it is very worrying that Facebook did not notify MailOnline of its error until it was exposed by the New York Times. It then issued an apology which confirmed the unreliability of its algorithms: 'This was an algorithmic error on Facebook and did not reflect the content of the Daily Mail's post… As we have said, while we have made improvements to our AI we know it's not perfect and we have more progress to make.’

 

66.              However this was not the end of the matter. In order to rectify the AI problem which had created the offensive prompt, Facebook took its entire recommendation mechanism offline, which then had a knock-on effect on content safety measures across the whole Facebook system. One of these was that many publishers, including MailOnline, found that any animal video content was demonetised. Animal videos are very popular with both the public and advertisers, and consequently are a substantial source of revenue on Facebook. So, having falsely associated MailOnline with a racial slur, Facebook then made MailOnline and other publishers pay for its own AI error by cutting off one of their most important revenue streams.

 

67.              The fact that Facebook’s AI not only made such an error, but its attempts to correct it caused further damage, demonstrates to us that platform algorithms are wholly unsuited to the task of administering a system of regulation based on systems and processes. Nor are they currently even remotely capable of making the fine judgments necessary to allow platforms any role in determining what news content users should or should not be allowed to share on social media. 

 

68.              Given that another fundamental principle of the Online Safety Bill duty of care is to encourage social media platforms to vet content as cautiously as possible by threatening them with massive financial penalties if they do not do so, we believe legitimate journalism and press freedom must be protected not only by a complete exemption from its provisions, but also by extending the duty of care to impose similar penalties on platforms that interfere with news publisher content (as defined in Clause 40).

 

69.              We believe the amendments to the Bill we propose in paragraph 22 and Appendix 1 would achieve those aims.

  1. Is there sufficient protection for news publisher content when distributed through search?

 

70.              The position under the draft Bill of news publisher content when it is distributed by search engines is even less clear than it is for social media. It is true that the duty of care applied to search engines excludes news publishers (Clause 18 (2)). It is also true that the same duty of care only covers content that is illegal, at least as far as adults are concerned, though for children it also covers content that is harmful (Clause 19 (3) and (4)). Search engines are required to carry out risk assessments and then take steps (i.e. adjust their algorithms) to minimise the risk of users encountering content that is either illegal, or harmful to children.

 

71.              As with social media, however, while the duty of care does not apply to news publisher content, there is also nothing to prevent search engines setting their algorithms to prevent content from any publisher being ranked in their search results if, whether for reasons of operational convenience, or commercial or political expedience, they choose to do so. Beyond a general duty to protect freedom of expression (Clause 23), there is no specific protection for journalism or news publisher content.

 

72.              It is a matter of deep concern to us that there is a growing body of evidence that Google, the dominant search engine, sets its algorithms to favour certain news publishers and discriminate against others. As with social media we fear that the Online Safety Bill will legitimise and institutionalise the skewing of search algorithms to further the aims of a commercial monopoly, to the detriment of freedom of expression, a pluralistic media, and open democratic debate.

 

73.              MailOnline is the most visited news website in the UK (excluding broadcasters)[10] and the fifth most visited English-language news website in the world[11]. It would be logical therefore to expect, when a member of the public uses Google to search for a news subject such as ‘Covid’ or ‘Brexit’, that MailOnline stories would generally appear high up in the first page of search results.

 

74.              That is not the case. Data from search analytics companies Sistrix and NewsDashboard UK shows that Google overwhelmingly favours two news websites - the Guardian and BBC - in search results and discriminates heavily against most other major British news websites, including, and in some respects particularly, MailOnline. Indeed MailOnline's share of search visibility for many important news search terms is close to zero – for example, for the term 'Covid' it was just 0.22pc for the month of July this year. Google's algorithms have in fact consistently reduced MailOnline's search visibility since 2013.

 

75.              Search visibility is significant because it measures not the choices made by users, but the choices made by Google’s algorithms when users make requests for particular search terms. The Sistrix search visibility index[12] is the industry standard and measures Google’s ranking across sets of representative keywords.
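Sistrix does not publish its exact formula, but the underlying principle can be illustrated with a simple sketch, in which every keyword, search volume and click-through weight is invented for illustration: a site's index is, in essence, the sum over a representative keyword set of each keyword's search volume weighted by the expected click-through rate at the position where Google ranks the site.

```python
# Illustrative sketch only: real indices such as Sistrix's use proprietary
# keyword sets and weightings. All figures below are hypothetical.

# Approximate click-through rate by ranking position (invented values)
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility_index(rankings, volumes):
    """Sum of (monthly search volume x positional CTR) over a keyword set."""
    total = 0.0
    for keyword, position in rankings.items():
        ctr = CTR_BY_POSITION.get(position, 0.0)  # positions beyond 5 count ~0
        total += volumes[keyword] * ctr
    return total

volumes = {"covid": 100_000, "brexit": 50_000}   # monthly searches (invented)
favoured = {"covid": 1, "brexit": 4}             # ranks 1st and 4th
unfavoured = {"covid": 5, "brexit": 2}           # ranks 5th and 2nd

print(visibility_index(favoured, volumes))    # 33500.0
print(visibility_index(unfavoured, volumes))  # 12500.0
```

The point the sketch makes is that the index reflects the ranking decisions made by the search algorithms, not user behaviour: identical keyword demand produces a very different index depending solely on the positions the algorithms assign.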

 

76.              Particularly striking evidence that MailOnline was being discriminated against by Google came in 2019, when an algorithm change in early June cut MailOnline’s search visibility by 50pc, while other news websites’ visibility improved. Three months later, following protests to Google at the highest level, MailOnline’s search visibility was equally suddenly restored. At neither point did we make any changes to the structure or presentation of the site which would explain its rejection or subsequent re-acceptance by Google’s algorithms.

 

77.              At the time we were only able to guess why Google had reduced MailOnline’s search visibility. However when reviewing evidence for the Competition and Markets Authority’s (CMA) market study into online platforms and digital advertising, it became apparent to us that the June algorithm change coincided with the introduction by Google of its new Unified Pricing rules for digital ad markets. These rules had the effect of limiting the use by publishers of header bidding, a means of setting price floors which enabled us to fill more of our ad inventory with better-paying non-Google demand.

 

78.              We have since learned that other major publishers which made use of header bidding, such as the New York Times and Condé Nast magazines, also saw search visibility drop in June 2019.

 

79.              The consequence of Unified Pricing was that by the end of the three-month period June-September 2019, Google had forced MailOnline to sell twice as much ad inventory through Google’s ad exchange, while Google paid half as much for each ad slot.

 

80.              For most of 2020 MailOnline’s overall UK search visibility index, as measured by Sistrix, hovered at around 100 – only a quarter of the best figures recorded in 2012-15, but similar to the level before the dramatic drop in June 2019. However, from January this year we started to see another steady decline in MailOnline’s overall search visibility index, which has now stood for several months at just over 50, around half the level seen through most of 2020, and one-eighth of the 2012-2015 peak. (See Tables 1 and 2).

 

Table 1. MailOnline UK overall search visibility index (desktop) – last 10 years (source: Sistrix)

Table 2. MailOnline UK overall search visibility index (desktop) – last 12 months (source: Sistrix)


81.              When MailOnline's search visibility is plotted against its UK rivals a disturbing pattern emerges. Pre-2015 Google heavily favoured the BBC, with the Guardian, MailOnline and Telegraph broadly grouped together[13]. From 2015 onwards two distinct groups started to emerge. The Guardian and BBC are consistently favoured, with a visibility index currently standing at around 400, while the Mail, Telegraph and Sun are grouped together with consistently poor visibility, currently standing at around 50-75. (See Table 3 - this chart does not include the Mirror and Express, both of which score slightly below MailOnline.)

Table 3. UK competitive search visibility last 10 years (source: Sistrix)


 

82.              The implications for freedom of expression, media plurality and democratic debate should be obvious. How Google’s search algorithms work is the company’s most closely-guarded secret. Google tells the public:

 

‘To give you the most useful information, Search algorithms look at many factors, including the words of your query, relevance and usability of pages, expertise of sources and your location and settings.’

 

The public place great faith in Google, and imagine that when they search for news on politics, health, business, or any number of other topics, Google’s emphasis on relevance and expertise means the content they are shown has been picked because it gives the most reliable and useful information. Unless they are students of search visibility they have no idea that when they search for news Google’s algorithms invariably steer them towards two particular news sources, the Guardian and BBC.

 

83.              Moreover, there is nothing in the Bill as presently drafted to prevent Google from picking two other preferred news providers in the future, if it should suit its interests – indeed, as Table 3 shows, at times the Guardian's own search visibility index has risen, then fallen sharply, as determined by Google's whim.

 

84.              This pattern is repeated across many search terms, as the following charts demonstrate. The most striking is Table 4, which shows shares of UK mobile search visibility for the term ‘Covid’ during July this year, in which the Telegraph and Independent each score barely over 1pc and MailOnline scores only 0.22pc, while the BBC scores 12.65pc – 58 times the share of MailOnline – putting it ahead of the World Health Organisation and NHS websites, and the Guardian on 8.63pc.

 

 

Table 4. ‘Covid’ - UK Mobile Overall Share of Search Visibility, July 4 – August 2, 2021. (source: NewsDashboard UK)

85.              When it comes to politics, the Guardian is the winner for the term ‘Brexit’, along with other pro-Remain websites. Table 5 shows the Guardian scored 19.45pc, while the FT, not normally a good performer because of its paywall, won 17.83pc of Google search requests, closely followed by the Independent. Despite Brexit being a subject to which MailOnline devoted a great deal of coverage, MailOnline received close to zero – a derisory 0.1pc. This is not to repeat the arguments over Brexit, but to make the point that Google can and does direct search traffic overwhelmingly to one side of an issue only.

Table 5. ‘Brexit’ - UK Mobile Overall Share of Search Visibility, July 4 – August 2, 2021. (source: NewsDashboard UK)

 

86.              In contrast, traffic for Facebook, where results are determined by users’ decisions to share content rather than choices made by the platform’s algorithms, and therefore indicate which news the public rather than the platform prefers, tells a very different story. Despite fluctuations caused by individual stories going viral, MailOnline traffic has remained largely consistent over the last year, as Table 7 shows.

Table 7. Facebook MailOnline article views – 12 months to August 1. (source: Adobe)

87.              Indeed, global figures for June this year show MailOnline is the second most popular English-language news website across the whole of Facebook, receiving twice as many visits as the BBC and Guardian combined (see Table 8).

Table 8. Top publishers on Facebook, June 2021 (source: Newswhip)


88.              What are the reasons for this stark discrepancy? Google never explains how its algorithms work, so we cannot be certain why it discriminates so consistently against some publishers and in favour of others. We believe we now have a convincing case that its two dramatic algorithm changes in June and September 2019 were dictated by its commercial self-interest, and part of its successful campaign to maximise its profits by preventing publishers from using header bidding in digital advertising.

 

89.              For the rest, it certainly appears that Google’s bias against MailOnline is much more pronounced when its algorithms are ranking political stories than stories of more general interest. Whether this is a deliberate company policy, or simply the result of unconscious bias on the part of the Californian web engineers whose algorithms prefer websites that echo their own left-liberal views, we cannot know. But it is clearly happening.

 

90.              Our concerns about the commercial effects of discrimination in search led us to argue successfully that the Competition and Markets Authority (CMA) should include search in its market study into online platforms and digital advertising. We maintained that it was impossible to plan our business without fair warning and explanation of algorithm changes, and without remedies when those changes cause commercial damage. The CMA’s Final Report found:

 

‘It is clear that many publishers rely on Google and Facebook for a significant proportion of their traffic and that changes to key search algorithms by either of these can have a significant impact on publisher businesses. We would, therefore, consider it reasonable that publishers have sufficient explanation of how these algorithms work and sufficient notification of changes to them where they might notably impact upon their businesses. We consider that provision to publishers of sufficient explanation about how the key search algorithms work as well as explanation and notification of changes to these are areas that would appropriately be covered by the proposed code of conduct.’[14]

 

 

91.              Slowly, the government is putting in place regulatory structures to deal with these problems. The Digital Markets Unit recommended by the CMA has now started work, and is drawing up codes of conduct in advance of the forthcoming Digital Competition Bill, which will give it statutory powers. Its original remit was economic – to prevent Google reinforcing its dominant market position by directing traffic to its favoured publishers, and away from those such as MailOnline which seek to protect their revenue by using methods such as header bidding to secure advertising from non-Google sources.

 

92.              However, news publishers are not just businesses, they are also participants in the political process – in particular those which are not broadcasters, and are therefore free to editorialise and campaign on the great issues of the day. Google’s policy in the UK over the last decade has been to direct search to the two publishers which it currently favours – and marginalise other voices such as the Mail, Telegraph, Mirror and the Sun. This is a serious threat to a pluralistic media, and in turn to democracy. Whatever one’s political views, democracy cannot thrive unless all voices can be heard.

 

93.              For many years the only digital regulator in the UK has been the Information Commissioner, which is solely focused on privacy. Google has consistently used privacy regulation as a reason to deny user data to rival companies in the digital advertising market, and move the digital advertising industry into its own walled garden. The damaging effect single-issue regulation has had on other matters of concern in the digital ecosystem – in this case commercial competition – was recognised by the CMA in its digital advertising market study and addressed by the Government when it set up the Digital Markets Unit, in which the ICO and Ofcom participate as well as the CMA.

 

94.              We are very concerned that even before Parliament gives the DMU the promised statutory powers to impose codes of conduct that will require platform algorithms to operate in a way which is fair, consistent, transparent and non-discriminatory, online safety regulation without a clear, cast-iron exemption for news publishers will provide Google with a new lever to legitimise discrimination against those it does not favour.

 

95.              Therefore it is vital not only that news publishers’ content is fully exempted from the duty of care as it applies to search engines, but that the duty of care is also extended so that search engines are prohibited from using it as a lever to operate their algorithms in ways that are arbitrary and/or discriminatory.

 

 

 

  1. Economic harms: the case for amending the Draft Bill to ensure fair payment from news content online

 

96.              The Draft Bill is a courageous and wide-ranging attempt to prevent the dissemination of harmful content on the internet. However it completely overlooks another consequence of the domination of the digital landscape by two monopolies, which is the hugely damaging effect that the difficulty in monetising online news content has had on journalism in the UK.

97.              We do not intend to rehearse here all the evidence for this, which has been investigated in detail by the Cairncross Review and the CMA Digital Advertising Market Study. Suffice it to say that as advertising has migrated online, news publishers have been forced to substitute digital pennies for newsprint pounds, and journalism has paid the price.

 

98.              This is partly because the digital advertising market is dominated by one company, Google, which effectively controls every stage of the process and extracts a far greater proportion of revenue than intermediaries in the print advertising market ever did.

 

99.              But it is also due to the fact that digital news websites receive no payment for their news content when it is exploited by the platforms. Legislators around the world have tried to address this, with limited success in Europe, but considerably more in Australia.

 

100.          The original Australian mandatory bargaining code would have guaranteed fair payment for all news publishers of any substance. Regrettably, following well-publicised stand-offs with Google and Facebook, it was diluted by amendments as it passed through the Australian Parliament. The result is that the mandatory arbitration process has still to be activated, and while some of the biggest publishing groups in Australia have won payment-for-content contracts with Google and Facebook, others (including MailOnline) have still to be offered acceptable terms.

 

101.          Meanwhile a number of publishers in the UK have entered into limited payment-for-content deals with the platforms. This includes MailOnline, which has a contract with Facebook but not with Google. However, although better than absolutely nothing, these contracts are unsatisfactory for a number of reasons:

 

 

 

 

 

 

102.          For these reasons we believe it is essential that the UK has a mandatory bargaining code, ideally based on the original Australian model. This is something we have discussed with the DCMS, the CMA and the Digital Markets Unit, and we are optimistic that powers to bring in some form of mandatory bargaining code will be included in the promised Digital Competition Bill.

 

103.          However progress on bringing forward such a Bill has been slow. It is now more than three years since the Cairncross review was commissioned in June 2018, and despite reports from Cairncross, Furman, the CMA (interim and final), and the Digital Markets Taskforce, the government is still taking evidence on the issue, with the latest consultation not due to close until next month. Although we accept that the promised Digital Competition Bill would be the natural home for a mandatory bargaining code, our understanding is that work has not yet begun on drafting that legislation, let alone introducing it.

 

104.          In the meantime an equally good case could be made that, if the Government is serious about improving users’ experience of the internet for the benefit of society as a whole, then guaranteeing reliable and pluralistic digital news coverage by obliging online platforms to pay fairly for news publisher content should be a natural and highly desirable aspect of this legislation.

 

105.          We have therefore, with the NMA and the help of lawyers, drafted an amendment to the Online Safety Bill, which we believe would be capable of delivering a mandatory bargaining code considerably sooner than waiting for the Digital Competition Bill. The amendment is attached to this submission as Appendix 3.

 

Peter Wright

Editor Emeritus

DMG Media

September 2021

Appendix 1

Re: The Draft Online Safety Bill

 

DRAFT AMENDMENTS AS AT 16.9.21

 

  1. Amendment to expressly exclude news publisher websites from the scope of the Bill

Clause 2

At the start of subsection (1) insert

(1)   Subject to subsection (7) below, […]

At the start of subsection (5) insert

(5) Subject to subsection (7) below, […]

After subsection (6), add:

(7) Notwithstanding the provisions of this section, where an internet service is provided by a Recognised News Publisher it is not a “user-to-user service” or “search service” for the purposes of this Act.

(8) For the definition of “Recognised News Publisher” see section 40. 

 

[NOTE: The purpose of this amendment is to ensure that websites operated by recognised news publishers (as defined in Clause 40) do not constitute “user-to-user services” or “search services” for the purposes of the Bill, ensuring they are outside the scope of the proposed legislation.]

 

  2. Amendments to Clause 14 to protect news publisher content and provide a scheme of effective redress for breaches of the duty to protect.

Clause 14

Leave out subsection (2) and (3) and insert –

(2) A duty not to apply any of the online safety duties in this Act to news publisher content.

(3) A duty not to remove or restrict users’ access to news publisher content, or to materially alter the way in which news publisher content would ordinarily be viewed, on the basis that the content infringes a content moderation rule.

After subsection (4) insert –

(4A) Without prejudice to the duties in (2) or (3) above, where the creator of the journalistic content is a recognised news publisher, a duty to:

(a)   Notify the creator within 24 hours of any decision to take down or restrict access to the content and the reasons for that decision.

(b)   Consider any representations made by the creator in response and, in the event those representations are not accepted, notify the creator of the reasons for their rejection.

(c)    Promptly reinstate any content removed or restricted in breach of the duty in (2) or (3) above.

Leave out subsection (6) and insert:

(6) A duty to specify in terms of service by what methods content present on the service is to be identified as journalistic content including specifically by reference to OFCOM’s register of broadcast licence holders and any register of recognised news publishers maintained by:

(a)   News Media Association;

(b)   Professional Publishers’ Association;

(c)    Independent Community News Network; or 

(d)   Any other trade or industry body, established by two or more UK linked entities whose principal purpose is the publication of news, and which is identified for this purpose by OFCOM in a Code of Practice.

 

After subsection (7) add –

(7B) A Recognised News Publisher may make a complaint to OFCOM that a regulated service has failed to comply with any of the duties in this section.

(7C) The Secretary of State must make regulations containing provision about procedural matters relating to complaints under subsection (7B) which shall, in particular, include provision about the following matters –

(a)   Notification to OFCOM of an intention to make a complaint under subsection (7B);

(b)   The form and manner of such a complaint, including requirements for supporting evidence;

(c)    The steps that OFCOM must take in relation to such a complaint; and

(d)   Time limits for taking steps in relation to such a complaint.

After subsection (11) add

(11A) In this section “content moderation rule” means any term of service, rule, policy or other standard maintained by the provider or any third party acting on the provider’s behalf, which by itself imposes limits or restrictions on the uploading or sharing of content by reference to the substance or subject matter of the content itself.

[NOTE: the purpose of these amendments is to:

(a)   Ensure that exempt news publisher content does not, in practice, become subject to the online safety duties as a result of the practical application of those duties by Category 1 Services.

(b)   Protect the fundamental right of freedom of expression, including the public’s right to receive information, by preventing Category 1 services from applying content moderation or other standards-based regulation to news publisher content shared on those services.

(c)    Ensure that news publishers are notified where their content is removed or restricted and that the reason for this is given. This provides a mechanism by which news publishers can ascertain whether decisions are being made to restrict access to their content in breach of the duties in the amended provision and provision is made for representations in response to improper removal.

(d)   Provide a mechanism by which recognised news publishers may complain to OFCOM in cases where a regulated service has failed to comply with the duties to protect journalistic content. It is proposed that the procedure for handling such complaints is to be dealt with in secondary legislation (this is consistent with the approach to super-complaints under the Bill). In order to ensure this complaints mechanism is effective, a consequential amendment is needed to Clause 70 to ensure OFCOM can issue information notices in support of the exercise of its complaints handling function.]

Clause 70

In subsection (4)(f), after “(super-complaints)” add “or complaints under sections 14 or 18A”.

[NOTE: this amendment is consequential on the amendments to Clause 14 and 18A to provide for complaints to OFCOM. Its purpose is to ensure that OFCOM can exercise its enforcement powers when handling those complaints.]

  3. Ensuring equivalent protection for news publisher content in search results

In Clause 18 omit subsection (2)

After Clause 18 add Clause 18A:

18A   Duties of care: protection of news publisher content

(1)   The “duties to protect news publisher content” in relation to search services are the duties set out in this section.

(2)   A duty not to apply any of the online safety duties in this Act to news publisher content.

(3)   A duty not to remove or restrict users’ access to news publisher content, or to materially alter the way in which news publisher content would ordinarily be viewed, on the basis that the content infringes a content moderation rule (as defined in subsection (12)).

(4)   A duty, in relation to a decision by a provider to remove news publisher content from search results or otherwise to restrict access to it, to make a dedicated and expedited complaints procedure available to the creator of that content.

(5)   Without prejudice to the duties in (3) and (4) above, a duty to:

(a)   Notify the creator within 24 hours of any decision to take down or restrict access to news publisher content and the reasons for that decision, and

(b)   Consider any representations made by the creator in response and, in the event those representations are not accepted, notify the creator of the reasons for their rejection.

(6)   A duty to specify in terms of service by what method content appearing in search results is to be identified as news publisher content including specifically by reference to OFCOM’s register of broadcast licence holders and any register of recognised news publishers maintained by:

(a)   News Media Association;

(b)   Professional Publishers’ Association;

(c)    Independent Community News Network; or 

(d)   Any other trade or industry body, established by two or more UK linked entities whose principal purpose is the publication of news, and which is identified for this purpose by OFCOM in a Code of Practice.

(7)   A recognised news publisher may make a complaint to OFCOM that a regulated service has failed to comply with any of the duties in this section.

(8)   The Secretary of State must make regulations containing provision about procedural matters relating to complaints under subsection (7) which shall, in particular, include provision about the following matters –

(a)   Notification to OFCOM of an intention to make a complaint under subsection (7);

(b)   The form and manner of such a complaint, including requirements for supporting evidence;

(c)    The steps that OFCOM must take in relation to such a complaint;

(d)   Time limits for taking steps in relation to such a complaint

(9)   For the purposes of this section content is “news publisher content”, in relation to a search service, if the content is located on an internet service operated by a recognised news publisher.

(10)           In this section the reference to a person who is the “creator” of content is to the recognised news publisher in question.

(11)           For the meaning of “recognised news publisher” see section 40.

(12)           In this section “content moderation rule” means any term of service, rule, policy or other standard maintained by the provider or any third party acting on the provider’s behalf, which by itself would justify the removal or restriction of content from search results by reference to the substance or subject matter of the content itself.

[NOTE: the purpose of this amendment is to impose an equivalent duty to protect news publisher content as in amended Clause 14 (above) on providers of search services.]

  4. Other

In Clause 39 subsection (9) add after “recognised news publisher”

“or its employees or agents”

[NOTE: The purpose of this amendment is to bring content produced by journalists working for a recognised news publisher within the scope of “news publisher content” (and so outside the scope of “regulated content” when published on a regulated user-to-user service).]

 

 

Andrew Caldecott QC

Ben Gallop

 

16.9.21

Appendix 2

 

Online Safety Bill and the News Publisher Exemption:

Industry-Owned Kite Marking Scheme

 

Background

 

There is a growing need to exempt verifiable news sites from legislation or regulatory codes and to differentiate and protect genuine content from disinformation or other harmful content distributed online by search engines and social media platforms, in a way which does not undermine basic freedom of expression principles. Tied to this is a need for a kite marking scheme so that the platforms can algorithmically recognise content which is produced by legitimate news publishers and therefore exempt it from online safety requirements.

 

Understandably, there is deep-seated resistance from publishers and editors to any kite marking scheme being run by a third party which could end up becoming a powerful licensing agency, thereby effectively ruling out commercial operators and fact-checking sites, press regulators, the government, statutory regulators such as Ofcom, and the global tech platforms. However, support has been expressed, including from the government, Ofcom and the platforms, for an effective kite marking scheme to be established by the industry, for the industry.

 

The Online Safety Bill intends to exempt the online content of news publishers and it includes a workable definition of ‘Recognised News Publisher’ which covers news broadcasters and news publishers whose content is subject to editorial control, who are subject to a standards code and who have a complaints-handling process, amongst other criteria (see full definition wording below). The exemption explicitly does not cover members of proscribed organisations.

 

In practice, the platforms who are subject to the new laws will need to be able automatically and algorithmically to identify the domains of recognised news publishers in order to ensure they do not apply any moderation or take down measures to the content on those domains. However, the platforms will not want responsibility for keeping a list or register of recognised news publishers and neither will the government or Ofcom. This is where an industry-owned kite marking scheme could help them identify the main recognised news publishers in the UK.

 

Operation in practice: how an industry-owned scheme could work

The NMA’s proposal for a voluntary industry kite marking scheme has already been put forward to the DCMS and discussed with Ofcom. The NMA considers that such a kite marking scheme could be established and allow news media content from the main national, regional and local publishers, to be excluded from the new Online Safety regime and other legislation and codes aimed at the platforms, avoiding state intervention and without compromising press freedom.

We have discussed this in detail with DCMS policy teams and provided DCMS with papers outlining the scheme, its criteria and administration, and the simple technical means by which news media content carried by the platforms could be easily recognised.

Working with broadcasters and other relevant industry organisations as appropriate, the NMA could maintain an up-to-date list of the main established relevant news publishers across the UK. This would include both NMA members and non-members who meet the Online Safety Bill definition. It would not be a definitive list but would cover the main news providers.

 

Online safety legislation is likely to mean algorithms being adjusted to block large categories of content from search and news feeds. For the intended news publisher exemption to work, we would need to give the platforms a means of distinguishing between genuine news publishers and others.

 

One potential back-end solution, as opposed to front-end visualisation, would involve creating a register of publisher IDs from the NMA’s corresponding list of news site domains in a way which is not easily replicated – the equivalent of the holograms and issuer details on the press cards issued by industry bodies under the UK Press Card scheme.

The register would be accessible to all third-party online platforms, which would be able to scan it and create a filter in their user interfaces for all the domains listed in the register.

It would avoid platforms having to deploy keyword blocking or other measures against specific types of content, because any content originating from eligible news publisher sites could be automatically cross-referenced by the algorithms against the register of domains and would be exempt.
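The cross-referencing step described above can be sketched in outline. The following is an illustrative sketch only, not part of the NMA proposal: the domain entries, function names and moderation labels are all hypothetical, and it assumes the register is available to the platform as a simple set of domains.

```python
# Illustrative sketch: exempting content from registered news publisher
# domains before moderation is applied. All names here are hypothetical.
from urllib.parse import urlparse

# In practice a platform would periodically fetch the industry register;
# here it is a hard-coded example set.
NEWS_PUBLISHER_REGISTER = {
    "dailymail.co.uk",
    "metro.co.uk",
    "inews.co.uk",
}

def registered_domain(url: str) -> str:
    """Extract the host from a URL, dropping any 'www.' prefix."""
    host = urlparse(url).hostname or ""
    return host[4:] if host.startswith("www.") else host

def is_exempt(url: str) -> bool:
    """True if the content originates from a registered news publisher,
    so moderation and take-down measures should not be applied."""
    host = registered_domain(url)
    # Match the exact domain or any subdomain of a registered domain.
    return any(host == d or host.endswith("." + d)
               for d in NEWS_PUBLISHER_REGISTER)

def moderate(url: str) -> str:
    """Route a URL either straight through or into normal moderation."""
    if is_exempt(url):
        return "carry"           # recognised news publisher: no moderation
    return "apply_moderation"    # run the normal content-moderation pipeline
```

The point of the sketch is that the exemption becomes a single domain lookup against the register, rather than keyword blocking or per-article content analysis.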

 

Next steps:

 

NMA to draw up an initial list of domains which meet the definition and to share it with members, and with broadcasters and other industry organisations as appropriate, to ensure it covers the main recognised news publishers.

 

NMA to set up technical discussions with the platforms over how the exemption would work in practice. Google has already expressed an interest in exploring this with us.

 


Appendix 3

Proposed addition to the Bill

 

 

Amendments to Part 3 of the Online Safety Bill in relation to trusted news content distributed by digital platforms

 

Insert a new Chapter 1 (with current Chapter 1 becoming Chapter 2 and so on), starting at Clause 49

 

49.              The mandatory bargaining process

 

(1)    The regulator[15] must prepare a code of practice describing duties that digital platforms and recognised news publishers have for the purpose of protecting the provision of original news-related material in the United Kingdom by remedying the significant bargaining imbalance between digital platforms and recognised news publishers when agreeing terms for the distribution of news-related material by digital platforms.

 

(2)    In drafting its code of practice under sub-section (1), the regulator may include provisions relating to—

 

    1. bargaining between digital platforms and recognised news publishers in good faith in relation to news-related material made available by digital platforms, whether collectively or individually;
    2. compulsory final offer arbitration where parties cannot come to a negotiated agreement about remuneration relating to the making available of news-related material on digital platforms within three months after bargaining starts;
    3. a requirement that digital platforms provide recognised news publishers with a list and explanation of the data that their platform collects (whether or not it shares the data with the recognised news publisher) about the recognised news publisher’s users through their engagement with news-related material made available by the platform, this list and explanation to be updated and supplied annually;
    4. a requirement that, if requested by a recognised news publisher or the regulator, the digital platform supplies it with the data and information relevant to assessing the benefit that the platform receives from the news-related material of each recognised news publisher;
    5. a requirement for digital platforms to provide recognised news publishers with 28 days’ advance notification of planned changes to an algorithm or internal policy or practice that is likely to have a significant effect on either (i) the ranking of the recognised news publisher’s covered news-related material made available by the platform, or (ii) the display and presentation of advertising directly associated with that content;
    6. a requirement that this notification describes the changes to be made and their expected effect in comprehensible terms, and explains how the recognised news publisher can minimise any negative effects;
    7. non-differentiation requirements stipulating that digital platforms shall not differentiate between recognised news publishers because of matters that arise in relation to their participation or non-participation in the process;
    8. contracting out, so that a digital platform may reach a commercial bargain with a recognised news publisher outside the process about remuneration or other matters.

 

(3)    The regulator may include any other provisions which it believes are required or desirable to achieve the purpose of the code of practice or matters reasonably related to that purpose.

 

(4)    The regulator may exclude from the code of practice prepared for the purposes of sub-section (1) any digital platform or recognised news publisher if it is of insufficient importance or size, if it does not have links with the United Kingdom within the definition in section 3, or if there is another reasonable justification for doing so.

 

(5)    Before issuing its code of practice under sub-section (1), the regulator must consult digital platforms, recognised news publishers and persons who appear to the regulator to be affected by the code of practice.

 

(6)    Sections 32 to 35 shall apply to a code of practice prepared under sub-section (1).

 

(7)    The submission by the regulator of the proposed code of practice under section 32(1) must take place within four months of this Act coming into force.

 

(8)    Digital platforms and recognised news publishers must comply with the code of practice prepared under sub-section (1).

 

(9)    Where a code of practice under this section is in force, the regulator may—

 

    1. prepare amendments of the code of practice;
    2. prepare a code of practice under sub-section (1) as a replacement for a code of practice previously prepared under that sub-section;
    3. withdraw the code of practice.

 

(10) In this Part, “digital platform” means any of the following—

 

    1. a “user-to-user service” as defined in section 2;
    2. a “search service” as defined in section 2; or
    3. an entity with “strategic market status”.

 

(11) In this Part, an entity with “strategic market status” means an entity that has been found in whole or in part to have strategic market status by a body with statutory authority to make such a finding, or is or will be otherwise subject to regulation owing to the market power or strategic status it holds in the United Kingdom’s economy.

 

(12) In this Part, “recognised news publisher” has the definition given in section 40.

 

Consequential amendments

 

Section 70 (now 71) (power to require information): 

 

 

Section 82 (now 83) (requirements enforceable by the regulator against providers of regulated services):

 

 

Section 119 (now 120) (liability of parent entities for failures by subsidiary entities):

 

 

Section 121 (now 122) (liability of subsidiary entities for failures by parent or fellow subsidiary entities):

 

 

Section 127 (now 128) (Extra-territorial application):

 

 

 

27 September 2021

 


 


[1] https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response

[2] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/985033/Draft_Online_Safety_Bill_Bookmarked.pdf  s.46 (3) and (4).

[3] https://committees.parliament.uk/publications/6878/documents/72529/default/

 

[4] 

[5] https://www.gov.uk/government/consultations/online-harms-white-paper/outcome/online-harms-white-paper-full-government-response para 1.11

[6] https://www.cnbc.com/2020/08/04/some-major-companies-will-keep-pausing-facebook-ads-as-boycott-ends.html

[7] https://www.theguardian.com/technology/2021/may/27/facebook-lifts-ban-on-posts-claiming-covid-19-was-man-made

[8] https://www.wsj.com/articles/twitter-reinstates-new-york-post-account-11604096659?mod=searchresults_pos1&page=1

[9] https://www.nytimes.com/2021/09/03/technology/facebook-ai-race-primates.html

[10] https://pressgazette.co.uk/mail-online-biggest-uk-news-website-july-2021/

[11] https://pressgazette.co.uk/top-50-largest-news-websites-in-the-world-sputnik-drudge-and-fox-see-biggest-traffic-falls-in-february/

[12] https://www.sistrix.com/support/sistrix-visibility-index-explanation-background-and-calculation/

[13] The Guardian had no visibility pre-2013 because at that point it changed to its current domain name, theguardian.com. The Sun had very low visibility 2013-15 because it was behind a paywall.

[14] CMA Digital Advertising Market Study, Appendix S, p.10

[15] The regulator could be Ofcom if the mandatory bargaining code is included in the Online Safety Bill, or the Digital Markets Unit if it is included in the Digital Competition Bill.