Written evidence submitted by the Age Verification Providers Association

 

Julian Knight MP

Chair

DCMS Committee (Online Harms and Disinformation),

House of Commons,

London, SW1A 0AA

dcmscom@parliament.uk

3 September 2021

 

Dear Mr Knight,

Response to call for evidence: Online safety and online harms

Thank you for the opportunity to contribute to your call for evidence to inform the Committee’s investigation into online safety and online harms.

The Age Verification Providers Association is a global trade body representing over 20 of the main technology suppliers who have invested in the development of age assurance solutions to support the implementation of age restrictions online. The UK has led the way in developing age verification, innovative age estimation solutions, and international industry standards. Our members perform millions of accurate, privacy-preserving, independent and standards-based online age checks every year.

 

Summary of our submission

  • The Bill’s scope is too narrow because it was designed to address social media. Unless the scope is widened and a major exemption altered, pornographic websites will escape regulation entirely. To avoid this, Parliament should:
    • Add a third category of sites in scope to include all sites with content considered harmful to children (with or without user-to-user services)
    • Exclude sites in this new category from the “limited functionality services” exemption
  • The Secretary of State should publish a draft list of Primary Priority and Priority Content to allow for meaningful scrutiny of the Bill in context.
  • The Bill should regulate to promote an independent, privacy-protecting, standards-based, open, competitive and interoperable age verification sector as a foundation for a safer internet for children.
  • Parliament should add a 3 month time limit for laying before it the suite of codes of conduct and statutory guidance, to avoid the risk of a 2-3 year delay while these are perfected by the Secretary of State and Ofcom.
  • Enforcement powers should be amended to enable them to be applied at scale to 1.3 million adult websites without an individual application to the Court for each of them.

 

How has the shifting focus between ‘online harms’ and ‘online safety’ influenced the development of the new regime and draft Bill?

We sympathise with the government’s decision to limit the scope of the Bill, rather than trying to regulate the extremely wide variety of activities and associated risks presented by the internet all in one go. It is worth remembering that in the real world there are often multiple pieces of primary and secondary legislation aimed at preventing harm from a specific economic activity, e.g. the sale of alcohol or pensions. As we seek to make what is illegal offline also illegal online, much specific legislation will need to be updated to remain relevant in an online world. For example, it is long overdue that we allow for proof of age on a smartphone by removing the legal requirement for a physical hologram.

The draft Bill was conceived very specifically to address the risks presented by social media. This is evident from its narrowly drawn definition of scope, limiting its effect to user-to-user services, with search engines added. At the time it was initiated, ministers fully expected Part 3 of the Digital Economy Act 2017 to take care of the risks to children and the indirect impact, particularly on women and girls, of the ubiquitous availability of online pornography, and in turn its effect on relationships, body image and self-esteem.

With the government having decided in the summer of 2019, ahead of the general election, to abandon Part 3, this Bill also seeks to repeal that legislation. Ironically, the only mention of ‘pornography’ in the Bill, which the Secretary of State told the High Court would be an improved replacement for the DEA, is in the provision repealing Part 3.

The Secretary of State has perhaps recognised the difficulty he will have in persuading either House to repeal legislation it has waited four years to see implemented, using a Bill that does not explicitly address the same risks: he confirmed to your main Committee that he was open to extending the protection offered by the Bill to children.

Indeed, along with the Secretary of State for Education, he has given the Children’s Commissioner the task of finding a way to block access to online pornography before this Bill has any real effect, in 2024 at the earliest. As Dame Rachel de Souza has pointed out, the proposed law still does not mandate age verification for either the social media firms or the adult porn companies. We note that, speaking to the Telegraph on 29 August 2021, she set out three options.[1]

 

Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?

The current draft gives the Secretary of State discretion to designate Primary Priority Content and Priority Content, which he must then ask Parliament to confirm.  There is little indication as to the difference between these two lists, and no information as to what is likely to be included.  Priority content is referenced in relation to both children and adults.

It is extremely hard to assess the impact of the Bill without knowing what content will be deemed harmful and in scope. While secondary legislation allows for flexibility over time, scrutinising the Bill without the context of the proposed scope is too theoretical to be effective. We would propose that the Secretary of State be asked to produce a draft of each list of priority content before the Bill is presented to Parliament at Second Reading.

Quite apart from that gap in the picture, the Bill in any case leaves the inclusion of pornographic websites to chance, as they will only be indirectly in scope if they happen to allow for user-to-user services. Some adult sites currently do fall into scope on this basis, but it would be a very simple matter for them to disable user-to-user functionality for UK-based users and escape the duties created by the Bill. (We set out below how the “limited functionality services” exemption also creates a massive loophole.)

 

Does the draft Bill focus enough on the ways tech companies could be encouraged to consider safety and/or the risk of harm in platform design and the systems and processes that they put in place?

Given the wide variety of online content and functionality, the Bill’s principles-based approach seems pragmatic and appropriate. We do believe age assurance to be a foundation stone for the building of a safer online world, because without it no differentiation can be made between adult and child users in order to offer additional protections to the latter while preserving the freedoms of the former. It is therefore an exception which merits being addressed explicitly, either in this Bill or in its own legislation.
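
Purely by way of illustration, the sketch below (in Python) shows the design pattern that age assurance enables: a service defaults to its child-safe configuration and relaxes restrictions only once an age assurance provider has confirmed the user is an adult. Every name, field and setting in it is our own assumption for the purpose of the example, not anything prescribed by the draft Bill or by any member’s product.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class AgeAssuranceResult:
        """What a privacy-preserving check returns: an attribute, not an identity."""
        over_18: bool
        method: str  # e.g. "verification" or "estimation" (hypothetical labels)

    def settings_for(result: Optional[AgeAssuranceResult]) -> dict:
        """Default to the child-safe experience; relax restrictions only
        once adulthood has been assured."""
        if result is not None and result.over_18:
            return {"adult_content": True, "unsolicited_contact": True}
        # No check completed, or the user is under 18: apply protections.
        return {"adult_content": False, "unsolicited_contact": False}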

The AVPA has endorsed Baroness Kidron’s Age Assurance Standards Bill, introduced in the current session in their Lordships’ House; if that does not secure government support, its provisions to regulate an independent, privacy-protecting, standards-based age verification sector should be incorporated into this Bill. (As a sector we, somewhat unusually, would welcome further regulation, and are indeed self-regulating in the absence of any legal requirement to do so.)

The AVPA is also a leading member of the euCONSENT team, delivering EU-wide, interoperable, independent, privacy-preserving, standards-based age verification and parental consent infrastructure with funding from the European Commission. This will make it much easier for tech companies to implement proportionate levels of age assurance without an intrusive effect on the user experience.
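
Interoperability matters because a user who has already proved their age with one certified provider should not have to repeat the check on every site. The sketch below is our own simplified illustration of that idea, not the actual euCONSENT protocol; the provider names and token format are invented for the example.

    from typing import Optional

    # Hypothetical register of certified providers (illustrative only).
    CERTIFIED_PROVIDERS = {"provider-a.example", "provider-b.example"}

    def reusable_age_token(session: dict) -> Optional[dict]:
        """Return a previously issued age-assurance token, if one exists
        and was issued by a certified provider; otherwise None."""
        token = session.get("age_token")
        if token and token.get("issuer") in CERTIFIED_PROVIDERS:
            return token
        return None

    def require_age_assurance(session: dict) -> dict:
        """Send the user through a fresh age check only when no reusable
        token is present, keeping friction to a minimum."""
        token = reusable_age_token(session)
        if token is None:
            # Placeholder: in practice the user would be handed off to
            # their chosen certified provider to complete a check.
            token = {"issuer": "provider-a.example", "over_18": True}
            session["age_token"] = token
        return token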

 

What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?

When the Secretary of State addressed the main Select Committee after the publication of the text, he conceded he was open to extending the protections for children afforded by the Bill during pre-legislative scrutiny, provided this was congruent with the Bill’s approach.

We are proposing, with the agreement of children’s charities, church groups and campaigners, that the Bill be amended as follows to create a third category of services which would be within scope by virtue of containing content harmful to children, irrespective of whether or not they offer user-to-user services.

 

Scope of the Bill

To (i) user-to-user services and (ii) search engines, Parliament should add:

(iii) all services which include content designated in regulations made by the Secretary of State as primary priority content that is harmful to children or priority content that is harmful to children (whether or not the services allow for user-to-user functions).  

Services in scope under type (iii) will be subject to the same duties as (i) and (ii) already set out in the Bill, including the additional duties applicable if the service is likely to be accessed by children.

This would ensure that websites containing pornographic content, as well as sites dedicated to promoting anorexia, “incel” ideology and other topics the Secretary of State will define as priority content, are obliged to apply the new duties of care even where they carry no user-generated content.

 

Are there any contested inclusions, tensions or contradictions in the draft Bill that need to be more carefully considered before the final Bill is put to Parliament?

We would also like to suggest some further essential amendments to the Bill.

Avoid loopholes via exemptions

The “limited functionality services” exemption should not apply to the new type (iii) services we suggest adding to scope. This is because, as drafted, a pornographic site which produced all of its own content would be exempt. The same logic could apply to a site promoting anorexia that produced its own content.

Schedule 1, Paragraph 5 sets out that, under the limited functionality services exemption, a user-to-user service is exempt if the only functionalities enabling user-generated content on the service are the following:

a)      The posting of comments and reviews in relation to content that has been directly posted or uploaded by the service provider or website;

b)      The sharing of these comments and reviews on other internet services;

c)      Expressing views on such comments and reviews through: (i) a “like or dislike” button; (ii) applying an emoji or symbol of any kind; (iii) engaging in yes/no voting; or (iv) rating or scoring content.

While the explanatory notes state that “this exemption is expected primarily to exempt ‘below the line’ content on media articles and reviews of directly provided goods and services. Any services that meet the above requirements but have additional user-to-user functionalities will remain in regulatory scope,” as currently drafted the limited functionality services exemption would apply to almost all commercial pornographic websites. Only those which allow users to upload content themselves would remain in scope.


Provisionally identify harms for context

Services which include content designated in regulations made by the Secretary of State as primary priority content that is harmful to children will be subject to the same duties as Category 1 Services, including the additional adult risk assessment duties.

For these harms, which are deemed the most serious, it is proportionate to extend the duties placed on the service to include those designed to protect adults as well as children, which would otherwise apply only to the largest global platforms. This measure introduces a test of harm, rather than simply size, in applying these additional requirements.

 

Prevent a 2-3 year implementation delay

Within 3 months of Royal Assent, Ofcom must prepare codes of practice relating to the additional duties applicable if the service is likely to be accessed by children.

Our experience of the thorough Ofcom process for developing guidance suggests there could be a very long delay before the new law is implemented and fully enforced unless statutory deadlines are included in the primary legislation. Ofcom is expected to issue a Call for Evidence, then draft proposals for consultation, including Codes of Conduct in relation to each of the duties; these then need to be agreed by the Secretary of State and laid before Parliament, with delays possible for amendments or objections at any stage. Realistically, this could easily become a 2-3 year process before the duties apply, and typically enforcement will only be implemented in a staged fashion: from monitoring, to supervision, and eventually to regulatory action.

We are not criticising the rigour with which Ofcom apply themselves to their duties; rather, we are recognising the urgency of the problems this Bill is designed to address, and arguing that the regulator should not let the best be the enemy of the good, or perfection trump speed. Clearly the guidance, codes of conduct and other output from the regulator will need to evolve over time, not least to address changes in technology, but the legislation needs to ensure that a ‘good enough’ regime is put in place quickly. The risk otherwise is of the same degree of delay that beset Part 3 of the Digital Economy Act and still affects the Audiovisual Media Services Directive.

 

Be consistent when adding to scope

The Online Safety Objectives for (iii) shall be the same as for (i) user-to-user services. 

For consistency, objectives will need to be set out for the additional category of harmful sites which we propose to add to the scope.

 

Make enforcement at the scale required practical

Service restriction orders and access restriction orders must be applicable to unnamed services which meet stated criteria, so that Ofcom can apply to the court for a general order covering multiple services; enforcement requiring a separate order for each service does not scale sufficiently.

The current enforcement mechanism would require the regulator to apply to the court for an order against each of some 1.3 million pornographic websites, which is clearly not a practical proposition. While there is an argument for retaining judicial oversight of the process as a whole, this amendment is designed to allow for large-scale enforcement action. The risk otherwise is that regulators tackle only the largest sites, and traffic rapidly diverts to smaller sites which are out of sight of the regulator. Adult sites were generally willing to comply with the Digital Economy Act provided, as they were assured by the British Board of Film Classification, that there would be a level playing field and action would be taken swiftly against any site that refused to adopt age verification. Their greatest fear was the diversion of traffic to mid-market competitors, as users deflected by age checks moved to sites that were still not requiring them.

 

What are the lessons that the Government should learn when directly comparing the draft Bill to existing and proposed legislation around the world?

In terms of our field of age assurance, the UK remains the leading jurisdiction globally, in spite of the government’s refusal to implement Parliament’s will in respect of Part 3 of the DEA. This Bill could allow the UK to retain that leading role, creating substantial export opportunities for the UK safety tech sector, if it is amended appropriately. So we welcome it; but it is overdue, still subject to further delays, and unlikely to have any frontline impact before 2024 or 2025.

We remain of the view that the use of Part 3 as an interim measure, re-appointing the BBFC as the statutory regulator working closely with Ofcom, perhaps through secondments, offers the fastest route to addressing one of the highest-risk “harmful but legal” features of the internet: unfettered access by children of any age to pornography, often featuring violent, non-consensual or unrealistic content, which has a highly detrimental effect on a new generation of children with every year that passes without a change in the law. If the Committee requires evidence beyond its own instinctive knowledge that our children should not be exposed to this content a moment longer, the impact on male violence towards women and girls is documented in a recent report[2] eventually published by HM Government in response to the Women and Equalities Committee.

We are at the disposal of the Committee should you wish to hear from us as witnesses before your inquiry.

Yours sincerely,

 

Iain Corby

 

Executive Director

Age Verification Providers Association

 

 

About the AVPA

The AVPA was formed in 2018 from organisations involved in the UK’s Digital Policy Alliance age verification working group, in response to the need for a uniform voice for the industry.

The AVPA is governed by a representative Board drawn from its member organisations, and its members comply with a comprehensive code of conduct, requiring:

 


  1. Fairness and transparency
  2. Use of appropriate verification methods
  3. Privacy and Security
  4. Accuracy
  5. Independence
  6. Responsibility

 



About age assurance

 

The essence of age verification is proving your age without disclosing your full identity.

Millions of UK citizens have completed online age checks, using them repeatedly on a daily basis to buy age-restricted goods such as alcohol and cigarettes or to access adult services such as gambling websites.

Age assurance is the collective term for age verification and age estimation. Verification methods rely on evidence such as a passport, electoral registration or an adult mobile phone account; estimation methods use artificial intelligence to assess age within a statistically proven range, based usually on biometric features such as facial images or voiceprints, or on behaviours such as use of language or keyboards.
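
To make the distinction concrete, here is a minimal sketch of how the two approaches yield an age decision. The interface and threshold logic are our own illustrative assumptions; real deployments follow the certified schemes described below.

    from datetime import date

    def verified_over_18(date_of_birth: date, today: date) -> bool:
        """Verification: an exact answer derived from hard evidence,
        e.g. the date of birth on a passport. Only the yes/no result
        need be passed on, not the document or the user's identity."""
        age = today.year - date_of_birth.year - (
            (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
        )
        return age >= 18

    def estimated_over_18(estimated_age: float, margin_of_error: float) -> bool:
        """Estimation: an AI model returns an age within a statistically
        proven range; treat the user as over 18 only when the whole
        range clears the threshold."""
        return estimated_age - margin_of_error >= 18.0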

Age assurance is carried out to the BSI standard PAS 1296:2018. An independent audit and certification scheme demonstrates compliance with this standard, in addition to a UKAS- and ICO-approved scheme showing conformance with UK GDPR[3].

Age Verification Providers apply the most rigorous privacy-by-design methods when developing their architecture, and are required by UK GDPR to be transparent with users about the use or retention of their data, which must be minimised.

 

Learn more at www.avpassociation.com

 


[1] https://www.telegraph.co.uk/news/2021/08/29/tougher-age-checks-needed-stop-children-stumbling-across-porn/

[2] https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/976730/The_Relationship_between_Pornography_use_and_Harmful_Sexual_Attitudes_and_Behaviours-_literature_review_v1.pdf

[3] Both operated by the Age Check Certification Scheme