Supplementary written evidence submitted by the Age Verification Providers Association
Additional Evidence: DCMS Select Committee hearing – Online Safety
9 November 2021
Bill in general - what works well and what are the areas of concern?
- In five years’ time, we will look back on the days when there were no universal age restrictions online with incredulity, just as we would today if children were allowed to wander into nightclubs, casinos and strip clubs in person, where the level of potential harm is arguably far lower.
- The basic concept of a duty of care is sensible – it allows for a principle-based approach which can stand the test of time.
- The Bill should regulate to promote an independent, privacy-protecting, standards-based, open, competitive and interoperable age verification sector as a foundation for a safer internet for children.
Our main concerns are:
- Scope – there is a significant loophole for sites which simply drop user-generated content and interactions – and we were pleased to hear the new Secretary of State repeat the point made by her predecessor that she accepts the Bill does not go far enough on commercial pornography.
- Enforcement – Ofcom is used to dealing with a small number of broadcasters or telcos, but there are millions of pornographic websites, to pick just one category of harmful content. Enforcement mechanisms on a more industrial scale, which do not require separate court orders for each site, are therefore a critical success factor.
- Timetable – it has taken the best part of a year since it became law for Ofcom to publish guidance on the Audiovisual Media Services Directive (AVMSD), and that only applies to 18 UK-based video sharing platforms. We must ensure that the best does not become the enemy of the good when it comes to risk assessments, codes of conduct etc., and include on the face of the Bill a three-month time limit for these key documents to be submitted to Parliament by the Secretary of State, so we do not wait years for any real impact.
- We remain of the view that using Part 3 of the Digital Economy Act as an interim measure offers the fastest route to addressing one of the highest-risk “harmful but legal” features of the internet: unfettered access by children of any age to pornography, often featuring violent, non-consensual or in many cases unrealistic content. This is having a highly detrimental effect on a new generation of children with every year that passes without a change in the law.
- The impact of this on male violence towards women and girls was set out in a recent report, eventually published by HM Government in response to the Women and Equalities Committee, should the Committee require evidence beyond its own instinctive knowledge that our children should not be exposed to this content a moment longer.
- This is a point made forcefully by the Minister, Damian Hinds, who is no stranger to this field given his own contributions to the work of this Committee.
- We disagree with government assertions that the process would have to start afresh – if there were a will, there would be a way to see children protected from exposure to extreme and violent pornography within three months. Part 3 is already on the statute books, so commencing it would take just the flick of a ministerial pen.
- The day after our evidence session, the High Court gave permission, for a second time, for a judicial review of the government’s ongoing failure to keep commencement of Part 3 under review, acknowledging the gap between that legislation and the draft Bill and the lengthy delay between the two.
What is the spectrum of Age Assurance - what does it cover?
The essence of age assurance is proving your age online without having to disclose your identity.
- It ranges from softer age estimation techniques, which provide a good indication of the age range of a user, to harder age verification drawn from ID documents or databases, where the actual date of birth is obtained.
Is Age Assurance mandated in the Bill?
- If as a society we wish to offer a higher level of protection to children, and preserve the freedoms of adults, in the online world in the way we do in the real world, then a fundamental starting point is knowing which of your users are children.
- Wherever the Bill makes reference to children, it is therefore effectively mandating Age Assurance.
- But in fact, we have already reached a tipping point where Age Assurance is required.
- Since 2 September, the Age Appropriate Design Code has required any site that processes personal data and has potentially harmful content to apply age assurance.
- The AVMSD requires it for video sharing platforms, and this will affect more sites as EU states such as Cyprus and Ireland begin to enforce it as well.
- Gambling, cigarette and alcohol sales have ensured that millions of age checks are already completed.
- And advertising regulations add further pressure on platforms to know the age of their users. The government is introducing a total ban on digital advertising of high fat, salt and sugar (HFSS) foods because it was not convinced platforms could prevent children from seeing them.
What are the confidence thresholds?
- Generally legislation requires age assurance to be applied in proportion to the risk of harm, which we support.
- It would be easier if there were a common level of understanding about this. The BSI PAS 1296 standard already sets a framework for that, and we are working with the IEEE and ISO to create international standards with five simple levels of confidence, ranging from self-stated up to the most rigorous checks against official ID, accompanied by regular liveness checks to ensure the ID belongs to the user presenting it. These mirror the standards for identity set by the UK Government in GPG45.
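As an illustration only – the level names below are hypothetical, loosely mirroring the five-level idea described above rather than the actual PAS 1296 or GPG45 terminology – a proportionate-confidence check can be sketched as:

```python
from enum import IntEnum


class AssuranceLevel(IntEnum):
    """Illustrative ladder of age-assurance confidence.

    Names are hypothetical; the ordering (1 = weakest, 5 = strongest)
    is what matters for a proportionality test.
    """
    SELF_DECLARED = 1  # user simply states a date of birth
    BASIC = 2          # softer age estimation, e.g. facial analysis
    STANDARD = 3       # check against a trusted third-party data source
    ENHANCED = 4       # verification against an official ID document
    STRICT = 5         # official ID plus regular liveness checks


def meets_requirement(achieved: AssuranceLevel,
                      required: AssuranceLevel) -> bool:
    """A check passes if the achieved level is at least the required one."""
    return achieved >= required
```

A regulator's per-use-case minimum then reduces to a single comparison: a 'standard' check would satisfy a 'basic' requirement but not an 'enhanced' one.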
Who should set the confidence thresholds?
- Regulators should give clear guidance as to the minimum level of confidence they expect for each use case. This should take into account the costs of more rigorous checks, to ensure the regulatory burden is proportionate to the risk. So, in their risk assessments, Ofcom could indicate that pornography should be subject to a ‘standard’ level of check, while the Home Office might prescribe an ‘enhanced’ check for sales of knives.
What is your view of the Kidron Bill?
- The AVPA has endorsed this Bill.
- It closely reflects our existing code of conduct for AVPA members.
- It should either be incorporated into the Online Safety Bill if it does not get government time in its own right, or be beefed up to require the application of AV more quickly than the Online Safety Bill is ever likely to take effect, by adding commencement of Part 3 of the Digital Economy Act to it. That would allow the preparatory work already completed and approved by Parliament to be the basis of rapid enforcement.
How do you respond to the criticisms around Privacy?
- UK GDPR legislation is very clear that privacy-by-design and data minimisation are legal requirements; and commercially, age assurance solutions which create any risk of individuals being tracked across the internet are not going to be successful.
- Minister Philp was clear to the Joint Committee that data provided for age assurance may only be used for that purpose alone.
- AV was invented primarily in preparation for the DEA Part 3 requirement for AV for pornography, so right from the start privacy was at the top of the list of requirements. Systems were designed to ensure that neither the adult sites knew who was being age-checked, nor did the AV providers keep a record of which sites were making those checks.
- Our members are leading experts in this field and all recognise that the only non-hackable database is no database at all – so we consistently avoid creating new central databases of personal information for the purpose of AV.
- Sadly, we find the attitude of some campaigners to be deliberately obtuse, given they are often tech experts themselves, and certainly know it is perfectly possible to design the privacy risks out of systems.
- We do have sympathy for their concern that bad actors could set up fake AV systems to harvest data – which is why we support a certification scheme, and now support Baroness Kidron’s Bill. We believe it is essential that the Bill gives Ofcom the powers to approve audit and certification schemes, in the same way that the ICO already does under Article 42 of GDPR. We recognise we are occupying a position of trust, and favour a strong co-regulatory approach.
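The double-blind design described above – the site never learns the user’s identity, and the AV provider never logs which site asked – can be sketched minimally as follows. All class and function names here are hypothetical illustrations, not any provider’s actual API:

```python
import secrets
from typing import Optional


class AgeVerifier:
    """Hypothetical AV provider: it checks an age claim and issues an
    opaque token, storing no record of user identity or requesting site."""

    def __init__(self) -> None:
        self._valid_tokens: set = set()  # tokens only; nothing to link back

    def verify(self, age: int) -> Optional[str]:
        """Return an anonymous over-18 token, or None if the check fails."""
        if age >= 18:
            token = secrets.token_urlsafe(16)
            self._valid_tokens.add(token)
            return token
        return None

    def is_valid(self, token: str) -> bool:
        # Answers yes/no on the token alone, without learning or logging
        # which site is asking.
        return token in self._valid_tokens


class AdultSite:
    """Hypothetical site: it sees only the token, never who the user is."""

    def __init__(self, verifier: AgeVerifier) -> None:
        self.verifier = verifier

    def admit(self, token: Optional[str]) -> bool:
        return token is not None and self.verifier.is_valid(token)
```

The key design point is that there is no central record joining identity, age and browsing: the token is the only artefact shared between the two parties.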
How do you respond to the criticisms of Coadec and techUK that Age Assurance would be a barrier to start-ups?
- First, AV is inexpensive – pennies, not pounds, usually paid just once a year.
- Start-ups do not get exemptions to other safety-related legislation in any other sector, nor do the Financial Conduct Authority or Gambling Commission give them respite from Anti-Money-Laundering legislation.
- The European Commission is funding euCONSENT, a project to deliver interoperable age assurance and parental consent mechanisms across the EU and UK, which we are directly engaged in designing. When it launches next year, you will be able to prove your age once and then re-use that check many times over, often without any further interruption to your user experience (avoiding a “tyranny of the cookie pop-up” being created for age checks).
How does the new Bill compare to the Digital Economy Act?
- It is noticeable that “pornography” is used only once in the new Bill, and that is solely for the purpose of repealing Part 3.
- The government committed to the High Court that this Bill would be an improved replacement for Part 3 – so it needs to at least include all commercial pornographic websites within scope – with or without UGC.
- It is essential to copy across all the effective international enforcement mechanisms of the DEA, which we know got the attention of the leading global sites – so both payment and site-blocking powers are critical, and must be applied universally and rapidly to non-compliant sites so that those which do comply are not unfairly penalised.
- The new Bill is a good opportunity to put a certification scheme on a statutory basis as well, giving Ofcom equivalent powers to those the ICO has under Article 42 of UK GDPR to approve certification schemes.
What about the online porn loophole?
- While leading adult sites are not opposed to AV per se, they are commercially unwilling to adopt it unless they are confident the regulator will enforce it universally and simultaneously.
- So unless that is guaranteed, sites will look for any loophole, and as the Bill is drafted, all they need to do is remove user-generated content and interactions – which we predict will be their response.
- To show how fast they can change, we know the leading global porn site removed over two-thirds of its uploaded content within days of Mastercard withdrawing payment services.
- A third category of sites in scope should be added, covering all sites with Priority or Primary Priority Content likely to be accessed by children (the same rule as the Age Appropriate Design Code).
- The limited functionality services exemption also needs to be re-read from the point of view of a porn site trying to evade regulation, and then re-drafted!
- This would ensure that not only pornographic websites, but also sites dedicated to promoting anorexia, “incel” ideology and similar themes – which might not carry any user-generated content but cover topics the Secretary of State will define as priority content – are still obliged to apply the new duties of care.
What are the technical challenges to AA and AV - what are the limits?
- It is a fast-moving field – just seven years ago, the US National Institute of Standards and Technology (NIST) reported a mean absolute error in age estimation of around 5 years; it is now under 2.
- It is challenging to do AV at ages below 18 because children do not hold driving licences or have credit records, but we are working with DCMS to open up access to other data sources which would facilitate AV for minors, such as DfE data. HM Passport Office already offers one-way blind checks for children who have a passport. And 65% of children have bank accounts, which is another good source, gradually being accessed through Open Banking.
- There is no reason today, however, not to double-check that users recorded as turning 18 actually are now adults – so that all those who, aged 10, opened a social media account by claiming to be 13 are not assumed to have celebrated their 18th birthday when they are in fact still just 15. This could be done tomorrow, creating a subset of users on all social media platforms who are verified adults.
- AV has to be part of a layered approach, working with parents, ISPs, parental controls and child protection software, as well as digital media literacy to build resilience. Unless we are willing to adopt the highest levels of assurance, with liveness checks from selfies required every time, shared devices left unlocked after a parent logs in will always be an issue.
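The birthday arithmetic behind the point above, about re-checking users when their own sign-up claim implies they have turned 18, is simple to sketch (field and function names are hypothetical):

```python
from datetime import date


def claimed_18th_birthday(signup_date: date,
                          claimed_age_at_signup: int) -> date:
    """Earliest date a platform's own records imply the user turns 18,
    based on the age the user claimed when the account was opened.

    Naive year arithmetic; a 29 February sign-up date would need
    special handling in a real system.
    """
    years_until_18 = 18 - claimed_age_at_signup
    return date(signup_date.year + years_until_18,
                signup_date.month, signup_date.day)


# A user who signed up in 2016 claiming to be 13 appears to turn 18 in
# 2021; if they were actually 10 at sign-up, they are still only 15 --
# hence the case for a real age check at that recorded birthday.
```

Running such a query over existing account records is all that would be needed to identify the cohort due an age check on any given day.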
What do you think about business disruption measures?
- We agree the powers need to cover a wide range of services – app stores, search engines, DNS resolvers, payment services, advertising platforms – rather than putting all the burden on ISPs.
- As we state above, separate court orders per site are insufficient when dealing with millions of porn sites.
Delegation of powers to ministers
It would be informative to have a draft of the Priority and Primary Priority Content schedules to consider alongside the Bill. Otherwise, the lack of context makes testing the clauses of the Bill very theoretical.