Written evidence submitted by Yoti
Yoti Consultation Response DCMS Sub-Committee Call for Evidence
Online Safety and Online Harms
September 2021
Summary
● This response is made on behalf of an organisation, Yoti.
● Yoti owns and operates a free digital identity app and wider online identity platform that allows organisations to verify who people are, online and in person. Verification can take place through the Yoti app, which allows individuals to share verified information about themselves on a granular basis; through Yoti’s ‘embedded’ services, which allow organisations to add a white-label identity verification flow to their website or app; or through Yoti’s authentication algorithms, such as facial recognition, age estimation, voice recognition or lip reading.
● Yoti has a team of around 300 based in London, with offices in Bangalore, Los Angeles, Melbourne and Vancouver. There have been over 10 million installs of the Yoti app globally since its launch in November 2017. In addition, over 550 million checks have been conducted using the Yoti age estimation algorithm since February 2019.
● Yoti is certified to ISO 27001 and is audited annually. In 2019, Yoti was also certified to SOC 2 Type 2, an internationally recognised security standard, for its technical and organisational security controls by a top-four auditing firm. Yoti also holds the Age Verification Certificate of Compliance issued by the BBFC, and is certified to the publicly available specification PAS 1296 Age Checking.
● If there are any questions raised by this response, or additional information that would be of assistance, please do not hesitate to contact Yoti at:
Julie Dawson
Director of Regulatory & Policy
Florian Chevoppe-Verdier
Policy & Legal Associate
Valentina Dotto
Policy and Trust Framework Researcher
● Yoti is happy for this response to be published.
Yoti Overview
Introduction
Yoti is a global identity, verification and biometric technology company. We provide identity and verification solutions to organisations (businesses, governments, charities) and to individuals. Yoti is also the name of our flagship consumer app: the Yoti app.
The Yoti identity verification platform allows organisations to verify who people are, online and in person. There have been around 10 million installs of the Yoti app since its launch in November 2017.
We have five core Yoti solutions:
1) Identity verification;
2) Age verification;
3) E-signatures;
4) Authentication; and
5) Digital ID via the Yoti app.
The CEO and Co-Founder of Yoti is Robin Tombs.
Our consumer-focused products
The Yoti app helps consumers prove who they are and confirm the identities of others. We distinguish ourselves with our approach to privacy and security: Yoti’s system has been architected so that it is impossible for us to monetise users’ personal data. Set-up involves a four-minute process in which you link your facial biometrics to your phone and validate them against your driving licence or passport. Identities are verified using NIST-approved facial recognition technology, government-issued identity documents and, where possible, biometric passport chips.
Once you have completed set-up, your Yoti wallet securely holds the verified attributes of your identity, such as date of birth, gender and nationality. You can then use the app to scan QR codes to pass specific attributes to other people, organisations or websites.
Our platform solution allows consumers to create a reusable digital identity profile that they can use across any mobile and web services that have integrated Yoti into their systems. Yoti enables businesses to verify consumers’ identities using biometrics and government-issued IDs. Yoti also offers an embedded Yoti Doc Scan service within an organisation’s app and website flows, allowing consumers to present their ID document, which is digitised and checked for authenticity.
Our business-focused products
Yoti’s business model is very transparent. It is free for consumers and for eligible non-profit organisations.
Businesses can use Yoti age verification when selling age-restricted goods or services. Yoti’s multi-factor authentication keeps websites and personal information secure and GDPR-compliant.
Our age estimation technology securely estimates a person’s age by looking at their face; we have performed over 550 million age estimates since launching the technology in 2019.
Yoti has also launched Yoti Sign, which offers the convenience and simplicity of e-signing platforms, but with the added security of biometric verification.
Yoti is working with a range of commercial companies (NCR, national retailers, online dating, social media, airports and e-commerce players). We are the strategic partner of the Post Office for digital identity, the eID provider for the States of Jersey and the Improvement Service Scotland, and are working with a number of national charities such as the NSPCC. We are supporting a range of cryptocurrencies to onboard customers and are a part of the
Yoti’s submission
I - How has the shifting focus between ‘online harms’ and ‘online safety’ influenced the development of the new regime and draft Bill?
1. We welcome the Government’s recent decision to expand the scope of the Bill to include online scams and fraud. Yoti is very active in this field, and its various age verification and Digital ID solutions specifically seek to empower citizens to own and protect their digital identity. We allow users to prove their age and identity when engaging in social and commercial transactions in a wide range of settings, including financial services, logging into public sector services, and dating and e-commerce platforms.
2. We also welcome the Bill’s focus on mandating platforms and firms to produce clearer terms and conditions, and on encouraging stricter identity and/or age verification procedures at the onboarding stage. This is now the preferred solution for many social media platforms, such as Yubo, which currently uses Yoti’s age verification technology to protect minors from harmful content and from interactions with adults. Our identity verification and e-signature services are supporting platforms such as Mindgeek to meet their regulatory requirements to ascertain the identity of content uploaders and gain permission from co-actors.
II - Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?
3. We would advise that the Government liaise with relevant civil society bodies experienced in recognising, preventing and dealing with the aftermath of harms.
III - Does the draft Bill focus enough on the ways tech companies could be encouraged to consider safety and/or the risk of harm in platform design and the systems and processes that they put in place?
4. One of the elephants in the room has been the ease with which age gating and parental consent mechanisms can be circumvented, and the prevalence of tokenistic, weak age-gating approaches such as tick boxes, self-assertion of age, or reliance on second-hand, historic or knowledge-based checks, which can be shared or traded.
5. If the spirit of the Age Appropriate Design Code is followed, then both regulators and platforms should be required to consider the best interests of the child, and review which approaches are deemed too weak to offer appropriate safeguards.
6. The same tools that tech companies offer to platforms can also be employed by regulators to audit the effectiveness of age-gating approaches and how easily they can be circumvented.
7. It is Yoti’s view that age assurance has the potential to substantially improve the safety of children, who are the Bill’s target demographic and whose online experience it aims to secure. The likelihood of children accessing illegal content or being exposed to harmful content will be greatly reduced once robust age verification technology is widely adopted.
8. Yoti would be delighted to demonstrate its groundbreaking age and identity solutions before the Sub-Committee, including through case studies, the Yoti World platform, which members of the public can use to test several of our technological solutions, and practical examples of our technology in use on a wide range of social media, live streaming and e-commerce platforms.
9. Yoti would be delighted to share with Members of the House its experience of participating in the euCONSENT project to create a pan-European age assurance and parental consent system, and in the development of the PAS 1296:2018 Age Checking standard and the subsequent ISO age checking standard, which is currently under development and for which we are part of the UK drafting committee.
10. We believe that including clearer definitions of appropriate technological solutions, based on international standards, to achieve the aims of the Bill will support relying parties in complying and will ultimately benefit users. This will particularly help smaller platforms with fewer staff. We would recommend that the Bill make reference to specific, recognised standards.
11. In the same way that tech-for-good hackathons and alliances such as the Technology Coalition have been successful in uniting competitors to develop solutions to tackle CSAM, it would be helpful if tech experts were convened for sessions to consider the various pseudonymous and anonymous verification technologies available and how they could be deployed to meet the stated policy aims of both inclusion and safeguarding.
IV - What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?
12. Yoti would welcome the inclusion, within the scope of the Online Safety regime, of websites that offer their users content designated as ‘priority illegal content’ where that content is not user-generated. This would extend the requirement for age assurance measures to these sites.
V - Are there any contested inclusions, tensions or contradictions in the draft Bill that need to be more carefully considered before the final Bill is put to Parliament?
13. We would encourage a review of the impact of like/dislike buttons, emojis and symbols, voting, and rating/scoring. We would question whether these should be added to the list of user-to-user functionalities that are being manipulated in ways that lead to harms against people with protected characteristics.