Written evidence from Match Group
Executive Summary
- Online dating has played a pivotal role in shaping today's internet-savvy society and the way personal relationships are formed (nowadays, one marriage in three begins on an app[1], while in the US, 40% of all relationships[2] and 65% of all LGBTQ+ couples[3] meet online).
- As a market leader in online dating services, Match Group strongly supports and believes it is right for government to scrutinise market players across the digital ecosystem. Match Group welcomes the opportunity to engage with the UK Government on online safety and consumer protection, and hopes to see the standards that Match Group adheres to applied across the sector by all market participants, creating a level playing field.
- This document includes: an overview of Match Group, its brands and the format of our business model; our commitment to a safety by design approach that encompasses proactive measures to promote user safety and prevent harm; our view on why the absence of an explicit definition for determining harm will better recognise the nuances across industries as well as the evolving nature of harms themselves; and our suggestions on potential key omissions from the Bill and any contested inclusions, tensions or contradictions. We conclude with lessons the UK Government could learn through international comparisons with proposed legislation around the world.
About Match Group
- Match Group is a leading provider of online dating services across the globe.
- We operate a portfolio of trusted brands, including Tinder, Match, PlentyOfFish, OkCupid, OurTime, Meetic, and Pairs, each designed to increase our users' likelihood of finding a romantic connection. Through this portfolio, we provide tailored products to meet the varying preferences of our users. We currently offer our dating products in 42 languages across more than 190 countries (Group official website: http://mtch.com), including in the United Kingdom, where some of our main brands are widely used.
- Our platforms are mainly closed networks enabling private, peer-to-peer communications between adults, helping them to establish a meaningful connection in real life. Our platforms are not social networks enabling one-to-many communication, nor do they rely on selling targeted advertising to make money: 98% of our revenues come from subscriptions paid by users.
- As a result, our business model generally focuses on facilitating in-person interactions that follow from 1-1 messaging. The aim of our business model is therefore to reduce online dependency: we want users to move away from online connections towards offline, in-person relationships. In contrast, other businesses are trying to move more of people's time online to support revenue streams such as the sale of advertisements and the harvesting of personal data.
- Our market-leading position and the responsibilities that come with it are not lost on Match Group. We continually review our safety protocols in line with best practice and ensure that online safety, including protecting consumers from online fraud and scams, is prioritised as much as the user experience.
- All Match Group platforms have built-in tools specifically designed to promote the responsible use of our platforms and tackle any potential illegal, illicit, or harmful behaviour. The policies that Match Group currently has in place to protect users online and increase online safety provide a solid foundation on which to tackle online harms within Match Group. Match Group believes these could be replicated across other areas of activity in the digital ecosystem.
- Match Group brands invest meaningful resources, both in terms of capital and human resources, with the aim of providing a safe user experience. Our customer care team represents 20% of our workforce. The focus on safety begins at registration and continues throughout our members’ user journey on our platforms. We will spend more than $100 million on product, technology and moderation efforts related to trust and safety in 2021 to prevent, monitor and remove inappropriate, illegal, or harmful content.
How has the shifting focus between ‘online harms’ and ‘online safety’ influenced the development of the new regime and draft Bill?
- The prevention of online harms, or the fulfilment of online safety, must be the responsibility of everyone online, whether in a personal or a professional capacity.
- At Match Group, a safety by design approach is integrated across all our platforms to ensure we can adopt a proactive approach to online safety rather than a reactive approach to online harms. Additionally, the prevention of harms, via Ofcom's codes of practice and processes, will be critical to safety by design.
- Fully recognising our role, Match Group has built in the principle of protecting our users from all types of online harm. The safety, security, and well-being of our users is something we take very seriously and continue to design into our products. While relatively few of the hundreds of millions of people who have used our dating services have been harmed by bad actors, we believe that any incident of misconduct or criminal behaviour is one too many.
- Match Group uses an array of proactive safety tools and processes to ensure the safety of our users. These include the automatic scanning of profiles upon creation for red-flag language and images, ongoing scans for fraudulent accounts or messaging activity, and manual reviews of suspicious profiles, activity, and user-generated reports. We would also welcome improved cooperation between platforms and the police.
- Our Terms and Conditions prohibit harmful behaviour, and we take significant steps to prevent the following groups of people from using our products: anyone younger than 18 years of age; anyone who has been convicted of, or pleaded no contest to, a felony or a violent or sexual crime; registered sex offenders; and anyone suspected of sex trafficking.
- Safety and transparency are among our key priorities, and we are already working hard to balance that effectively alongside user privacy. Using our role as a market leader, we are employing that expertise to ensure that users feel both safe and confident that their privacy is being protected online. We welcome the development of the Online Safety Bill and the onus it puts onto platforms across different markets to do their part to prevent online harms and ensure the online safety of all our users.
Is it necessary to have an explicit definition and process for determining harm to children and adults in the Online Safety Bill, and what should it be?
- Having an explicit definition and process for determining harm would not reflect variation across platforms, brands, and industries, and the constantly changing nature of harms. Different industries are likely to see different issues, and different harms may result from the same behaviour. For example, a 16-year-old interacting with a 40-year-old in the comments section of a news source is very different from a 40-year-old trying to talk to a 16-year-old on social media or a dating app.
- Our moderation efforts are as rigorous for a 16-year-old as they are for a 40-year-old. We moderate everyone 18+ the same way, but we are also looking for underage users to ensure our platforms are moderated in as much detail as possible. By preventing underage access to our platforms, we can prevent instances of grooming and related harm to children.
- Match Group delivers this aim through the combination of technology and human resources, working diligently to keep underage users off our platforms. In addition to using sophisticated artificial intelligence, our brands collect birthdates, phone numbers, pictures, bios, and other inputs used for age verification, as well as check profiles for red flags to keep underage users off our platforms. Human moderators also review accounts that have been flagged either by automated systems or by user reports, and act on those reports accordingly.
- Similarly, we prioritise the safety and wellbeing of our legitimate users, and that means believing user reports and taking swift action to protect our communities. At the same time, users whose online content on their public profile has been removed, or whose profile has been banned following a complaint stemming from a 1-1 conversation on a platform, can request a review of that decision from our customer service teams.
- We are very willing to engage with users who seek a review of a decision not to accept them on our platforms because they appeared to be underage; in this instance, we act to protect their own safety and we tell them so. However, we believe that an appeals process for a user who has been reported as potentially posing a safety risk to others needs to be very carefully designed. Indeed, an appeals requirement transposed to online dating, where people are supposed to interact first online and then potentially meet a "stranger" in real life, could put users at risk. For example, telling an abuser that they were reported by the abused party for such abuse, especially when the abuser may be able to interact with the abused party in real life, could result in an even greater risk to the reporting user.
Does the draft Bill focus enough on the ways tech companies could be encouraged to consider safety and/or the risk of harm in platform design and the systems and processes that they put in place?
- Match Group brands invest meaningful resources, in terms of both capital and human resources, with the aim of providing a safe customer experience and have developed multi-layered systems against illegal content.
- Our market-leading position and the responsibilities that come with it have led to the development of these various internal mechanisms to safeguard consumer safety on our platforms.
- The draft Online Safety Bill goes a long way towards considering safety and the risk of harm online more generally; however, more focus could be placed on the ways in which tech companies use their platform design, systems and processes to protect their users. Currently it states that they have a duty to operate such systems and processes but does not explicitly state what these would look like on the various platforms.
- We welcome additional guidance in the form of Codes of Practice for companies to adopt greater consideration of safety within platform design. However, we believe that overly prescriptive processes, if published in the public domain, could undermine safety efforts by giving bad actors the intelligence needed to circumvent protocols. As it stands, the Online Safety Bill is well positioned to maintain safety protocols without eroding the privacy protections that reporting users need.
What are the key omissions to the draft Bill, such as a general safety duty or powers to deal with urgent security threats, and (how) could they be practically included without compromising rights such as freedom of expression?
- Match Group has a longstanding commitment to tackling romance fraud and online harms, and is committed to ensuring it has measures in place to protect its user base from online harms. However, fraud is largely absent from the draft Bill text, though it is due to be covered within the final legislation.
- Given the recent rise in online and romance fraud and the amount of harm it is causing for the victims involved, as well as the persistent threat of fraudulent adverts which are not covered, a greater focus on this from actors across the digital and financial ecosystems is necessary.
- Match Group therefore welcomes the intention to cover scams and fraud in the final Online Safety Bill, but believes it is necessary for actors across the ecosystem, from the different types of platforms to app stores, marketing platforms and financial institutions, to work together to tackle online scams and fraud.
- For example, among platforms there is a substantial difference between those that take a commission on transactions and those on which no money changes hands. This stark difference can influence behaviours and equally requires a different approach.
- As a result, the complexity of online scams and frauds, and the varied range of stakeholders concerned, necessitates a joined-up approach combining data and position within the ecosystem to find and cut off scammers and fraudsters wherever they are identified.
- However, this challenge is a substantial one, involving many sectors, stakeholders, and other existing pieces of legislation. The current approach, which does not yet include fraud within the draft Bill, is unlikely to provide the outcomes that the Plan for Digital Regulation would encourage.
- Elsewhere, Match Group believes that for the system in which Ofcom is the regulator to work effectively, Ofcom must engage effectively with those entities who are covered by the Online Safety Bill. Understanding the range of tools and measures in place, the variety across the digital ecosystem, and the differing range of remedies required will benefit Ofcom’s future programme of work and fall into line with previous precedents such as the GDPR and privacy by design.
Are there any contested inclusions, tensions or contradictions in the draft Bill that need to be more carefully considered before the final Bill is put to Parliament?
- We support the principle of enhancing consumer rights as regards explaining take-down decisions and the corresponding right to appeal. However, we believe that this cannot result in putting other users at risk: it should not teach bad actors how to circumvent our tools, nor should it allow bad actors to find out about those who report them, since this could put innocent users at risk of greater harm.
- As such, Match Group believes that providing detailed reasons why a user profile has been taken down is problematic.
- In respect of online dating, explaining a decision to a bad actor could equate to telling an abuser that they were reported by the abused party for specific abuse. Bad actors may take advantage of this insight and use it to circumvent safety policies in future or further abuse their victims. In fact, our experience shows that where bad actors know why they are banned, they use that information to better escape future bans while still utilising the platform in harmful ways.
- The tech industry is large and varied; the different types of organisations involved, and the varying regulations they are subject to, mean that a one-size-fits-all approach will often fail to achieve its aims.
- Currently, the categorisation of companies is unclear. We believe that the categorisation of companies under the Bill is most effectively determined by a company's business model. As it stands, the Bill is vague as to how categorisation would be applied and would appear to group together companies with vastly different business models and, importantly, corporate aims. Fundamental differences include:
- The online dating platforms and services that Match Group operates are largely closed networks, where users interact on a peer-to-peer 1:1 basis, in contrast to traditional social media platforms such as Facebook or Twitter.
- It is important to stress that Match Group's business model does not rely on data monetisation or targeted advertising, but on subscriptions paid for directly by users. We only collect and process the data necessary to provide the best user experience. Since we do not sell user data, collecting unnecessary or unrelated data is not in our business interest and does not add value to the user experience. Match has made a global, company-wide commitment not to sell or share our users' data with third parties for commercial purposes, and we proudly stand behind that decision.
- Furthermore, the aim of our business model is to reduce online dependency, meaning we want users to move away from online connections towards offline, in-person relationships. In contrast, other businesses are trying to move more of people's time online to support revenue streams like advertisements and data. This fundamental difference necessitates a more nuanced approach, and greater clarity now that reflects it could help deliver more effectively targeted legislation.
What are the lessons that the Government should learn when directly comparing the draft Bill to existing and proposed legislation around the world?
- The draft Online Safety Bill provides the outline of an effective piece of legislation that balances safety concerns with essential rights to privacy. However, some areas of the Bill could be developed to better reflect best-practice examples elsewhere.
- The EU’s Digital Services Act imposes different sets of obligations for distinct categories of online intermediaries according to their role, size, and impact in the online ecosystem. The categories are as follows:
- Intermediary services; provided by network infrastructure providers, including ‘mere conduit services’ (e.g. internet access) and ‘caching services’ (e.g. automatic, intermediate, and temporary storage of information)
- Hosting services; provided by providers storing and disseminating information to the public, such as cloud and webhosting services
- Online platform services; provided by providers bringing together sellers and consumers, such as online marketplaces, app stores, collaborative economy platforms and social media platforms
- Very large online platforms (or VLOP) services; provided by platforms that have a particular impact on the economy and society and pose risks in the dissemination of illegal content and societal harms. Specific rules are set out for platforms that reach more than 45 million active recipients in the EU monthly.
- The UK Government could learn from this example of categorisation and apply something similar to the Online Safety Bill, providing more clarity and recognition of the enormity and variety within the digital ecosystem. Not only do different companies and organisations have different roles and functions, but their aims can vary widely, necessitating different approaches to ensure remedies achieve their stated aims.
- Another concern for Match Group is the uncertainty created by the onus placed on Ofcom. While Ofcom's role as an effective regulator is not in doubt, creating codes of practice is a sizeable body of work, as is the responsibility placed on companies to comply with those practices.
- As with categorisation, the EU's Digital Services Act provides more clarity on this topic. It states that Member States will have to designate independent digital services coordinators who will be granted specific oversight powers, will be entitled to receive complaints against providers of intermediary services, will have to cooperate with the digital services coordinators of other Member States and will be able to take part in joint investigations. A European Board for Digital Services is also to be established to ensure effective coordination and consistent application of the new legislation.