Tackling Online Abuse: Written evidence submitted by UK Interactive Entertainment on 12/08/2020 (TOA0014)

 

About Ukie

 

  1. Ukie is the trade body for the UK’s games and interactive entertainment industry. A not-for-profit, it represents more than 480 games businesses of all sizes from start-ups to multinational developers, publishers and service companies, working across online, mobile, console, PC, esports, virtual reality and augmented reality. We welcome the opportunity to respond to the Committee’s inquiry.

 

About the UK games industry

 

  1. The UK video games industry is a significant economic contributor to the UK, supporting 47,620 FTEs and £2.87 billion in gross value added. The games industry is 35% more productive than the UK industrial average and is spread across the country: 55% of games development jobs are based outside of London and the South East. Games businesses are natural exporters, too: over 90% of the UK video games industry exports products and services.
     
  2. Outside of the sector’s economic impact, the cultural and creative contribution it makes should not be undervalued. Existing at the crossroads of technology and creativity, video games represent the entertainment form of the future. They are already a mainstream hobby and appeal to a broad and diverse audience: an estimated 37 million people of all ages in the UK enjoy them[1], playing mobile, console and PC games both online and offline. Recent data from Ofcom indicates that nearly half of those aged 16-24 play games online, and among children aged 12-15 this figure jumps to 72%. The number of people who enjoy playing games online also appears to be increasing across most age groups[2]. This is unsurprising given the growing significance of digital connectivity and socialisation across society, with games at the forefront.
     

Games and player safety

 

  1. Our industry has always recognised the safety of our player community as our paramount responsibility. Now more than ever, protecting our players is crucial as people turn to interactive entertainment to maintain social connections, stay entertained, and look after their mental wellbeing during this period of social isolation. With over 2 billion players worldwide, it is crucial for our industry to create a safe environment and to provide information and tools that allow parents, carers and players to enjoy a safe, fun, fair and inclusive playing experience.

 

  2. Our industry has a strong track record of keeping players safe. This is the right thing to do, and it is also a commercial imperative. Healthy competition exists between games businesses in retaining players, as games do not tend to benefit from or rely on strong network effects. Players are free to keep playing the games and communities they enjoy most; equally, it is not difficult or disruptive for a player to leave one game or community for another. If developers and publishers do not take active steps to ensure their games and communities are fun, safe and engaging, then players will simply leave.

 

  3. This has led to a long history of proactive steps taken by games companies to protect their player communities. A range of strong and consistent measures has been put in place by developers and publishers, making games communities some of the safest and most sophisticated online environments around. Some examples of safety features that are currently deployed in games and at platform level include:

 

 

  4. Below we provide case studies from four prominent games businesses on how they work to ensure online safety for the communities and players of their games. We believe the tools and techniques used by our industry can make a serious contribution to best practice for others in protecting online communities.


Case Study: Jagex

Top line
We work very closely with the Internet Watch Foundation, which says: "Jagex is a leader across its sector in online safety initiatives. Their commitment to the safety of their gaming community is clear." (Full quote below.)

Summary
RuneScape has a minimum player age of 13. We have a robust set of player rules and sanctions in place and any player can report any behaviour of any other player directly. In addition, we have over 1,500 players who act as in-game moderators.

We screen all in-game chat between players 24/7 against a series of trigger words, phrases and potential actions in a proprietary system, developed with the assistance of the Internet Watch Foundation and informed by our own empirical learning from over 16 years of running online communities.

Detail
Millions of people play RuneScape games every month. All player chat is monitored: over the course of a year, that's around 4.8 billion lines of chat. Our proprietary monitoring technology, called Player Watch, is considered best-in-class; it checks all player interactions and can spot unusual behaviour and identify inappropriate or concerning key words and phrases.
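
Player Watch itself is proprietary and its internals are not public, but as a purely illustrative sketch, a keyword-based screening step of this kind could look something like the following; the phrase list, scores and thresholds are hypothetical placeholders, not Jagex's actual rules.

# Illustrative sketch only: flag chat lines that match watched phrases so a
# human moderator can review them. The phrases and severity scores below are
# hypothetical placeholders, not Jagex's real trigger list.
import re

WATCHED_PHRASES = {
    r"\bmeet\s+me\b": 3,                      # possible off-platform contact
    r"\bwhat('?s| is) your address\b": 5,     # personal information request
    r"\bkill myself\b": 5,                    # potential risk-to-life indicator
}

def screen_line(player_id, line):
    """Return a review ticket if the line matches any watched phrase, else None."""
    scores = [score for pattern, score in WATCHED_PHRASES.items()
              if re.search(pattern, line, re.IGNORECASE)]
    if not scores:
        return None
    return {"player": player_id,
            "line": line,
            "priority": "urgent" if max(scores) >= 5 else "routine"}

print(screen_line("player42", "what is your address?"))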

In addition to a customer service team that operates 24/7, 365 days a year, we have a volunteer group of 2,500 players who act as live moderators; each moderator has a mature outlook and approach and a track record of accurate reporting.

We set a permanent chat filter which redacts the most obvious and abusive hate terms. In addition to this, players can personally set filters to automatically remove highly offensive language, allowing each player to customise the chat they see in line with their personal tolerances.
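
As a rough illustration of how a permanent redaction list can sit alongside a per-player filter setting, the sketch below uses hypothetical placeholder word lists; it is not RuneScape's actual filter.

# Illustrative sketch: a mandatory redaction list applied to everyone, plus a
# stricter optional list each player can switch on to match their own
# tolerance. Both word lists are hypothetical placeholders.
ALWAYS_REDACTED = {"hatefulterm"}        # always removed, for every player
OPT_IN_FILTERED = {"offensiveword"}      # removed only if the player opts in

def filter_chat(message, strict_filter_on):
    banned = ALWAYS_REDACTED | (OPT_IN_FILTERED if strict_filter_on else set())
    return " ".join("***" if word.lower() in banned else word
                    for word in message.split())

print(filter_chat("that was an offensiveword move", strict_filter_on=True))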

We moderate all chat through a combination of reports from players, our Player Watch system, our moderators' escalation system, and ongoing monitoring and manual review.

The following behaviour types can be dealt with by delivering in-game pop-up warnings to players, or short ‘time out’ mutes from our volunteers. If this type of behaviour persists, we automatically issue a warning, and if that warning is ignored, we permanently mute the player: 


We use IWF key words and our own historical knowledge to scan any incoming reports for content likely to relate to a ‘risk to life’. These reports are then prioritised for rapid manual review, with 24/7 cover including external escalation for law enforcement intervention if required: 
 

We log and review all chat (not just reported chat) to identify high-risk behaviour relating to child abuse, or to identify players who may be under 13. This approach also ensures that if a young person is being blackmailed or feels too afraid to submit a report to us, we will still see the chat logs and can escalate if required. We escalate any issue where we feel a genuine risk is present in relation to: 

 



As well as escalation to law enforcement, actions taken internally against any player can include: locking their account, blocking their IP for current or future access, limiting their account, muting their account, or a complete ban on playing.

The external agencies we primarily work with when we identify high-risk behaviour are:
 

Thankfully, just 0.16% of "offences" are related to online safety.

“Jagex is a leader across its sector in online safety initiatives. Their commitment to the safety of their gaming community is clear. They implement such a vast portfolio of safety features, we have asked Jagex to talk to others within IWF membership to share their knowledge in this area. Jagex has been a valued partner of the IWF since 2008.” - Emma Hardy, Director of External Relations, Internet Watch Foundation.

In addition, we have extensive anti-cheating measures, also within Player Watch, as well as hijack detection systems that continually identify compromised accounts and ensure that the account owners can recover them. All online communities experience problems with account takeovers and hijacking, and we know this can also be a cause of harassment and bullying.

 

 

Case Study: Roblox

 

Safety and Tools

Our priority is to make sure our users are safe and having positive experiences on the platform. Roblox was designed for kids and teens from the beginning, and we have a responsibility to make sure they can explore their creativity freely and safely. This is most important to us and will never change: safety is not a retrofit, it’s been in our DNA since day one.

We have a stringent safety system, which we believe is one of the most rigorous of any gaming platform, going well beyond regulatory requirements.

We have a team of over 1,200 members dedicated to protecting our users. This is a global organization and includes an internal team at our headquarters in California, as well as numerous teams around the world to ensure round-the-clock coverage and multilingual capability.

Additionally, we are constantly reviewing and improving our technology. We recognize there will always be individuals who are deliberate and determined in trying to break our rules, as they do on other platforms, and we make continuous efforts to render those attempts ineffective.

Specifically, we review every single image, audio file, and video before it is uploaded, using a combination of human moderation and automated machine learning technology. We employ PhotoDNA across the site.
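
The exact pipeline is not described in this submission, but a pre-upload review step combining a known-content hash check (standing in here for a PhotoDNA-style match, whose real API is not shown) with automated triage and human review could be sketched roughly as follows; the names and the classifier stub are hypothetical.

# Illustrative sketch of a pre-upload review step: compare the file against a
# known-bad hash list, run an automated classifier, and hold anything doubtful
# for human review. The hash list and classifier are hypothetical stubs.
import hashlib

KNOWN_BAD_HASHES = set()  # placeholder for hashes of known abusive material

def automated_classifier_flags(data):
    return False  # stand-in for a machine learning model scoring the content

def review_upload(data):
    if hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES:
        return "blocked"                # exact match against known material
    if automated_classifier_flags(data):
        return "held for human review"  # automated triage flagged it
    return "approved"

print(review_upload(b"example image bytes"))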

We use various AI and machine learning solutions to monitor and filter our in-game chat. Our chat filters are extremely strict and do not allow the sharing of any personally identifiable information such as name, address, telephone number, or usernames for other platforms.
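
Roblox's filters themselves are not public, but as an illustrative sketch, a rule that blocks messages which appear to contain personal details or off-platform handles might resemble the following; the patterns are simplified examples.

# Illustrative sketch: reject chat messages that appear to contain personally
# identifiable information. These patterns are simplified examples only, not
# Roblox's actual filter rules.
import re

PII_PATTERNS = [
    re.compile(r"\b\d{10,11}\b"),                                # phone-like digit runs
    re.compile(r"\b\d+\s+\w+\s+(street|road|avenue)\b", re.I),   # address-like text
    re.compile(r"\b(snapchat|instagram|discord)\b", re.I),       # other-platform handles
]

def allow_message(text):
    return not any(pattern.search(text) for pattern in PII_PATTERNS)

print(allow_message("my number is 07123456789"))  # False: the message is blocked
print(allow_message("nice build, well done!"))    # True: the message is allowed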

We also offer in-game “Report Abuse” functions throughout.

We were one of the founding members of Project Artemis, a collaboration with Microsoft, The Meet Group, Kik, Thorn and others, which aims to help identify potential instances of child online grooming for sexual purposes and to operationalize an effective response. This project was a success and has now been offered as a free resource for any tech company to embed in its systems.

 

Law Enforcement

We work closely with law enforcement and proactively report safety concerns to the National Center for Missing and Exploited Children (NCMEC), which handles reports from US companies. We are members of the Internet Watch Foundation (IWF) and the WePROTECT Global Alliance, and are active participants in the UK National Crime Agency/CEOP Gaming roundtables.

 

Safety Partnerships

We have a safety advisory board made up of representatives from KidSafe, UK Safer Internet Centre, Family Online Safety Institute, Insafe (representing the European Safer Internet Centre Helplines, part of the Better Internet for Kids EC project), ConnectSafely and Unterhaltungssoftware Selbstkontrolle (USK) which meets regularly to discuss safety and policy issues.

In addition, we also work with other international partners such as e-Enfance in France, Pantallas Amigas in Spain, and are board members of the Tech Coalition, Fair Play Alliance, Raising Good Gamers (part of Games for Change) and the Family Online Safety Institute.

 

Parental Controls

In addition to our own security measures and industry collaboration, we believe that education is incredibly important. In 2019 we created our Digital Civility Initiative to provide actionable resources for parents, caregivers and educators to promote learning about safety and digital civility. This is a topic we include in our webinars for educators too.

 

We offer a range of tools for parents, including the ability to set in-game chat to off, friends only or chat with anyone. Parents can access a curated list of games suited to the youngest players on our platform, and can also see at a glance what age their child is signed in as. All our parental controls are PIN protected to prevent them being changed without permission.

We are continuously seeking positive ways to engage with parents and carers, for example via our dedicated Facebook page, regular blogs, and helpful media articles. 

 

Case study: EA

 

At EA we believe passionately that for our games to be fun, they must be safe. We want our communities to be places where our players are free to connect, share, and compete without having to endure threats, harassment, or other toxic or harmful conduct, or unlawful behaviour. We use technological, legal, and social tools to pursue this goal.

 

We believe our player community must always sit at the heart of these efforts, and so last year we began our Building Healthy Communities initiative, centring our players in the work to ensure a safe and fair environment in all of our games. Beginning with a June 2019 summit, this initiative has brought together EA team members, outside experts, and our players to discuss how better to tackle toxicity, cheating, harassment and other problems that can arise in games communities.

 

Two of the most important results of this work have been the creation of a Player Council and the agreement of our Positive Play Charter.

 

The Player Council brings together prominent members of our player community with EA employees, for focused roundtable conversations on the real experiences of players in our games and what we can do together to combat toxicity.

 

These conversations have helped us design our Positive Play Charter, introduced in June 2020. This sets out how members of our community expect each other to behave, and how EA will respond if those standards are not met. The full Charter sets out in detail how we expect players to:

 

 

We have also put in place tools and procedures to address problems as they arise. We have implemented mechanisms across all our products through which players can directly and quickly report any abuse or inappropriate conduct like cheating or harassment.

 

This is accompanied by a robust and formalized process and policy for handling all complaints. Our dedicated team aims to review all complaints within four hours, escalate significant complaints and determine appropriate action within twelve hours, and complete the full review within twenty-four hours. Our remedial action varies depending on the complaint and may include removing content, blocking communications, suspending players, banning players and, where appropriate, notifying law enforcement. In the very rare instances of more serious complaints, such as threats of harm or self-harm, we have policies and processes to engage appropriate support, such as security professionals and law enforcement.
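
As a simple illustration of how the four, twelve and twenty-four hour targets above translate into per-complaint deadlines, the sketch below computes due times from the moment a complaint is received; the stage names are hypothetical.

# Illustrative sketch: derive review deadlines for a complaint from the
# targets described above (4h initial review, 12h escalation decision,
# 24h full review). Stage names are hypothetical.
from datetime import datetime, timedelta

SLA_TARGETS = {
    "initial review": timedelta(hours=4),
    "escalation decision": timedelta(hours=12),
    "full review complete": timedelta(hours=24),
}

def deadlines(received_at):
    return {stage: received_at + window for stage, window in SLA_TARGETS.items()}

for stage, due in deadlines(datetime(2020, 8, 12, 9, 0)).items():
    print(f"{stage}: due by {due:%d/%m/%Y %H:%M}")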

 

Beyond responding to reports from players, we use technological tools and game design solutions to support this work. For example, we implement text filtering tools like CleanSpeak in most games that contain chat functionality or other user-generated content. This allows us to filter for profanity and other denied content. We are introducing technology to filter images and other user-generated content.

 

We have also engaged our game designers around the world to develop custom solutions for each of our game spaces, with the goal of developing systems that support in-game communication between players while preventing harmful, unfiltered language. One example is our Apex Legends Ping System, which allows players to communicate without chat or traditional language. Player-to-player communication is critical to successful play in a Battle Royale game, but many people mute voice chat in multiplayer games to avoid voice communications with strangers or the potential of being exposed to inappropriate commentary. The Ping System allows users to effectively communicate tactical information in play without speech, through “pings”. This has been widely welcomed as an alternative to traditional in-game chat that enhances the gameplay experience[3].
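
The submission does not detail the implementation, but the idea of replacing free-text chat with a fixed vocabulary of context-sensitive signals can be sketched roughly as below; the ping categories and fields are hypothetical simplifications, not the actual Apex Legends code.

# Illustrative sketch: communication through a fixed set of structured signals
# instead of free-form text or voice. Categories and fields are hypothetical.
from dataclasses import dataclass
from enum import Enum, auto

class PingType(Enum):
    ENEMY_SPOTTED = auto()
    GOING_HERE = auto()
    LOOT_HERE = auto()
    NEED_AMMO = auto()

@dataclass
class Ping:
    sender: str
    kind: PingType
    location: tuple  # in-world coordinates

def broadcast(ping):
    # Teammates receive a structured signal; no free-form text is ever sent.
    return f"{ping.sender} pinged {ping.kind.name} at {ping.location}"

print(broadcast(Ping("player1", PingType.ENEMY_SPOTTED, (120, 45))))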

 

Our commitment to making our communities safer and healthier does not stop there. We are always looking for more we can do. We are working internally, and partnering with academic institutions, on projects related to machine learning and the use of natural language processing techniques to detect toxic behaviour. And we participate in cross-industry initiatives such as the Fair Play Alliance to share data, learnings and best practices with our peers across the industry.
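
EA's models are not public; purely as an illustration of the kind of natural language processing approach described, a minimal text classifier might be trained along these lines (toy data, scikit-learn assumed).

# Illustrative sketch only: a toy text classifier of the sort that toxicity
# detection work might start from. The labelled examples are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = ["good game everyone", "well played team",
               "you are worthless", "uninstall the game, idiot"]
train_labels = [0, 0, 1, 1]  # 0 = acceptable, 1 = toxic

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(train_texts, train_labels)

print(model.predict(["gg well played", "you are an idiot"]))  # e.g. [0 1]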

 


Case Study: Xbox

Focus on Xbox

 

Xbox and Windows devices come with unique family settings built in, created to help manage screen time, social interactions, online spending and access to mature content.

 

Family groups

A free service that helps families stay connected across all of their devices and keeps parents aware of their children’s online activity. Users can personalise each family member’s online experience based on age-appropriate limits that they set for privacy, online purchases, content filters, screen time and more, using the Xbox Family Settings app for console or https://family.microsoft.com/. Those settings will apply to any Xbox One or Windows 10 device they sign in to.

Only a parent using the family group feature can change Xbox privacy and online settings for a child account. 
 

Screen time

Users can sign in to their Microsoft account and schedule time for each member in their family group. They can then customise how much time is spent each day of the week and when the device can be used. The screen time countdown starts once the person is signed in and stops when signed out.
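
As a rough illustration of the countdown described above, the sketch below models a daily allowance consumed only by signed-in time; the class name and figures are hypothetical, not Microsoft's implementation.

# Illustrative sketch: a daily screen-time allowance that counts down only
# while the family member is signed in. Names and values are hypothetical.
from datetime import timedelta

class ScreenTimeAllowance:
    def __init__(self, daily_limit):
        self.remaining = daily_limit

    def record_signed_in_time(self, session_length):
        """Deduct signed-in time and report whether any allowance is left."""
        self.remaining = max(timedelta(0), self.remaining - session_length)
        return self.remaining > timedelta(0)

allowance = ScreenTimeAllowance(timedelta(hours=2))
print(allowance.record_signed_in_time(timedelta(minutes=90)), allowance.remaining)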

 

Users can also turn on system notifications from their Xbox One to give players a heads-up on the console when screen time is about to run out, and turn on activity reporting on their child’s Microsoft account to receive weekly emails that show a summary of their activity on Xbox One and Windows 10 devices.

 

Content filters

Parents can filter or allow games, apps and websites based on the age of their children to ensure they are interacting with age-appropriate content. Parents decide what’s allowed, and it’s easy to change later. Children can also request access to content which parents can then approve or decline.
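
A much-simplified sketch of the allow-or-request flow described above might look like this; the ratings and titles are hypothetical examples, not Microsoft's implementation.

# Illustrative sketch: allow content whose age rating fits the child, or any
# title the parent has already approved; otherwise send a request to the
# parent. Ratings and titles here are hypothetical examples.
def check_access(child_age, content_rating, parent_approved_titles, title):
    if content_rating <= child_age or title in parent_approved_titles:
        return "allowed"
    return "access request sent to parent"

approved = {"Rocket League"}
print(check_access(10, 3, approved, "Minecraft"))      # allowed by rating
print(check_access(10, 16, approved, "Some Shooter"))  # request sent to parent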

Just as movies and TV shows are rated, each game includes a rating symbol that suggests the appropriate age for play, plus descriptions that give you a heads-up about certain content.

Spending controls

With “ask a parent”, parents receive an email when their child wants to purchase a game or app. Alternatively, they can create a passkey to limit purchases made on their account. They can also add money to their child’s account to limit the purchases the child can make on their own.
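
As an illustration only, the “ask a parent” and allowance behaviour described above could be sketched as follows; the function, names and amounts are hypothetical.

# Illustrative sketch of the spending controls described above: a purchase
# either fits the pre-loaded allowance, has been approved by a parent, or is
# held while an approval email goes out. Names and amounts are hypothetical.
def attempt_purchase(price, allowance_balance, parent_approved):
    if parent_approved:
        return "purchase completed (approved by parent)"
    if price <= allowance_balance:
        remaining = allowance_balance - price
        return f"purchase completed from allowance ({remaining:.2f} remaining)"
    return "purchase held: approval email sent to parent"

print(attempt_purchase(14.99, 10.00, parent_approved=False))  # held for approval
print(attempt_purchase(14.99, 20.00, parent_approved=False))  # paid from allowance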

 

Xbox Ambassadors

With the Xbox Ambassadors scheme, players are encouraged to help make games fun for everyone by cultivating a safe, fun and inclusive Xbox community. Xbox Ambassadors are incentivised and rewarded for creating environments that everyone can enjoy, whether by welcoming newbies to Xbox, friending different gamers worldwide, or simply spreading positivity across Xbox Live. Xbox Ambassadors are dedicated to making Xbox the best place to play.

Xbox Ambassadors are rewarded through editorial spotlights, chances to enter sweepstakes, and exclusive physical and digital swag. This is creating a worldwide network of players who celebrate the uniqueness of everyone, promote a safe environment, and above all else, make games fun for everyone.

 


 


[1] https://newzoo.com/insights/infographics/uk-games-market-2018/

[2] https://www.ofcom.org.uk/__data/assets/pdf_file/0027/196407/online-nation-2020-report.pdf

[3] E.g. https://www.gamesradar.com/the-apex-legends-ping-system-is-a-brilliant-solution-to-the-horror-of-playing-with-strangers-online/ and https://www.pcgamer.com/apex-legends-ping-system-is-a-tiny-miracle-for-fps-teamwork-and-communication/