Written evidence submitted by the BBFC (OSB0006)

 

Executive Summary

 

 

 

Introduction

 

The BBFC strongly supports the child protection aims of the draft Online Safety Bill, and all efforts to make the internet a safer place for children and for users generally. We welcome this opportunity to submit written evidence to the joint pre-legislative scrutiny committee.

 

Our submission primarily relates to the Bill’s effectiveness in protecting children from online pornography, and will address the following questions set out in the call for evidence:

 

  1. Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online?

 

  2. Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

 

  3. The draft Bill specifically includes CSEA and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?

 

  4. Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so?

 

  5. What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?

 

  6. The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?

 

  7. The draft Bill sets a threshold for services to be designated as 'Category 1' services. What threshold would be suitable for this?

 

  8. Are Ofcom’s powers under the Bill proportionate, whilst remaining sufficient to allow it to carry out its regulatory role? Does Ofcom have sufficient resources to support these powers?

 

 

About the BBFC

 

The British Board of Film Classification (BBFC) is the not-for-profit independent statutory regulator of film and video in the UK. The BBFC is also the independent regulator, on a voluntary, best-practice basis, of content delivered via the UK’s four mobile networks.

 

The BBFC’s primary aims are to protect children and other vulnerable groups from harm through legally enforceable classification decisions, and to empower consumers, particularly parents and children, through content information and education, helping families to choose content well, wherever and however they view it.

 

Designated by the UK Government as the authority responsible for classifying video works under the Video Recordings Act 1984, we operate a transparent, trusted classification regime built on decades of experience. We base our Classification Guidelines on large-scale consultations involving over 10,000 members of the UK public. This extensive outreach, which we conduct every 4-5 years, ensures that our guidelines continue to reflect societal standards and parental expectations.

 

The BBFC has long argued that, in order to address the fundamental challenge of harmful content and activity online, we need to work towards ensuring that what is unacceptable offline is unacceptable online. Replicating online the protection and information that the BBFC provides in the offline world has therefore been central to our mission for many years.

 

For example, since 2008 the BBFC has worked with the home entertainment industry on a voluntary basis to extend the use of our trusted age-labelling system for cinema and packaged media to Video on Demand (VOD) and streaming services. We work with over twenty platforms in the UK - including Amazon Prime Video, Apple TV+, BFI Player, Britbox, BT TV, Curzon Home Cinema, Sky Store and Virgin Media Store - to ensure that their content is appropriately rated in line with UK standards. Our innovative self-rating partnership with Netflix has enabled 100% of content on Netflix’s UK platform to carry our ratings and content advice.

 

The BBFC is also a recognised expert in online pornography. We have regulated adult content released to physical media formats since the mid-1980s, under the Video Recordings Act, and we classify some online adult content on a best-practice, voluntary basis for a small number of adult services. In 2018, the BBFC was designated as the Age-verification Regulator under Part 3 of the Digital Economy Act 2017 (DEA), recognising our expertise in classifying pornographic material and in online regulation. In October 2019, the Government announced that it would not introduce age-verification under the DEA, and that instead the child protection goals of the legislation would be met as part of the broader online harms strategy.

 

 

Will the proposed legislation effectively deliver the policy aim of making the UK the safest place to be online? / Are children effectively protected from harmful activity and content under the measures proposed in the draft Bill?

 

The Bill’s effectiveness is limited by its current focus on internet services which allow users to upload and share user-generated content (“user-to-user services”) and on providers of search engines which enable users to search multiple websites and databases (“search services”).

 

Rather than protecting children from harmful content online, this will leave a significant proportion of pornographic websites freely accessible to children. It will also provide a potential loophole by which pornographic sites that are currently caught by the scope of the Bill can put themselves out of scope by removing the relevant functionality.

 

Given the policy aim to make the UK the safest place to be online, and the Government’s stated intention to deliver the objectives of the DEA (which would have applied to all commercial pornography accessible in the UK), this reduction in scope is illogical and concerning. Whether pornography is user-generated or not has no bearing on the level of harm it poses to children.

 

The BBFC strongly recommends that all sites providing commercial online pornography be subject to the same rules as user-to-user services.

 

 

The draft Bill specifically includes CSEA and terrorism content and activity as priority illegal content. Are there other types of illegal content that could or should be prioritised in the Bill?

 

The BBFC, when classifying pornographic material for release on physical media formats, must have regard to various UK laws, namely:

 

  1. The Obscene Publications Acts 1959 & 1964 (England and Wales);

The Civic Government (Scotland) Act 1982;

The Obscene Publications Act 1857 (Northern Ireland)

 

It is illegal to publish a work which is obscene. A work is obscene if, taken as a whole, it has a tendency to deprave and corrupt a significant proportion of those likely to see it.

 

  2. Criminal Justice and Immigration Act 2008 (England, Wales and Northern Ireland);

Criminal Justice and Licensing (Scotland) Act 2010;

Civic Government (Scotland) Act 1982

 

It is illegal to be in possession of an extreme pornographic image. An extreme pornographic image is one which is pornographic and grossly offensive, disgusting or otherwise of an obscene character, which features an apparently real person, and which portrays, in an explicit and realistic way, acts including necrophilia, bestiality and rape.

 

  3. The Protection of Children Act 1978 (England and Wales);

Civic Government (Scotland) Act 1982;

Protection of Children (Northern Ireland) Order 1978

 

It is illegal to make, distribute, show or possess indecent photographs or pseudo-photographs of a child. It is also illegal to make, distribute, show or possess indecent images of children which have been derived from a photograph or pseudo-photograph. (Content of this nature should already be covered by the inclusion of CSEA as priority illegal content.)

 

  4. The Coroners and Justice Act 2009 (England, Wales and Northern Ireland);

Criminal Justice and Licensing (Scotland) Act 2010

 

It is illegal to be in possession of a prohibited image of a child. A prohibited image of a child is a non-photographic or non-pseudo-photographic image which is pornographic and grossly offensive, disgusting, or otherwise of an obscene character, and which focuses solely or principally on a child’s genitals or anal region, or which portrays specified sexual acts by, of, or in the presence of a child.

 

  5. The Sexual Offences Act 2003 (England and Wales);

Sexual Offences (Scotland) Act 2009 (Scotland);

The Sexual Offences (Northern Ireland) Order 2008 (Northern Ireland)

 

It is an offence for a person to record another person doing a private act, where the recording is made for the purpose of the sexual gratification of the person making it or of a third party, and where the person recorded has not consented to being filmed.

 

  6. The Criminal Justice and Courts Act 2015 (England and Wales);

Abusive Behaviour and Sexual Harm (Scotland) Act 2016 (Scotland);

Justice Act (Northern Ireland) 2016 (Northern Ireland)

 

It is an offence to disclose a private sexual photograph or film without the consent of any individual who appears in the photograph or film, if it is done with the intention of causing that individual distress.

 

Content addressed by these laws should be included as priority illegal content alongside CSEA and terrorism content, to ensure that companies are required to put systems and processes in place to minimise the presence of such content on their platforms and to take it down swiftly where it does appear.

 

 

Are there any types of content omitted from the scope of the Bill that you consider significant e.g. commercial pornography or the promotion of financial scams? How should they be covered if so? / The draft Bill applies to providers of user-to-user services and search services. Will this achieve the Government's policy aims? Should other types of services be included in the scope of the Bill?

 

The vast majority of pornography available online is ‘commercial pornography’, including user-generated pornography. People generally do not produce and upload pornography to the internet without expecting some reward for doing so. Platforms would not facilitate the distribution of this content if they too were not making money from it. Where content is made available ostensibly for ‘free’, platforms make money through revenue from advertising or by upselling users on premium content that must be paid for. So, some commercial pornography (that which is carried on user-to-user services) is already within scope of the Bill. It is only commercial pornography carried on services without a user-to-user element that is currently excluded. This is a significant omission in the Bill as currently drafted. There is no difference in terms of the harm risk to children between pornographic content that is carried on user-to-user services and pornographic content that is carried on services without a user-to-user element.

 

As recommended above, all commercial pornography should be brought into scope of the legislation. And all sites carrying such content should be required to put robust age-verification measures in place, to ensure that it is not accessible to children.

 

Research published by the BBFC in January 2020 explored children’s routes to accessing pornographic content. The findings were clear: while it was common for children to have seen pornography on social media, most respondents would usually search for it on dedicated pornographic websites. And while many of the most popular pornographic websites do have a user-to-user element, children’s behaviour will inevitably change if measures are put in place to restrict access to the sites they currently prefer while other sites are allowed to fall out of scope. Traffic from children will simply divert to sites with no preventive mechanisms in place. The risk is that sites which are out of scope of the Bill will very quickly become normalised as the ‘go-to’ destinations for pornography access by UK children.

 

The Bill repeals Part 3 of the DEA. This means it is important for the Bill to achieve all the child-protection measures that Part 3 would have achieved. In terms of how commercial pornography should be covered, lessons can be taken from the regulatory architecture underpinning Part 3 of the DEA. The Part 3 requirements extended to all online pornography services operating on a commercial basis, wherever they are based. “Commercial basis” as defined in the Online Pornography (Commercial Basis) Regulations includes websites which offer pornographic content for free, but which generate revenue through advertising or premium content:

 

2.—(1) Pornographic material is to be regarded as made available on the internet to persons in the United Kingdom on a commercial basis for the purposes of Part 3 of the Digital Economy Act 2017 if either paragraph (2) or (3) are met.

 

(2) This paragraph applies if access to that pornographic material is available only upon payment.

 

(3) This paragraph applies (subject to paragraph (4)) if the pornographic material is made available free of charge and the person who makes it available receives (or reasonably expects to receive) a payment, reward or other benefit in connection with making it available on the internet.

 

(4) Subject to paragraph (5), paragraph (3) does not apply in a case where it is reasonable for the age-verification regulator to assume that pornographic material makes up less than one-third of the content of the material made available on or via the internet site or other means (such as an application program) of accessing the internet by means of which the pornographic material is made available.

 

(5) Paragraph (4) does not apply if the internet site or other means (such as an application program) of accessing the internet (by means of which the pornographic material is made available) is marketed as an internet site or other means of accessing the internet by means of which pornographic material is made available to persons in the United Kingdom.

 

The ‘one third’ rule introduced in paragraph (4) was intended to exclude social media platforms from the scope of the regulations, and therefore of the DEA, which would otherwise have encompassed social media as well. This exclusion would not be relevant in the context of the Online Safety Bill, since user-to-user services such as social media are already within its scope.

 

Other key elements that should be carried over from the planned DEA regime include (a) ensuring a level playing field for industry, (b) setting sensible age-verification standards, and (c) the need for proactive monitoring and investigations.

 

(a) Ensuring a level playing field for industry

 

To secure maximum compliance from the adult industry, there must be a level playing field. From our engagement with the industry, it has always been very clear that active investigation and swift enforcement are essential to ensure that compliant sites are not commercially disadvantaged by their non-compliant competitors and thereby incentivised to become non-compliant themselves.

 

All the big players will have contingency plans to avoid regulation if they see their commercial interests being damaged, so it is vital that enforcement processes not be slow or cumbersome. We were confident of securing a high degree of compliance under the DEA (upwards of 80% from day one was realistic and achievable) based on the adult industry being convinced that our investigations would lead to swift enforcement. Consideration therefore needs to be given as to whether any delays in enforcement, for example arising from the need for Ofcom to obtain a court order to use its business disruption powers, could impact compliance by the adult industry.

 

(b) Setting sensible age-verification standards

 

As the Age-verification Regulator, the BBFC published guidance on the kind of age-verification arrangements that would have ensured that pornographic services complied with the law. The guidance sets out the criteria against which the BBFC would have assessed whether a service had met the requirements of s14(1) of the DEA.

 

We opted for a principles-based approach, rather than specifying a finite number of “approved” solutions, to allow for and encourage technological innovation within the age-verification industry. In the years we worked on the project we saw substantial developments in the sector, notably the development of age estimation technology, which had the potential to be both robust and easy for consumers to use. The proposed age-verification regime for commercial pornography has therefore already fostered the development of a thriving digital economy with an ecosystem of companies delivering innovation in online safety.

 

The BBFC’s guidance outlines best practices, such as offering a choice of age-verification solutions to consumers. It also includes information about the requirements that age-verification services and online pornography providers must adhere to under data protection legislation, and the role and functions of the Information Commissioner’s Office (ICO). The guidance also sets out what would have been the BBFC's approach and powers in relation to online commercial pornographic services and considerations in terms of enforcement action.

 

(c) The need for proactive monitoring and investigations

 

While preparing for commencement of the DEA Part 3, the BBFC developed systems, workflows and processes for the proactive investigation of pornographic websites by BBFC Compliance Officers, who currently classify film and video works, including pornography, for release in the UK in cinemas, on physical formats and on VOD.

 

Pornographic sites would have been actively investigated on a daily basis to confirm that age-verification was in place, that the measures were robust and met the requirements set out in our guidance, and that the site did not contain extreme pornographic content. We would have prioritised the most popular sites based on data supplied by analytics company Comscore. As the majority of traffic goes to the most popular sites, and these sites are owned by an even smaller number of companies, we were confident that our efforts would have made a significant impact in a relatively brief period of time. There were processes in place for sites to be periodically re-investigated, to ensure continuous compliance, and for traffic to be monitored to enable a swift response to any changes in sites’ popularity.

 

Additionally, we would have investigated sites ranked highly in search engine results, and those reported to us by charities, stakeholders and members of the public.

 

 

What would be a suitable threshold for significant physical or psychological harm, and what would be a suitable way for service providers to determine whether this threshold had been met?

 

In accordance with our statutory role under the Video Recordings Act, the BBFC is required to have “special regard (among the other relevant factors) to any harm that may be caused to potential viewers or, through their behaviour, to society by the manner in which the work deals with—(a) criminal behaviour; (b) illegal drugs; (c) violent behaviour or incidents; (d) horrific behaviour or incidents; or (e) human sexual activity” (section 4A). As a result, considering and evaluating the likely harm potential of content has been central to our role for decades.

 

In relation to harm, in our classification decisions, the BBFC will consider whether the material, either on its own or in combination with other content of a similar nature, may cause any harm at the age category concerned. This consideration includes not just any harm that may result from the behaviour of potential viewers, but also any moral or societal harm that may be caused by, for example, desensitising a potential viewer to the effects of violence, degrading a potential viewer’s sense of empathy, encouraging a dehumanised view of others, encouraging antisocial attitudes, reinforcing unhealthy fantasies, or eroding a sense of moral responsibility. Especially with regard to children, harm may also include impairing social and moral development, distorting a viewer’s sense of right and wrong, and limiting their capacity for compassion.

 

Where possible we will carry out our responsibilities through appropriate use of the classification categories, particularly in order to protect children from any potential harm. We base all our ratings decisions on the standards set out in our published Classification Guidelines, which are themselves based on regular consultation we carry out with the UK public and with experts in issues such as suicide, self-harm, sexual violence and discrimination. If necessary, however, we may cut or even refuse to classify a film or video work, in line with the objective of preventing non-trivial harm risks to potential viewers and, through their behaviour, to society.

 

Specific factors which we take into account in our classification decisions include sex, sexual violence and sexual threat, violence, language, discrimination, drugs, nudity and dangerous behaviour. Works which portray antisocial behaviour uncritically, such as bullying, are likely to receive a higher classification. We will cut portrayals of potentially dangerous behaviour which children and young people may copy, for example suicide, self-harm and asphyxiation, if a higher classification is not appropriate.

 

The BBFC’s most recent Classification Guidelines consultation, in 2018, found several areas of heightened concern for the British public in terms of the classification of content, most notably: sexual violence; the mental health of young people (especially with regard to content depicting suicide or self-harm); and racial discrimination. There is also concern about the impact of online content on behaviour - for example, 92% of teachers are concerned about the material that their students view online, which they believe results in the inappropriate behaviour and language they witness among students.

 

To inform both policy and individual classification decisions, the BBFC, where appropriate, also draws on expert advice. For example, we maintain close relations with the Samaritans and other suicide prevention experts in relation to classification policy regarding suicide and self-harm. And to inform our classification policies, we commission research into specific issues. Recent examples include domestic abuse and discrimination/racism.

 

While media effects research and expert opinion can provide valuable insights, they can be inconclusive or contradictory on issues of suitability and harm. In such cases, the BBFC relies on internal expertise to make a judgement as to the suitability of a work for classification at a particular age category, taking into consideration whether the availability of the material to the age group concerned is clearly unacceptable to broad public opinion. We do this without infringing the right of adults to choose what they view, provided that it remains within the law and is not potentially harmful.

 

These standards are applicable beyond the classification of film and video works. For example, Ofcom’s draft guidance for UK-based video-sharing platforms (VSPs) advises that it may be useful for VSPs in scope of Ofcom’s regulation to ‘understand the strength and types of material that the BBFC regards as appropriate for different age groups in its Classification Guidelines’ when determining the type of material that might impair the physical, mental or moral development of under-18s. The guidance requires that VSPs put appropriate measures in place to protect children from this content, as well as from content that the BBFC would classify at R18 or would refuse to classify.

 

BBFC guidelines also form the basis of the framework used by the UK's mobile network operators (EE, O2, Three and Vodafone) to restrict access to content on 3G and 4G networks that is unsuitable for people under the age of 18. This content includes, for example, pornography and other adult sexual content, pro-anorexia websites and content which promotes or glorifies discrimination or real-life violence.

Under this system, which has been in place since 2013, hundreds of millions of websites are automatically filtered according to trusted, transparent and consistent standards that reflect the concerns and expectations of UK audiences. Customers may only remove the network filters on mobile devices if they are able to prove that they are aged 18 or over, and the BBFC will adjudicate on any cases of under- or over-blocking of websites placed behind those filters. In addition, in 2015, the BBFC was appointed by the mobile network EE to provide the classification framework for its ‘Strict’ level, defining content that is unsuitable for children under 12 based on the PG standards set out in our Classification Guidelines.

 

As regards pornographic content, there is a substantial and growing evidence base that early exposure to pornography can have a damaging and long-term impact on children’s healthy development and relationships. In its statutory role, the BBFC will only classify pornographic videos at 18 or R18.

 

There is also the potential for certain types of pornographic content to cause harm to adult viewers or, through their behaviour, to society. The BBFC will refuse to classify the following:

 

The potential for pornography of this nature to have a harmful impact on its viewers was recently highlighted in the Home Office’s strategy for tackling violence against women and girls, published on 21 July 2021:

 

The Call for Evidence showed a widespread consensus about the harmful role violent pornography can play in violence against women and girls, with most respondents to the open public surveys and many respondents to the nationally representative survey agreeing that an increase in violent pornography has led to more people being asked to agree to violent sex acts (54% nationally representative, 79% Phase 1, 78% Phase 2), and to more people being sexually assaulted (50% nationally representative, 70% Phase 1, 71% Phase 2).

 

Content of this nature is currently freely accessible online, in abundance, to children and adults alike. While the proposals set out in the draft Online Safety Bill will go some way to limiting children’s exposure to pornographic content, they do not address the widespread availability of potentially harmful types of pornographic content to adult consumers. The BBFC recommends that further consideration be given to the extent to which online and offline standards can be aligned in relation to harmful pornographic content.

 

 

The draft Bill sets a threshold for services to be designated as 'Category 1' services. What threshold would be suitable for this?

 

The threshold for services designated “Category 1” should not be set so high as to exclude the most popular pornographic platforms. The biggest sites can receive over 10 million unique visitors each month, and currently make available significant volumes of content that the BBFC would refuse to classify under the Video Recordings Act on harm grounds (as set out above). Ensuring that these services are subject to additional, Category 1 duties in relation to content that is harmful to adults would make a significant contribution towards the objective of ensuring that what is unacceptable offline is also unacceptable online.

 

 

Are Ofcom’s powers under the Bill proportionate, whilst remaining sufficient to allow it to carry out its regulatory role? Does Ofcom have sufficient resources to support these powers?

 

The regulator’s powers as proposed in the draft Bill should be sufficient to allow Ofcom to carry out its regulatory role. However, we would be concerned if any delays to enforcement arose, for example from the need for the regulator to obtain a court order to use its business disruption powers, as is currently proposed.

 

In order to be effective in protecting children from online pornography, the Online Safety regime needs to have the support of the global adult industry. From our experience and engagement with the adult industry, the majority of companies are willing to introduce age-verification, provided that there is a level playing field and the regulator acts swiftly against any non-compliant competitors.

 

If enforcement against non-compliant sites is not swift, this will create a market distortion between sites that have introduced age-verification and those that have not, which will in turn create commercial incentives not to comply. All the largest companies will have contingency plans to avoid regulation if they see their commercial interests being damaged, such as geo-blocking or marketing VPNs to their users, both of which have significant implications for child protection.

 

While preparing to undertake its role as Age-verification Regulator under the DEA Part 3, the BBFC was confident of securing a very high level of compliance (upwards of 80% from day one was realistic and achievable) based on the adult industry being convinced that our investigations would lead to swift enforcement. For example, with the support of DCMS officials, we agreed with the Internet Service Providers a three-day time limit to block a website on instruction from the BBFC.

 

It has always been very clear to us that active investigation and swift enforcement are absolutely essential to ensure that compliant sites are not commercially disadvantaged by their non-compliant competitors and thereby incentivised to become non-compliant themselves. This is why it is so important that the regulator’s enforcement processes are not slow or cumbersome.

 

 

Concluding remarks

 

The BBFC supports regulatory initiatives to make the internet a safer place and particularly the focus on protecting children from potentially harmful material online. As has been recognised by Ministers, the BBFC has unparalleled expertise in pornography and age-verification, and we look forward to working with Government and supporting Ofcom as the regulator to ensure that children are adequately protected from pornography and other harmful online content.

 

We would be available to give further evidence and answer any questions raised by our submission.

 

 

September 2021