Written evidence submitted by John Carr OBE, Secretary, Children’s Charities’ Coalition on Internet Safety (OSB0167)
Dear Joint Committee,
Thank you for the opportunity to make this submission in relation to the draft Online Safety Bill (OSB).
Polonius tells us "Brevity is the soul of wit"; it can also be the companion of the hard-pressed. "Consultation fatigue" is a phrase sometimes used rather pejoratively, but it may accurately reflect a level of frustration rooted in recognising the importance of responding to legitimate questions and enquiries, particularly from Parliament, while lacking the resources to do full justice to the matter in hand.
The points made in this document should not, therefore, be regarded as representing the finished positions or policies of any single member organization within the Children’s Charities’ Coalition on Internet Safety; indeed, some members will be making their own submissions. The points are nevertheless offered for discussion.
Summary of key points:
It is extremely important to have clarity about whether or to what extent platforms will continue to enjoy legal immunities even where it can be shown they are NOT conforming with the statutory codes of practice. Companies need to be incentivized to comply with all relevant standards straight away and not feel they can wait until a possibly overworked Ofcom gets round to looking at them at some indeterminate point in the future.
Should platforms where children are present be required to use tools to detect and remove convicted child sex offenders? This would mirror a similar (but voluntary) practice already in place in the USA. In the UK this would mean platforms would need to be given machine-readable access to a reconfigured version of the Sex Offender Register. There is no suggestion the UK Register should be made a public document.
It is well-established that sexual predators and others who would harm children can make contact with a child on one platform and persuade the child to continue their online discussions or relationship on a different one. Thus, if one platform removes someone because they have behaved inappropriately towards children or are considered a risk to children, could arrangements be made to facilitate information-sharing between platforms to minimise or eliminate the risk of these individuals simply swapping platforms, establishing a new log-in and starting again?
In the USA, where someone has been found in possession of a child sexual abuse image, a system has been developed to compel offenders to pay damages which, inter alia, cover some or all of the cost of the victims’ treatment, thus relieving the taxpayer of some or all of the financial burden. It also acts as an additional deterrent to this type of crime.
The original version of the duty of care should be reinstated.
The provisions on pornography are far too weak, particularly in respect of the enforcement powers. Limiting the scope to sites or services which allow user interactivity makes no sense.
It is important to have clarity about which standard will be used to assess whether a child has been harmed or is likely to be harmed.
Children in foreign jurisdictions should not be overlooked.
Misinformation and disinformation are child protection issues.
Clarity is needed in relation to how the Law Commission’s proposals will be integrated into the legislation.
App Stores and systems used to give age ratings or other guidance to parents or children about the suitability of a particular App, game or content need to observe rules which provide a high level of consistency based on a clear rationale.
Children’s organizations are facing enormous challenges in terms of their ability to respond to the hugely increased volume of activity which lies ahead as Ofcom and others begin the process of formulating the new regulatory environment.
An obligation on platforms to try to detect convicted child sex offenders?
- In the USA the Sex Offender Registers are public documents. At one point all of the large social media platforms made clear that, irrespective of the nature of the offence, anyone on a Sex Offender Register was barred from opening or maintaining an account with them.
- The US platforms were able to access the Registers and use the data to determine whether or not an account holder was in breach of that condition. Obviously a great many of the individuals on the Registers lied about their real identity but the platforms developed a range of tools which nevertheless helped locate them with an extremely high level of accuracy and there were also appeals mechanisms in place against potential errors. Tens of thousands of convicted sex offenders were thrown off the platforms.
- There is little support for making the UK’s Sex Offender Register publicly available, but ways could be found to allow platforms which allow children to be members to have machine-readable access so as to achieve a similar end. A minimal sketch of how such access might work appears after this list.
- There is a strong case for ensuring a UK policy is limited only to persons on the Register who pose a threat to children but there needs to be a joined-up approach to the assessment of threats to children through both online and offline offending.
- For example, the Police, Crime, Sentencing and Courts Bill contains provisions for the greater use of foreign travel restriction orders via the establishment and maintenance of a list of countries where children are considered to be at high risk of sexual abuse and exploitation by UK offenders. However, it seems the threat assessments used to decide whether or not an order should be made may be based only on what is known about contact offending in the offline world.
- Yet the scale of actual or potential offending in the virtual world is substantial. During the COVID-19 pandemic EUROPOL reported that livestreamed child sexual abuse had intensified. According to the UK’s National Crime Agency (NCA), an estimated 550,000 to 850,000 people in Britain pose a sexual risk to children, including online, and the UK is now the third largest global consumer of livestreamed child sexual abuse. The Bill should ensure that the proactive detection of child sexual abuse in livestreaming is within scope.
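Returning to the question of machine-readable access raised above: below is a minimal sketch, in Python, of how a vetted platform might check a verified identifier against the Register without the Register itself ever being published. Every element here (the keyed-hash scheme, the per-platform key, the token set) is an illustrative assumption, not a description of any existing or proposed system.

```python
import hmac
import hashlib

# Illustrative assumption: the Register's operator issues each vetted
# platform a secret key and a set of keyed tokens derived from verified
# identifiers, so raw Register data is never shared or published.
PLATFORM_KEY = b"per-platform-secret-issued-under-licence"

def register_token(verified_identifier: str) -> str:
    """Derive a keyed hash (HMAC-SHA256) of a verified identifier,
    e.g. a confirmed legal name plus date of birth."""
    return hmac.new(PLATFORM_KEY, verified_identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def is_barred(verified_identifier: str, register_tokens: set[str]) -> bool:
    """Check a candidate account holder against the token set. A match
    should trigger human review and an appeals route, mirroring the
    safeguards the US platforms operated, not automatic removal."""
    return register_token(verified_identifier) in register_tokens
```

The point of the keyed hash is that a platform holding the token set cannot reverse it into a list of names, which addresses the concern that the Register must not become a public document.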
Information sharing between platforms
- It is well-documented that sexual predators and others who would harm children can make contact with a child on one platform and persuade the child to continue their online discussions or relationship on a different one. There may therefore be value in investigating whether, or to what extent, platforms can establish information-sharing arrangements which would allow them to inform each other when they have removed an individual from their platform for a reason connected with the protection of children. Other platforms could then take a view as to whether they should act in like manner or, at the very least, be alerted so they can monitor the person accordingly.
- Such an arrangement would be analogous to that which currently exists to allow platforms to identify and remove already known instances of child sex abuse material. It would also tie in neatly with any new system, such as that suggested earlier, which might allow platforms to identify convicted child sex offenders on the Sex Offender Register.
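As an illustration only, the following sketch shows what a shared "removal signal" might look like if platforms agreed a common format, by analogy with the hash-sharing model used for known child sex abuse images. The record structure, the field names and the notion of a central clearing house are all assumptions made for the purpose of the example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib

@dataclass(frozen=True)
class RemovalSignal:
    """A record one platform could submit to a shared clearing house
    after removing an account for a child protection reason."""
    identifier_hash: str    # hash of a verified identifier, never raw data
    reason_code: str        # e.g. "grooming" or "csam-distribution"
    removed_at: str         # ISO 8601 timestamp of the removal
    reporting_platform: str

def make_signal(verified_identifier: str, reason_code: str,
                platform: str) -> RemovalSignal:
    """Hash the identifier so receiving platforms can match it against
    their own verified users without learning who was removed elsewhere."""
    digest = hashlib.sha256(verified_identifier.encode("utf-8")).hexdigest()
    return RemovalSignal(digest, reason_code,
                         datetime.now(timezone.utc).isoformat(), platform)
```

A receiving platform matching a signal would then be free to take a view, as described above: remove, restrict, or simply monitor the account.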
The position of victims
- Children have a right to redress. Is there a case for persons found in possession of child sex abuse material being made to pay compensation to any and all identifiable victims in the images and perhaps also contribute to a wider fund which could be used to help a larger class of victims who suffered sexual abuse as children?
- Such compensation could, among other things, cover some or all of the cost of treatment and support for victims, thus relieving the taxpayer of some or all of the financial burden. It would also act as an additional deterrent to this type of crime.
- Please see Paroline v. United States in the US Supreme Court (the victim in that case was known as “Amy”) and the legislation which was prompted by it. That legislation clarified how a court might calculate the amount of compensation to be paid by an individual defendant to an identified victim of child sexual abuse where the defendant was in possession of an image of that child being sexually abused.
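For illustration only, a purely hypothetical apportionment of the kind a court might perform is sketched below. The figures, and the idea of a per-defendant floor, are invented for the example; this is not the statutory US formula.

```python
def restitution_share(total_documented_losses: float,
                      num_identified_defendants: int,
                      minimum_award: float) -> float:
    """Split a victim's documented treatment and support costs across
    the defendants found in possession of their images, never dropping
    below a minimum each defendant must pay. Purely illustrative."""
    equal_share = total_documented_losses / num_identified_defendants
    return max(equal_share, minimum_award)

# Invented example: £150,000 of documented losses, 100 identified
# defendants, and an assumed £3,000 floor per defendant.
print(restitution_share(150_000, 100, 3_000))  # 3000.0
```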
The duty of care
- There is a strong case for reinstating the principle embodied in earlier versions of the Government’s policy, namely that an overarching duty of care should be created. Subject to the principle of proportionality, it would embrace every provider of an online service, and every manufacturer of internet-connectable devices sold in the consumer space, where the service or device is likely to be used by children or is in fact being used by children. Such an overarching duty of care need not supplant or replace the specific or targeted duties of care referenced in the OSB.
- Linked to that should be an explicit statement that, where a device or service is likely to be used by children or is in fact being used by children, all such companies are under an obligation to ensure that at the point of first use the device or service is as safe as it possibly can be. This would build the principles of safety by design and security by default into the law, reinforcing and underpinning the principles enshrined in the Age Appropriate Design Code.
- It should be made explicit that, while it would always be possible in appropriate cases to reduce, weaken, change or remove any protections put in place at the point of first use, this should only be possible following a careful explanation of the likely consequences, delivered in age-appropriate language.
- In this context the current limitation of the OSB only to online spaces which allow or facilitate user interactivity or the exchange of user generated content should be abandoned. What counts is not the nature of the platform or the environment but the nature of the likely harm irrespective of how or where it appears on a child’s screen or how or by whom it was put there.
Pornography
- It follows from these earlier remarks that there is strong opposition to limiting any obligations to address online pornography only to platforms which allow user generated content or interactions between users.
- It would be a trivial matter for any and all of the world’s largest publishers of pornography to drop user generated content and user interactivity altogether without any significant impact on their revenues. It must be anticipated that they would do so readily in order to avoid being caught by any new UK regulations. If that were to happen, absolutely nothing or very little would have changed: exactly the same sites would remain accessible to children.
- It should be made clear on the face of the OSB that material which would ordinarily qualify for an 18 or R18 certificate is considered pornographic and, for the purposes of this legislation, will always fall into the highest category of harmful material. Thus, wherever it appears on the internet it should sit behind a robust age verification mechanism which will normally keep out under-18s.
- There are grave concerns about the cumbersome and time-consuming nature of the enforcement powers which Ofcom will have at its disposal to address non-compliant pornography sites, particularly those outside the UK.
- Part 3 of the Digital Economy Act 2017 foresaw the Regulator being able to act swiftly and effectively against non-compliant sites in a matter of days, not the months which seem inevitable if the OSB is not materially changed before becoming law.
- Protracted enforcement times make no sense in the context of harms to children in the online space. Appeals against decisions likewise should be capable of being resolved rapidly. Any decisions, and the systems by which they are made, would, in any event be subject to the overall supervision of the courts.
Clarity about how to determine whether a child has been harmed or is at risk
- It is extremely important to have clarity in relation to the standard which will be used to determine how or whether a child has been harmed or is likely to be harmed in online environments. Different parts of the law currently use different ways of measuring or predicting the likelihood of harm. There should be no ambiguity here.
Obligations towards children in foreign jurisdictions
- It should be made explicit on the face of the Bill that any and all sanctions and responsibilities which apply vis-à-vis children in the UK apply equally in respect of children domiciled in foreign jurisdictions. At the very least this will give added momentum to the pressing need to address, for example, the challenges associated with the increase in livestreaming. Livestreamed abuse thrives because of differences in legislation and enforcement across national borders.
Clarity about the circumstances in which legal immunities will apply
- Under existing UK law, it appears to be the case that, unless and until notified, platforms have zero liability for the conduct of third parties who use their services.
- The OSB introduces new obligations on platforms which ought to influence their behaviour. For the avoidance of doubt, however, it should be made explicit somewhere in the OSB that where a platform governed by a code of practice or other regulations fails to honour the relevant terms, not only could it become subject to the penalties set out in the OSB, it will also forfeit any and all criminal and civil immunities from which it would otherwise have benefitted.
- This could be of particular importance to individual victims depicted in child sex abuse images or to children who might have been victimised by sexual predators. Absent such a provision what the OSB is saying, in effect, is “don’t worry, you still enjoy immunity from liability for acts by third parties on your platform, even if you are wilfully or recklessly ignoring the codes of practice.”
- This will encourage platforms to think they only risk being penalised if Ofcom actually comes after them. That is unacceptable. Platforms should be given every incentive to ensure they are compliant, and that will only happen at scale if they know that being non-compliant is inherently risky, and that the risk does not depend entirely on the efficacy of Ofcom.
- To be clear: it is not being suggested that if platforms fail to honour the terms of a code or regulation, they forfeit all immunities in respect of everything they do. That would be unreasonable.
- But where a reasonably foreseeable actual harm has resulted, or is alleged to have resulted, from a failure to implement the terms of a code or regulation, whoever can be said to have been injured as a result should be free to bring an action which would previously have been barred or would have failed because of the immunity. The immunity is therefore lost only insofar as it concerns, and is limited to, the reasonably foreseeable harm suffered by an identifiable individual or group.
- Something like this would focus the minds of every Director or senior manager of every platform. It would also relieve Ofcom of a great deal of the responsibility for ensuring online businesses routinely follow the law, rather than simply hoping they are never caught or inspected, or that, if they are, it will be so far in the future that the Directors or employees responsible for the initial failure are no longer in the company’s employ.
- Currently, the Bill (clauses 36 and 81) suggests enforcement action will be taken against platforms only if CSEA content is “prevalent” or “persistently present”. This is ambiguous.
- The technology which enables platforms to detect and deter or eliminate CSEA on their services is now typically trivial to implement, inexpensive, and getting cheaper (see the sketch after this list).
- If a platform is properly honouring its duty of care, it will have systems in place which minimise or eliminate the risk of CSEA being present in the first place. Thus, in deciding whether or not to take enforcement action, Ofcom should have regard to whether or not such systems have been put in place. Absent such a provision, malevolent individuals will feel free to abuse any platform. The harm done to a child even by occasional or random exposure to CSEA may be substantial, and if it can be avoided it should be.
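To illustrate why the technology is described as inexpensive and straightforward, here is a minimal sketch of hash-matching uploads against a list of digests of already-classified material, of the kind supplied by bodies such as the Internet Watch Foundation. Real deployments typically use perceptual hashing (PhotoDNA, for example) so that near-duplicates also match; exact SHA-256 matching is shown here only to make the principle concrete, and the digest list is invented.

```python
import hashlib

def matches_known_digest(upload: bytes, known_digests: set[str]) -> bool:
    """Return True when an uploaded file is byte-identical to material
    already classified as child sexual abuse imagery, so it can be
    blocked before it is ever published."""
    return hashlib.sha256(upload).hexdigest() in known_digests

# Illustrative check at the point of upload.
known = {hashlib.sha256(b"previously classified file").hexdigest()}
print(matches_known_digest(b"previously classified file", known))  # True
print(matches_known_digest(b"an ordinary holiday photo", known))   # False
```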
Misinformation and disinformation are child protection issues
- There appears to be a growing trend among adults of paying to access more reliable sources of news, current affairs coverage and other material of importance to them in any number of fields.
- This is not an option open to many children. Thus, to the extent that children are forced to rely on the wider internet as their principal source of news, and of information for their school work or to further other interests, they are at risk of becoming prey to manipulative or extreme elements who intentionally distort or misrepresent the facts, whether in pursuit of a political project or to monetise their online business by attracting larger numbers of visitors. The major platforms know very well that they benefit from “clickbait” and from bizarre distortions of facts which are more likely to attract attention.
- Media literacy is now an essential component of every modern child’s education, but it is also important that platforms which are likely to attract, or which in fact attract, children are under a positive obligation to root out and address misinformation and disinformation likely to have an adverse impact on a child’s education and personal development.
Law Commission’s proposals
- It is not yet clear how the important matters raised in the Law Commission’s report on harmful communications will be integrated into the OSB, yet several of the Commission’s recommendations overlap with or touch the Bill at different points.
Inconsistent age ratings
- It should be made clear that App Stores in particular, but also other systems used to provide guidance to parents and children about the suitability of, say, a game or an App or other content, are bound by the duty of care advocated above.
- This would require the relevant parties, for example, to ensure a degree of logic and consistency, based on a clear rationale, as between the rating provided by the original publisher of the game, App or content and that used by the App Store, rating system or other vendor.
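A minimal sketch of the kind of consistency check this would imply is shown below. The mappings from rating labels to minimum ages are assumptions made for the demonstration, not any official equivalence table.

```python
# Assumed, illustrative mappings from two rating systems to a minimum age.
PUBLISHER_TO_MIN_AGE = {"PEGI 3": 3, "PEGI 7": 7, "PEGI 12": 12,
                        "PEGI 16": 16, "PEGI 18": 18}
STORE_TO_MIN_AGE = {"4+": 4, "9+": 9, "12+": 12, "17+": 17}

def ratings_consistent(publisher_rating: str, store_rating: str) -> bool:
    """Flag cases where an App Store's rating implies a materially lower
    minimum age than the original publisher's own rating."""
    return (STORE_TO_MIN_AGE[store_rating]
            >= PUBLISHER_TO_MIN_AGE[publisher_rating] - 1)

print(ratings_consistent("PEGI 18", "17+"))  # True: broadly consistent
print(ratings_consistent("PEGI 18", "12+"))  # False: store rating far too lax
```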
Resources to enable children’s advocacy groups’ participation
- The pandemic has had a severe adverse impact on the income streams of a great many charitable institutions. The majority of children’s groups that play an advocacy and representational role in respect of children’s interests are charities. They have not been exempted from this regrettable reality. However, even if there had been no pandemic, it remains doubtful that our collective or individual efforts will be able to match the needs of the coming moment. The OSB and therefore Government policy appear to take no account of this important fact.
- Thus, while the openness and willingness of the Government and Parliament to seek and respond to comments from the public in general and from specialist groups such as the children’s charities is appreciated, the unvarnished truth is many children’s groups will be unable to sustain the level of engagement which will be required once the OSB becomes an Act and Ofcom in particular begins drafting rules which have legal force. That cannot be in the best interests of children.
- Over the past twenty or so years, the UK Council for Internet Safety and its predecessor bodies, the UK Council for Child Internet Safety and the Home Office Task Force on Child Protection on the Internet, have crafted several detailed advice or guidance notes and codes of practice which were the product of processes likely to be similar to those which lie ahead.
- In a similar vein, listed below are some of the areas which have required the attention of children’s organizations, either collectively or individually, in the past few years. They have involved working with many different industry bodies or companies, both wholly UK-focused and with an international dimension.
- Transitioning from primary to secondary school
- Gaming
- Livestreaming
- Parenting Digital Natives
- Vulnerable Children x 2
- Sexting
- In Their Own Words – schoolchildren’s online lives
- Age Verification for adult content
- Home Tech by 2025
- Beyond Covid
- Well-being in a digital world.
- All of the recommendations for action contained in the published documents referred to above were voluntary in nature. They had no legal force but, even so, the different parts of the high-tech industries with an interest still employed lawyers and lobbyists to sit alongside or help the full-time staff who had day-to-day responsibility. Alternatively, they engaged their trade associations to ensure their interests were consistently and strongly represented throughout the drafting and adoption processes. Children’s organizations struggled to keep up.
- Following the passage of the OSB into law, all or most of the codes which Ofcom and others will produce will have legal standing; consequently, their importance will be that much greater, and they are likely to cover a broader range of topics. Thus, while we can expect the level of engagement by online businesses and their associated industry bodies to step up commensurately, the children’s charities, representing children’s voices, will remain where they have been for some time: facing challenges. This means a key partner in the debate and the policy-making processes may at times be working sub-optimally. That does not augur well for the important outcomes which will affect children’s welfare.
- It is not being suggested that civil society organizations, or indeed the Government, will ever be able to match the resources available to the industry in relation to matters of this kind, but it is being suggested that the playing field is at present tipped far too vertiginously against children’s interests. Given children are a core target group for the OSB, this is unacceptable.
- A way must therefore be found to fund some sort of resource centre or base which can act as a support to children’s advocacy groups. Direct funding from high tech companies or their trade associations will not be acceptable. If a compulsory industry levy had already existed, administered by an independent third party, it might have been a sufficiently insulated source of funding, but it does not exist so another source must be found.
- The children’s organizations cannot rely on the generosity of charitable foundations which, in any event, tend to focus their financial support on meeting the immediate needs of children and families who are often in quite desperate situations.
- Asking charitable foundations to divert funds to ensure the likes of Facebook, Google, Apple, Microsoft, TikTok et al are doing right by children, and for this to be sustained over several years, is quite a leap. The same can be said for many individual children’s charities, which have faced great difficulty maintaining funding for their core or historic work and have not yet been able to adjust their levels of funding to accommodate this still relatively new stream of work. While one might expect that to change over time, we are not there yet, and it may be some time before we are.
- In future, might a proportion of the revenues obtained from any fines levied under the Act also be devoted to some aspects of the work in this area (not core funding)?
---ooo---
28 September 2021.