Increased online child sexual abuse: duty of care in the school environment

Dr Rachel O'Connell
trustelevate
Apr 30, 2021


by Dr Rachel O’Connell and Aebha Curtis

New analysis in the Internet Watch Foundation’s (IWF) annual report has revealed the growing risk of children, aged between 7 and 16 years, and particularly girls aged 11–13, being targeted by criminal sex predators.

Predators groom, bully, and coerce their victims into filming their own sexual abuse on internet-enabled devices, often in the child's own bedroom. The images and videos of this abuse are then shared widely online.

IWF experts, who work internationally to find and remove child sexual abuse material from the internet, warn that this abuse now, for the first time, makes up almost half of what they find online.

The IWF’s annual report reveals:

  • In 2020, the IWF confirmed 68,000 cases of 'self-generated' imagery, which now accounts for nearly half (44%) of the imagery the IWF took action on last year.
  • This is a 77% increase on 2019's total of 38,400 reports that included 'self-generated' material.
  • New analysis shows that in 80% of these cases, the victims were 11- to 13-year-old girls.

Lockdowns have led to an increased reliance on online communications and digital media across the population, regardless of age. For adults with a sexual interest in children, it has never been easier to access them, and, by and large, there appear to be no negative consequences for adults who groom children online. The rise of live-streaming platforms, in particular, poses a marked threat to the safety of children online: the perceived ephemerality of the content makes it more challenging for platforms to moderate and emboldens predators, who can use the synchronicity of live streams to bully, coerce, and groom their victims in real time.

Live-streaming

The digital environments in which these activities occur are below the radar of most parents; even if parents and teachers are aware of TikTok and the short-form videos children post and watch on that app, they may not be aware of TikTok Live.

TikTok LIVE allows users and creators to interact in real-time. Users can launch live video streams and may send and receive gifts during a 'LIVE'. In its community guidelines, TikTok says that only users aged 16 or over can launch a LIVE, and that only users over 18 years old can send gifts.

However, TikTok doesn't know the ages of its users, and so cannot enforce these age gates. As a result, children are on TikTok LIVE, typically late at night when their parents are in bed. Children set their phones up beside their beds and chat with their audience, which can consist of other children; but these streams are also open to adult male viewers, who can and do view and interact with them.

Audience sizes can range anywhere from 1 to 200 people viewing and responding to the video via chat. This chat function also enables adult viewers to instruct children to do various things (such as 'bend over' or 'touch yourself') while incentivising them to follow these instructions with in-stream 'gifts'. Pressure is exerted on these children to perform sex acts that are then recorded by the audience and shared widely. This content is the highly prized 'self-generated' child abuse imagery referred to in the IWF report. Other platforms known to be exploited for these purposes include the teen dating site Yubo, which offers the same live-streaming facility, as does YouNow.

Sexual subscription-based content

OnlyFans is a content subscription service. People who create content on OnlyFans can earn money from other users (the 'fans') who subscribe to their channel or page. It allows creators to receive funding directly from their fans every month, as well as one-time tips and pay-per-view (PPV) payments. OnlyFans is strongly associated with sex workers: one of the most popular channels on PornHub, one of the largest pornography sites, is comprised of videos of OnlyFans content. And during the pandemic, more people have turned to OnlyFans to earn money after losing their jobs.

TikTok users are encouraging young girls to sign up to OnlyFans, and young girls are lured by the prospect of popularity and making money. Note that in 2019, OnlyFans did implement an age-verification system to crack down on underage users accessing the site and creating their own accounts: creators are required to submit a selfie with their legal ID in the picture to prove their identity and age. But underage users bypass this verification system by using other people's IDs or photoshopped images.


The narrative is that it is empowering for girls to own their bodies and make money from them. The fact that it is other young girls encouraging children and young people to engage in these activities highlights the altered parameters of grooming and abuse scenarios, as well as the increasing legitimisation and normalisation of such activities, which are now occurring in children and young people's bedrooms.

Offline consequences

And now, with lockdowns easing, predators have offline, real-world access to the children with whom they interacted on live streams, and they may (and do) coerce victims into meeting by threatening to share the content they created. This can result in myriad further harms: among others, children may self-harm in response to the stress of the situation, agree to create more content online to prevent the threatened sharing of the initial content, or go to a real-life meeting with the predator.

The findings of the IWF are shocking, but their implications are perhaps even more so, and merit a coordinated strategic response. In this post, we provide advice on how best to leverage these learnings to foster an environment in which children feel safe to report grooming and online sexual abuse, while also implementing preventive measures.

There are numerous questions about the lack of response to this growing problem. What are law enforcement agencies doing to tackle these issues?

Industry response and awareness-raising

The IWF report doesn't name the platforms from which the 'self-generated' content originates. One would expect a more coordinated, IWF- and industry-led public response to these activities, given that, for example, both TikTok and Yubo are members of the IWF's funding council.

The IWF did produce a highly impactful video titled Home Truths, which highlights the scale of the problem and the disturbing fact that online communication with adults with a sexual interest in children is part of many children's activities inside the family home. However, the campaign video has captured little media attention and garnered minimal public response. More must be done.

Vital to the success of any strategy to prevent online child sexual abuse is a nationwide campaign to raise awareness and counteract these activities’ normalisation. Advice must be provided to parents and other adults, such as teachers and social workers, who are responsible for the wellbeing of children. So, too, should children be receiving advice and guidance regarding the topic of grooming and online sexual coercion.

While the IWF report is illuminating, we must carefully consider the ways in which the learnings from it can be translated into effective safeguarding policies and procedures.

'Self-generated content', according to the IWF, 'can include child sexual abuse content, created using webcams, sometimes in the child's own room, and then shared online'. It is worth considering this label and the roles it seems to confer on the involved parties: 'self-generated' misplaces the onus onto the child and implies an agency that has been negated by predators who engage in grooming, bullying, and coercion to extract this imagery from children. So, while the term may well be useful in the context of this report for distinguishing between child sexual abuse imagery generated via in-person coercion and that generated remotely, it may not be helpful in schools or when encouraging reporting.

Advice

It is vital to make clear to children that they will not be blamed for having been victimised and subjected to online child sexual abuse. School staff must be seen to be non-judgemental and trustworthy.

Note that the grooming process will often involve a predator telling their victims that they love them, such that a child may not perceive their abuser to be a bad person. These issues require a great deal of sensitivity.

Creating an understanding of responsibility, particularly in relation to authority, is important in encouraging children to come forward. Getting a law enforcement officer to come to school to provide reassurance regarding consequences and the allocation of blame may be helpful.

Know that predators get children to introduce them to other children during sleepovers. Parents must be made aware of these patterns and be engaged in active dialogue with the school around digital safeguarding.

Being a victim of sexual abuse in childhood can normalise sexual behaviour and lead to peer-to-peer abuse, so it is essential to be sensitive to the complexities of what victims have experienced and maintain open lines of communication.

This is the first in a series of posts on this topic — the next post will focus on advice and guidance for dedicated safeguarding professionals in schools, and guidance for parents and guardians and most importantly children.
