Self-generated sexual abuse of children aged seven to 10 rises two-thirds: IWF

Incidents of children aged between seven and 10 being manipulated into recording abuse of themselves have surged by two-thirds over the past six months, according to a global report.

Almost 20,000 reports of self-generated child sexual abuse content were seen by the Internet Watch Foundation (IWF) in the first six months of this year, compared with just under 12,000 for the same period last year. The disturbing global trend has grown rapidly since the initial coronavirus lockdown, with cases involving that age group up 360% since the first half of 2020.

To continue reading this report in The Guardian, go to:
https://www.theguardian.com/technology/2022/aug/09/self-generated-sexual-abuse-of-children-aged-seven-to-10-rises-two-thirds

Also see:

20,000 reports of coerced ‘self-generated’ sexual abuse imagery seen in first half of 2022 show 7- to 10-year-olds [news release]
New data* released by the Internet Watch Foundation (IWF) shows almost 20,000 webpages of child sexual abuse imagery in the first half of 2022** included ‘self-generated’ content of 7- to 10-year-old children.

That is nearly 8,000 more instances than the same period last year. And when compared to the first half of 2020, when the UK entered its first Covid lockdown, there’s been a 360% increase in this type of imagery of 7- to 10-year-olds.

UK-based IWF is Europe’s largest hotline dedicated to finding and removing images and videos of child sexual abuse from the internet. It is the only European hotline with the legal powers to proactively search for this kind of content.

It is calling the rapid growth of this material showing primary-aged children a social and digital emergency, one that needs focused and sustained effort from the Government, the tech industry, law enforcement, and the education and third sectors to combat.

Susie Hargreaves OBE, IWF Chief Executive, said: “There is no place for child sexual abuse on the internet and we cannot simply accept, year on year, that sexual imagery of children is allowed to be exchanged without constraint online.

“That’s not to say a huge amount of effort isn’t taking place to combat it – there is. At the IWF we identify this imagery, work with partners and tech companies globally to get it removed and provide services and datasets to tech companies to stop known imagery from being re-shared, and re-uploaded. But clearly more needs to be done and it’s not a problem which sits solely with one group or sector. When we work together, we can create impact and it’s needed now, more than ever.

“Child sexual abuse which is facilitated and captured by technology using an internet connection does not require the abuser to be physically present, and most often takes place when the child is in their bedroom – a supposedly ‘safe space’ in the family home. Therefore, it should be entirely preventable. We need to attack this criminality from several directions, including providing parents and carers with support to have positive discussions around technology use and sexual abuse, within the home.

“Children are not to blame. They are often being coerced, tricked or pressured by sexual abusers on the internet.

“Only when the education of parents, carers and children comes together with efforts by tech companies, the Government, police and third sector, can we hope to stem the tide of this criminal imagery. That is why the Online Safety Bill is so essential. Children everywhere need the UK Government to be role models in internet regulation and for Ofcom to acknowledge experts, like IWF, who are experienced in tackling this criminality online. The efforts in this area are being watched around the world.”

Photo of IWF CEO Susie Hargreaves
Susie Hargreaves OBE, IWF CEO

While the fastest increases are among the 7-10 age group, the 11-13 age group still accounts for the largest share of ‘self-generated’ imagery. A 137% increase was also seen in this subset (self-generated) of sexual abuse content specifically showing boys aged between 7 and 13 years old.

‘Self-generated’ child sexual abuse imagery is created using webcams or smartphones and then shared online via a growing number of platforms. In some cases, children are groomed, deceived or extorted into producing and sharing a sexual image or video of themselves.

Emergence of new sites containing imagery of 7- to 10-year-olds:

Detecting and removing child sexual abuse material is complex.

IWF finds vast amounts of ‘self-generated’ child sexual abuse material being distributed through online forums, with the criminal images displayed on those forums pulled from image host sites. When the IWF takes action, it works to remove the forum postings as well as each underlying image.

The top five sites used to store and distribute self-generated child sexual abuse imagery of 7- to 10-year-olds in the first half of 2022 were new. This means the IWF had not seen them used for this subset (self-generated content of 7- to 10-year-olds) of child sexual abuse material before.

Rosa, an IWF Analyst, said: “Every day, I see children who have been asked to remove their clothes, stand naked or perform in front of a camera.

“They’re asked to show close ups of their genitals and sometimes use household objects to masturbate with. This happens in their bedrooms, mostly, where we see toys, laundry baskets, posters on walls, teddy bears and wardrobes full of clothes.”


Home Secretary Priti Patel said: “The cruelty and inhumanity of people who abuse children is appalling. Since becoming Home Secretary, I have been unequivocal in my backing of law enforcement to go after those disgusting offenders who abuse children both in the U.K. and abroad. I have led every international effort to tackle this abuse and persuaded my international counterparts to do the same. 

“Online child sexual abuse has a lifelong effect on victims. I have pursued policies and actions to ensure technology companies are held accountable for keeping our children safe.”

Priti Patel Portrait
Rt Hon Priti Patel MP

DCC Ian Critchley, NPCC lead for Child Protection and Abuse Investigation, said: “The scale of the rise in this imagery should shock us all – and once again highlights the need to drive focus within society, and in our whole system response; across policing, third sector partners, HM Government, and the technology companies and platforms that host this appalling content and enable the exploitation of children. As a society, we need to stop online abuse happening in the first place. We need companies and platforms to fulfil their moral obligations – and under the Online Safety Bill their legal duty – to keep the online communities they create as safe as possible.

“I’m proud to partner with IWF in its work to stop the recirculation of images, build technology that makes it easier to remove images, and liaise with industry as we seek to navigate ever increasing technological developments, to ensure we continue to prevent harm to our children.

“We know that we are still seeing the effect of the Covid-19 pandemic – with lockdown restrictions increasing the vulnerability of children to online sexual abuse, with more children online and unsupervised, and vulnerable children having less interaction with professionals throughout the lockdown.

Portrait photo of DCC Ian Critchley
DCC Ian Critchley, NPCC lead for Child Protection and Abuse Investigation

“Policing restates its commitment to bringing offenders to justice. Every day this commitment is shown across the country by my policing colleagues, who arrest offenders seeking to commit the most appalling acts of abuse by grooming, coercing and exploiting children online. This allows us to safeguard and protect many children every single day. But the responsibility goes far beyond policing: ultimately the biggest difference could and should be made by the tech companies and platforms who have made these online communities, and they should be the ones held to account for keeping the children using them safe.

“Our highly skilled and dedicated digital forensic investigators must face ever increasing technological capabilities, where offenders can and will seek to hide their offending in every way possible. We are continuing to enhance our digital forensic capability, we have increased resources in online child abuse teams and specialist officers who identify those who would groom and exploit children online.

“We must aspire to make sure the online world is a safe place for children. These figures from the IWF today show there is much for us all to do to achieve this aspiration.”

Sarah Blight, Deputy Director for child sexual abuse at the National Crime Agency, said: “Offenders are using apps, online games and social media platforms to seek out children and coerce them into sharing sexual images of themselves. Not only has the internet made it easier for this initial contact to happen, it also facilitates the sharing of child sexual abuse material between offenders.

“What we then have is a permissive environment for them to develop their sexual interest in children and normalise their behaviour. It is essential that there is a whole system response by all of society, including the tech industry, to tackling online child sexual abuse and the proliferation of such images.

“Combatting this ever-increasing threat is a priority for the NCA and UK policing. Our coordination of the overall UK law enforcement response to online CSA saw 13,447 children protected or safeguarded and 10,181 offenders arrested or interviewed last year. This included more of those classified as high-harm than ever before.

“Education is also a key part of our response and we work with professionals, parents and carers, children and young people, to try and prevent online sexual abuse happening in the first place.

“Our aim is to reduce the vulnerability of children and young people so they don’t become victims. We also want to ensure the adults in their lives have the tools they need to offer support and talk to them about who they are communicating with, and what they are sharing online.

“Advice and resources are available at thinkuknow.co.uk.”

Parents and carers are encouraged to T.A.L.K to their children about the dangers. Visit talk.iwf.org.uk for a parent-and-carer-friendly guide on preventing this type of abuse.

  • Talk to your child about online sexual abuse. Start the conversation – and listen to their concerns.
  • Agree ground rules about the way you use technology as a family.
  • Learn about the platforms and apps your child loves. Take an interest in their online life.
  • Know how to use tools, apps and settings that can help to keep your child safe online.

Notes to editors:

Contact: Cat McShane, IWF Press Officer catherine.mcshane@iwf.org.uk +44 (0) 7572 783227

This year the IWF is marking its 25th anniversary. Since it began, 1,800,000 reports have been assessed by IWF analysts. 970,000 child sexual abuse reports have been actioned for removal. As each report contains at least one, and sometimes thousands of images, this equates to millions of criminal images removed from the internet.

*Data quoted in this press release:

**1 January 2022 to 30 June 2022. Figures quoted are compared to the same periods in 2021, and 2020.

Data tables showing age comparison

Examples of self-generated child sexual abuse featuring 7- to 13-year-old children, as described by IWF Analyst, Rosa.

Boy 1:

“A boy in early puberty is standing, naked in his bedroom. His webcam is angled to show his naked body from his head to his knees, making his naked and erect genitals the focus of the screen. The video, which is just over one minute long, shows the boy masturbating. His narrow shoulders and undeveloped body tell me he is no older than 12 years old. He says nothing, but he is fully exposed.

“Behind him is a normal boy’s bedroom: clothes hanging in the wardrobe, a desk, and some posters on the wall. The edge of a blanket is visible at the top of the screen, as though it’s partially covering the webcam. This tells me he is ready to quickly shut down or hide what he’s being asked to do. The grown-ups at home may never know what has happened, but it’s clear he is dealing with this big burden alone.”

Girl 1:

“In another instance, a video opens with a young girl singing childishly to camera. Fifteen seconds into the video, she moves her face close to her device, reading something on the screen. Immediately after, she stands, and moves the device’s camera towards her already naked genitals. She is not speaking English, but her voice, her size and her movements give away just how young she is – between 7 and 10.

“She is on her bed, in her tidy, bright bedroom with girly patterns and a pink duvet. It’s daytime and I can hear faint household noise and traffic on the video’s audio. The way she interacts with her device tells me she is responding to direction from a remote abuser. She poses, dances awkwardly and focuses the camera on her genitals. She presses various objects against her genitals. Her eyes focus on the device, looking for instructions on what to do next. It feels strange to see her smiling, but then I realise she is in the safety of her bedroom. Her comfort in her safe space is being exploited by this abuser in front of my eyes.”

Girl 2:

“Another video shows a very small girl between 6 and 8, kneeling in her bedroom with a device on the floor. She is in her knickers, with her shelves of toys and a full laundry basket behind her. She stares at the screen and removes her underwear, holding it up to the camera. Staring at the screen, puzzled, she starts touching her genitals. From the confused expression on her face, I know she doesn’t understand what she is doing or why. She shyly draws her knees in and pulls her top down, and the video cuts out.”

The IWF is a partner in the UK Safer Internet Centre (UKSIC), with SWGfL which provides support and resources for the following:

  • The Harmful Sexual Behaviour Support Service which gives guidance and support to professionals working with children on harmful sexual behaviours, including self-generated sexual imagery, sharing sexual content online and sexual harassment and abuse. Call SWGfL practitioners on the HSB Support Service on 0344 2250623 or email: hsbsupport@swgfl.org.uk
  • So You Got Naked Online is a resource offering advice and strategies for parents, children and young people who have shared a sexting image or video online and have lost control over that content and who may be sharing it.
  • Project Evolve is an award-winning digital literacy toolkit for professionals in education. Developed by SWGfL, the toolkit defines and provides the digital skills for children and young people to be safe online and avoid exploitation.

Background from the UK Government’s Home Office:

  • The Government is firmly committed to tackling all forms of child sexual abuse. Our approach is underpinned by the Tackling Child Sexual Abuse Strategy (published in January 2021), which sets out a whole-of-system response, including supporting frontline professionals in education, social care, health, law enforcement and the wider criminal justice system to confront child sexual abuse.
  • The Bill will deliver on the government’s manifesto commitment to make the UK the safest place in the world to be online while defending freedom of speech.
  • The government is introducing an amendment to the Online Safety Bill to strengthen Ofcom’s powers in relation to notices to deal with CSEA. This amendment will give Ofcom the power to require companies to not only use accredited tech, but to require companies to develop/source or deploy technology to prevent, identify and remove CSEA content, where no accredited technology is available.
  • Tech firms will need to have effective systems and processes in place to minimise the presence of illegal content such as CSEA and remove it. The strictest obligations in the Bill are for child sexual abuse content, including having to minimise and remove grooming content. Companies will additionally need to report CSEA content to the NCA. If they fail to do so they will be held to account through huge fines, service suspension and their bosses could be held criminally liable.
  • The Government’s five-year Child Abuse Image Database (CAID) Transformation programme will help law enforcement to manage the scale of child sexual abuse material (CSAM) by further enhancing the CAID system, enriching data and allowing greater sharing of data and capabilities.
  • https://www.iwf.org.uk/news-media/news/20-000-reports-of-coerced-self-generated-sexual-abuse-imagery-seen-in-first-half-of-2022-show-7-to-10-year-olds/
