Exploitation through the Screen: The Rise of Self-Generated Abuse Material
The following details are from a true case which ICMEC supported. Names have been changed to protect the identity of the child and their family.
Luisa is turning 15 in a few days. What should be a day of celebration instead feels like a ticking time bomb. Last week, Luisa’s “boyfriend”* of two years asked her to send him photos. Not just a regular selfie, but a photo without her clothes on. She didn’t want to at first, but she loved him and wanted to make him happy, so she finally did it. Now he is asking for more explicit photos, threatening to send them to her family on her birthday if she doesn’t comply. Luisa doesn’t know what to do.
Her friends are worried about her, so they tell her parents. Luisa feared they would be mad at her, but they are only concerned and want to help. Her parents tell the police. ICMEC connects the authorities in her home country with the police where her “boyfriend” lives, and two days later he is arrested. Luisa’s 15th birthday doesn’t end in tragedy but in relief that the nightmare is over.
This is a real story and, unfortunately, it is not a unique one. Luisa’s story is an example of a rising trend: the use of coerced self-generated material to exploit children.
*Luisa believed the offender was her boyfriend, but this blog will refer to him in quotation marks because this was not a consensual relationship.
What is Self-Generated Child Sexual Abuse Material (SG-CSAM)?
Self-generated CSAM consists of sexually explicit photos or videos that a child (a person under the age of 18) captures of themselves. This term is widely used; however, special attention should be paid to the context and implications of its use. Firstly, ‘self-generated’ refers only to the fact that the content is technically created by the child. It does not mean the content is consensual or legal, nor should any blame be placed on the child.
Secondly, the term ‘self-generated’ can also encompass ‘sexting,’ the intentional, consensual sharing of intimate photos or videos between children, often teenagers, in romantic relationships. However, this article is not addressing the consensual sharing of nude photos but rather the rise in exploitative self-generated material produced through coercion by an offender, often using grooming or sextortion tactics.
Our words matter when speaking about these issues, and it is essential to emphasize that SG-CSAM is a term that should be used within the proper context and without shifting blame to the child who is being exploited. Please see the Luxembourg Guidelines F.4.iv Self-generated sexual content/material to learn more.
The Issue
There has been an exponential increase in the volume of self-generated abuse material in recent years. The Internet Watch Foundation (IWF) received almost 20,000 reports of coerced SG-CSAM in the first six months of 2022, an increase of more than 65% over the same period in 2021. The IWF has also reported specific increases in self-generated material involving 7- to 10-year-olds, with a 360% increase in these reports since the first half of 2020. These disturbing global trends demonstrate the urgent need to address this rising form of exploitation and abuse.
One reason for this increase is the impact of the COVID-19 pandemic. Unable to physically access children, many offenders sought online alternatives. This, coupled with an increase in children’s time online, created more opportunities for offenders to build relationships with children and instruct them to provide self-generated CSAM as part of their exploitation.
Another potential reason for this increase is that the sharing of self-generated images is becoming an increasingly normal part of children’s social behavior, with many reporting that they feel sending nude photos is a natural part of their sexual exploration. However, the most harm is often caused by the non-consensual resharing of these images, especially if an offender has groomed or sextorted a child into producing the content in the first place. This is especially concerning given Thorn’s 2021 finding that 1 in 6 minors believed their friends at least sometimes non-consensually shared another child’s nude images. This marked an increase in the perceived normalcy of re-sharing SG-CSAM over the previous two years, and even this figure likely understates the true scale, because fear of consequences discourages reporting. That same fear and shame are exactly why children won’t reach out to a trusted adult when they are being exploited. It is essential that children, parents, and other trusted adults are given the education and the tools to have these conversations, which can help prevent exploitation and enable an effective response when a child needs help.
What can you do?
Luisa is a real 15-year-old girl living in Argentina, and when her parents reported what had happened to her, the Argentine Prosecutor’s Office took immediate action. Having traced her “boyfriend’s” phone number to Spain, they reached out to ICMEC for an immediate connection to the Spanish National Police. Within 48 hours the man exploiting Luisa was arrested in Spain — just in time for her to celebrate her 15th birthday without the threat of exploitation.
Stories like Luisa’s are unfortunately happening to children all around the world, and they demonstrate the important role every person plays in protecting children from sexual exploitation. Parents must educate their children about the potential risks they face online, encourage conversations around online safety, and show their children that they can be a trusted source of support for their online lives. Technology companies must ensure that their platforms are safe for children and make education and prevention a priority. Criminal justice professionals must be equipped with the proper tools and training to respond quickly and effectively to cases of child sexual exploitation and abuse. Organizations like ICMEC must exist to provide the resources, support, and connections to those doing the work on the ground.
We all have a role to play in protecting children. Start finding yours today by downloading the list of resources below.
Read ICMEC’s report on Sexual Extortion and Nonconsensual Pornography to see how this issue impacts children around the globe and better understand obstacles to prevention, policy intervention, and prosecution.
Resources for Online Safety:
- Gaming, devices, and what you need to know: a resource for parents with advice on how to set privacy and safety controls on different devices and in different games
- National Online Safety Guides: online safety guides for specific apps, devices, games, and more
- Online blackmail and sexual extortion response kit: a roadmap for adults helping a child navigate a case of online sexual extortion
- Parent and carer social media starter kit: everything parents and carers need to know as their children begin using social media
- Thorn For Parents: equips parents and caregivers to have conversations with their children about online safety and to understand the unique risks when sexual exploration and technology mix
- Other resources for parents and carers from Think U Know (Australian Federal Police): resources on multiple topics related to online safety
- Take it Down: This service is one step you can take to help remove online nude, partially nude, or sexually explicit photos and videos taken before you were 18.
- INHOPE Hotlines: If you are a minor and you’ve come across sexual content of yourself or another minor, use this resource to contact your local hotline to have it removed.
- Report Remove: Report Remove can support a young person in reporting sexual images or videos shared online and enables them to get the image removed if it is illegal.