The Growing Global Threat of Child Sexual Abuse Material (CSAM)

Our reliance on the internet to facilitate our personal and professional lives has increased substantially during the COVID-19 pandemic. However, the same technology that provides us with opportunities for growth and connection can also present significant risks to children.

There is a hidden war being waged against the exploitation and abuse of children and the internet is the battleground.

Child sexual abuse material, or CSAM, is a growing global threat. CSAM is defined as “any representation, by whatever means, of a child engaged in real or simulated explicit sexual activities or any representation of the sexual parts of a child for primarily sexual purposes”¹. The production and spread of this harmful material have increased exponentially over the past decade. According to the U.S. National Center for Missing & Exploited Children, there were 100,000 reports of online CSAM in 2008. By 2014 that number had grown to more than 1 million, and in 2021, there were more than 29 million reports, including nearly 85 million images, videos, and other files of potential abuse and exploitation².

Adapted from: The New York Times

Each one of those 85 million files potentially represents a child’s abuse, a child’s trauma. Moreover, every time these images or videos are viewed or shared, the child depicted is revictimized — forced to live with the fact that their abuse is out there for others to see. And they can rarely do anything about it.

The epidemic of child sexual abuse material transcends borders, geography, and politics. It hurts children and families of all backgrounds, races, ethnicities, economic and social classes. No country or community is immune — from Guatemala (65,076 reports in 2021) to Iraq (1,220,470) to India (4,699,515) — children everywhere face the awful threat of online exploitation and abuse.

Child sexual exploitation and abuse is a global problem that demands a global solution. That is one of the many reasons why the International Centre for Missing & Exploited Children (ICMEC) has partnered with the Internet Watch Foundation (IWF) to launch a joint Portal to Report Child Sexual Abuse Material, which aims to give every person around the world access to a safe, anonymous way to report CSAM. The portal enables people like you to join the battle against online CSAM by reporting this illegal and harmful content. Reports submitted to report.icmec.org are analyzed by IWF’s team of professionals, who work to remove the content from the internet, sometimes in as little as a few hours. It is vital that this material is reported and removed, both to protect child victims from revictimization and to make the internet a safer place for everyone.

Currently, 79 countries do not have access to a CSAM reporting portal or hotline. ICMEC’s collaboration with IWF offers an immediate solution to provide anyone, anywhere with access to a safe, anonymous place to report CSAM.

Through this joint reporting portal and our collective efforts to address CSAM, we are creating a safer internet and safer world for every child. We all have a responsibility to protect children from exploitation and abuse. When we unite as a global community of child defenders, we can make a difference — you can make a difference.

Access the portal to report instances of child sexual abuse material online here: report.icmec.org

