
Detection & Investigation

Thanks to state institutions, private organisations, and activists, millions of items of child sexual abuse material are reported every year, and the numbers keep growing. These materials are often the only trace that a crime has been committed, as victims typically remain silent. Collecting reports is the first step towards launching an investigation and safeguarding victims. Regrettably, the sheer volume of leads and cases overwhelms law enforcement agencies, making swifter and more efficient report processing essential.

As perpetrators' methods shift constantly alongside technology and societal habits, practitioners must adjust to this ever-changing environment. This requires improved strategies and state-of-the-art tools to help identify offenders and victims and to speed up the processing of thousands of cases.

Moreover, legislation and clear guidance are required for organisations to manage and assess the risk of hosting CSAM. We must strengthen the regulations on internet service providers and technology companies to ensure they are proactively identifying and reporting CSAM on their platforms and services. 

What you should know


In 2022, NCMEC’s CyberTipline received more than 32 million reports of suspected child sexual exploitation. (Source)

Reports made to the CyberTipline in 2022 included millions of images and millions of videos.

About one in five children in Europe are victims of some form of sexual violence.

This includes grooming, exhibitionism, sexual touching, sexual harassment, rape, exploitation in prostitution and pornography, online sexual extortion and coercion. (Source)

Only 1 in 8 children who are sexually abused are known to the police and children's services. (Source)


What the COVID-19 pandemic has changed

The spread of CSAM is closely tied to the availability of the internet. According to EUROPOL, during the COVID-19 pandemic the volume of new posts containing illegal materials, and of attempts to engage in child sexual exploitation, rose significantly.

“With children spending more time online due to the various restrictions introduced in response to the COVID-19 pandemic, the potential increase in demand for CSAM and attempt to engage in child sexual exploitation has been a considerable threat. […] Referrals to illegal websites with CSAM have also increased. There have been also reports received that minors were targeted with pornography during hacked Zoom conversations. Attempts to log into blocked child pornography sites have also appeared to rise.” (Source)

“EU Member States have reported an increase in the number of blocked attempts to access websites featuring CSAM during their lockdowns. Such reports are in line with figures from the Internet Watch Foundation, which reported almost 9 million blocked attempts to access CSAM in April in the United Kingdom alone. In some countries, this is matched by an increase in the number of reported CSAM offences, such as online solicitation and sextortion.” (Source)


Image Hash List

Use the power of tech-for-good to prevent the upload and proliferation of known child sexual abuse imagery across your networks and platforms with the Internet Watch Foundation Image Hash List. This list is a collection of digital fingerprints (hashes) created from IWF-assessed images of child sexual abuse. It is a revolutionary tool that allows ICT companies to stop the upload, storage and sharing of “potentially millions” of child sexual abuse images online. 
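
At a technical level, matching against a hash list can be as simple as computing a digest of each uploaded file and checking it against the list. The sketch below is a minimal illustration in Python, assuming a plain SHA-256 hash list with a made-up placeholder entry; the real IWF Image Hash List is distributed to members under licence and also includes perceptual (PhotoDNA) hashes so that re-encoded copies still match.

```python
import hashlib

# Hypothetical placeholder entry for illustration only; real lists such
# as the IWF Image Hash List are distributed to members under licence.
KNOWN_CSAM_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: str) -> str:
    """Stream the file in chunks so large uploads don't exhaust memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str) -> bool:
    """True if the file's digest matches a known-CSAM hash."""
    return sha256_of_file(path) in KNOWN_CSAM_HASHES
```

Note that an exact cryptographic hash only catches byte-identical copies; any resize or re-compression changes the digest entirely, which is why deployed systems pair it with perceptual hashing (sketched further below).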

Benefits of using the IWF Hash List for good

  • Tech companies can protect employees and customers from accidentally stumbling across known child sexual abuse imagery on their platforms.

  • They can make sure staff welfare is prioritised by shielding teams from having to view this distressing imagery.

  • Every one of the images and videos on our Hash List has been viewed and verified by expert human eyes. Companies can be sure the pictures are criminal.

  • IWF hashes cannot be reverse-engineered back to the images. This means anyone handling this sensitive data is completely safe.

  • Preventing criminal imagery of children from being uploaded onto networks in the first place helps stop the cycle of abuse. It’s the right thing to do.

Learn more about the hash list and how it can help detect CSAM and protect children at the Internet Watch Foundation.

How CSAM is detected

The work of law enforcement is critical to detecting and acting on content identified online, and hotlines play a crucial role in getting that content removed from the internet. In practice, child sexual abuse detection means organisations screening their platforms for CSAM, typically using hash lists.

Technology companies and online communication services use automated detection to voluntarily identify, evaluate, report, and reduce the scale of harmful content found on their platforms. However, legislation is needed to ensure that such content is proactively scanned for and identified.
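
Automated detection typically relies on perceptual rather than cryptographic hashing, so that a resized or re-compressed copy of a known image still matches. PhotoDNA, the most widely used algorithm, is proprietary; the sketch below uses the much simpler public "average hash" as a stand-in to show the idea, with an illustrative match threshold.

```python
from PIL import Image  # Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Perceptual 'aHash': shrink to 8x8 grayscale, threshold each pixel
    on the mean, and pack the 64 resulting bits into an integer."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Illustrative threshold: two images are treated as near-duplicates if
# their hashes differ in only a few bits, surviving resizing/re-encoding.
MATCH_THRESHOLD = 5

def near_duplicate(path_a: str, path_b: str) -> bool:
    return hamming(average_hash(path_a), average_hash(path_b)) <= MATCH_THRESHOLD
```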

The European Union is actively exploring ways to extend and potentially make permanent regulations aimed at CSAM. Currently, interim rules allow companies to voluntarily detect, report, and remove CSAM, but these are set to expire. EU member states have expressed a desire to extend these interim rules, with discussions ongoing about establishing a permanent law.

AI-based technologies can be a blessing and a curse. LEAs can use AI to detect and classify CSAM, supporting investigations and reducing the burden on the individuals who would otherwise have to classify the material manually.
 
However, AI-generated CSAM can be indistinguishable from real material, adding to the authorities' workload and leaving them less time and fewer resources to identify real victims.
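
To make the triage idea concrete, the sketch below ranks incoming reports with an invented scoring rule; the fields and weights are hypothetical, and real tools such as AviaTor (featured below) combine far richer signals from report metadata.

```python
from dataclasses import dataclass

@dataclass
class Report:
    report_id: str
    classifier_score: float  # model confidence the material is CSAM (0-1)
    known_hash_match: bool   # matched an entry on a verified hash list
    new_material: bool       # no prior match anywhere: possible new victim

def priority(r: Report) -> float:
    """Hypothetical rule: verified matches are actionable, but suspected
    *new* material ranks highest, as it may point to ongoing abuse."""
    score = r.classifier_score
    if r.known_hash_match:
        score += 0.5
    if r.new_material:
        score += 1.0
    return score

reports = [
    Report("A-1", 0.91, True, False),
    Report("A-2", 0.62, False, True),
    Report("A-3", 0.40, False, False),
]
for r in sorted(reports, key=priority, reverse=True):
    print(r.report_id, round(priority(r), 2))
```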

Key Recommendations

We need to identify the legal barriers that prevent LEAs from collaborating and sharing data internationally, and establish mechanisms that can be shared amongst organisations to overcome these obstacles.

Hash sets of known victims should be regularly shared across organisations and jurisdictions to reduce duplication of effort arising from a lack of global coordination and ensure victim ID databases are being kept up to date.
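
A minimal sketch of what such sharing could look like, assuming each organisation contributes a plain set of hash strings (the names and values are hypothetical): merging the sets deduplicates entries and tells each partner which hashes it was missing.

```python
from collections import defaultdict

def merge_hash_sets(contributions: dict[str, set[str]]) -> dict[str, set[str]]:
    """Map each unique hash to the set of organisations that hold it,
    so updates can be deduplicated and audited."""
    merged: dict[str, set[str]] = defaultdict(set)
    for org, hashes in contributions.items():
        for h in hashes:
            merged[h].add(org)
    return dict(merged)

shared = merge_hash_sets({
    "hotline_A": {"h1", "h2"},
    "agency_B": {"h2", "h3"},
})
# 'h2' is held by both partners, so it is stored once; each partner can
# also see which entries it was missing before the exchange.
missing_for_A = {h for h, orgs in shared.items() if "hotline_A" not in orgs}
```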

Identify feasible end-to-end welfare initiatives that can be implemented to improve the wellbeing of officers and staff working on CSAE-related cases. These initiatives should consider the use of technology to reduce exposure to disturbing content and identify key intervention points where individuals will be proactively assessed and supported.

Strengthening the regulations on internet service providers and technology companies is crucial to ensure they are proactively identifying and reporting CSAM on their platforms and services.

Featured EU detection actions

ARICA

EU Funded

ARICA – Assessing Risk Indicators of Child Sexual Abuse


Coordinator:
Police University College of Finland

CYCLOPES

EU Funded

CYCLOPES: Fighting Cybercrime – Law Enforcement Practitioners’ Network

The project is a 5-year action that addresses various aspects and thematic areas of cyber-related crime. The network forges partnerships between practitioners from across Europe (and beyond), to contribute to different activities that support Europe's response to cyber-related crime, including CSEA.

Coordinator:
Polish Platform for Homeland Security

AviaTor

EU Funded

AviaTor – Augmented Visual Intelligence and Targeted Online Research

AviaTor is an efficient tool that helps law enforcement prioritise all aspects of NCMEC reports, so that investigators can focus on identifying perpetrators and saving victims.

Coordinator:
National Police of the Netherlands

GRACE

EU Funded

GRACE – Global Response Against Child Exploitation

GRACE aims to equip European law enforcement agencies with advanced analytical and investigative capabilities to respond to the spread of online child sexual exploitation material.

Detection is just one of three pillars of countering CSAE!

Check also:

Funded by
the European Union

The website is funded through contributions from various projects, including several EU‑funded initiatives — you can find more details about them on the About Us page. However, the views and opinions expressed are those of the author(s) of specific publications only and do not necessarily reflect those of the European Union.

Neither the European Union nor any other granting authority can be held responsible for the content.
