“THE EUROPEAN PARLIAMENT HAS FAILED CHILDREN” - SURVIVORS OF CHILDHOOD SEXUAL VIOLENCE AND CHILD RIGHTS ACTIVISTS ISSUE DAMNING ASSESSMENT OF THE PARLIAMENT’S COMPROMISED VERSION OF CHILD SEXUAL ABUSE REGULATION

November 9, 2023: A coalition of over 80 organisations and survivors committed to advancing children’s rights and the elimination of sexual abuse has written to EU leaders to express deep concern at the European Parliament’s compromise on the Regulation to prevent and combat child sexual abuse. The organisations, representing a collective frontline against child sexual abuse, stated: “We cannot accept a step back from the current situation in the protection of children online.”

The European Parliament’s compromise was reached on October 26 after months of negotiations. This position marks a significant shift from the Commission’s original proposal: it would greatly limit online service providers’ ability to detect child sexual abuse material by imposing targeted detection, putting an end to the voluntary detection currently carried out by platforms, and excluding grooming from the scope of detection.

The coalition denounces the European Parliament’s approach, which would restrict detection of child sexual abuse to the point of making it ineffective against these crimes at scale. They argue that the Parliament has not only failed to strengthen protections for children online, but has in fact proposed a step backward from the measures already in place today, because:

  • Voluntary detection is critical: without it, platforms cannot properly safeguard their services or protect children. Significant protection gaps will open as soon as voluntary detection by platforms is no longer allowed, as shown by the 58% drop in reports in 2021 during the period when the EU legal framework was not yet in place. Further gaps will emerge during the long process leading to a detection order.
  • Targeted detection of suspects is not effective in protecting children: if detection of images and videos is only allowed once a suspect has been identified, detection will be limited to an extremely small number of images and children, enabling potential abusers to easily continue perpetrating abuse under the radar of law enforcement.
  • Targeted detection of suspects is also unworkable in practice given the scale of the phenomenon: platforms reported 88.3 million images and videos of child sexual abuse online last year alone, and this is only the tip of the iceberg. Across the globe, thousands of predators are constantly opening new online accounts to target children and share millions of images and videos depicting child sexual abuse. How will we detect and remove content at this scale if we cannot use technology to do so?
  • Grooming needs to be tackled now: reports of grooming increased by 82% from 2021 to 2022. Preventing both online and offline manifestations of child sexual abuse is an essential part of the solution. Deciding against detecting grooming means giving up on the possibility of preventing future harm in the first place.

Mié Kohiyama, survivor, member of the Brave Movement and founder of Be Brave France said: “The number one demand from survivors is to have their material removed from the internet and this can simply not be done without detection tools. I do not know if I will ever truly heal the trauma of being raped as a 5-year-old and I can’t imagine the horror of my abuse being online, available for predators and criminals to view and monetize. But this is the fate of thousands of children and survivors. Now is the time to ensure a workable, solid and effective EU regulation fit for its purpose. We call on the co-legislators of the EU to fulfill their commitment to protect children against sexual abuse now.”

Survivors and child rights advocates are calling on Member States to take a stronger position to protect children from sexual violence online and to ensure a wide scope for the detection of child sexual abuse material and grooming on the internet.