The European Parliament adopted on Tuesday (6 July) the final version of the ePrivacy derogation, a temporary measure enabling providers of electronic communication services to scan and report private online messages containing material depicting child sex abuse. The provisions also allow companies to apply approved technologies to detect grooming techniques.
“This interim regulation ends uncertainty for companies. It does not end danger to children. This is only a temporary solution to fix an acute emergency. We need a permanent answer to counter a persistent threat against children,” said Ylva Johansson, Commissioner for Home Affairs, presenting the legislation in the European Parliament on Monday (5 July).
According to the European Commission, almost 4 million images and videos containing child abuse were reported last year. Over the same period, 1,500 reports were filed for grooming, a technique sexual predators use to befriend minors. According to Europol, the situation has worsened further during the COVID-19 pandemic.
Privacy vs. child protection
The new regulation provides a legal framework for tech companies to monitor interpersonal communications on a voluntary basis with the purpose of detecting and reporting material depicting sexual abuse of minors or attempts to groom children.
Thorn, a US-based NGO, welcomed the new EU legislation, pointing to a 15,000% increase in reported files in the past 15 years. “We’re facing a growing, global crisis of online child sexual abuse, and we simply cannot afford to move backwards in this fight,” said Thorn VP Sarah Gardner.
The regulation has, however, been criticised as too intrusive, enabling the indiscriminate monitoring of private communications. The European Data Protection Supervisor (EDPS), the EU privacy watchdog, questioned the measure's compatibility with the fundamental right to privacy in a non-binding opinion last year. Similar concerns were expressed in a report from the Council of Europe.
The scanning of private conversations will be conducted through automated content recognition tools, powered by artificial intelligence, but under human oversight. Service providers will also be able to use anti-grooming technologies, following consultation with data protection authorities.
Diego Naranjo, head of policy at European Digital Rights (EDRi), told EURACTIV the proposal was “rushed”, and failed to strike a balance between the right to privacy and the need to protect children online because “the discussion was instead shifted from rational to emotional arguments.”
Controversial result
In negotiating the final text, the European Parliament raised several privacy concerns. For Birgit Sippel, the MEP leading the negotiations, the main improvements included more clearly informing the user about the possible scanning of communications, along with clear data retention periods and limitations on the technology’s deployment.
However, MEP Patrick Breyer criticised the final compromise, saying that the automated tools flag non-relevant content in 86% of cases, disclosing allegedly suspicious communications to private organisations and police authorities without the user being notified.
Privacy concerns relate to the risk of exposing legitimate private communication among adults, such as nude pictures or sexting, potentially opening the door to abuse.
For Leanda Barrington-Leach, head of EU affairs at 5Rights, the new law provides “a proportionate response to the very real, ongoing and heinous crime of child sexual abuse with due consideration to the potential abuses of privacy rights and the prioritisation of the best interests of the child, as required by the EU Charter.”
Alexander Hanff, a former victim of child abuse, disagrees. He openly criticised the provision, arguing it will deprive victims of channels for confidential counselling. “It will not prevent children from being abused, it will simply drive the abuse further underground, make it more and more difficult to discover it. It will ultimately lead to more children being abused,” Hanff said.
Long term plans
As of the beginning of 2021, the definition of electronic communications under EU law has changed to also include messaging services. As a result, private messages no longer fall within the scope of GDPR, the EU privacy framework, but instead fall under the ePrivacy directive of 2002.
While GDPR includes measures to detect child sexual abuse, the ePrivacy directive does not. The change of legal regime led many online providers to stop voluntary reporting, which has fallen by 53% since the beginning of the year.
In 2017, the European Commission put forward a proposal to revise the ePrivacy directive, but negotiations have stalled for years, meaning temporary measures had to be put in place instead. The temporary solution will remain in place until 31 December 2025, but will be repealed as soon as the revised ePrivacy directive enters into force.
The Commission is expected to present a proposal for comprehensive legislation to fight child sexual abuse online and offline before the end of the year.