In a major step towards enhancing online safety for children, EU lawmakers have reached an agreement to draft legislation that will require tech giants like Google, Meta and other online service providers to detect and swiftly remove child sexual abuse material (CSAM) from their platforms.
The proposed rules, which were put forth by the European Commission last year, have sparked intense debates between online safety advocates and privacy activists worried about increased surveillance. However, legislators emphasized that end-to-end encryption would not be compromised under the new regulations.
Cracking Down on Online Child Abuse Networks
The draft law is expected to crack down on online child abuse networks by mandating stricter requirements for providers to identify and quickly take down such content. Companies will also need to assess and mitigate the risk that their services could be misused to groom children online or spread CSAM.
“For far too long, predators and pedophiles have been able to exploit online platforms to target our children. These proposed rules represent a seismic shift towards preventing access to this vile content and shutting down abuse networks before they can ruin young lives,” said Vicky Ford, the Minister of State at the Department for Digital, Culture, Media and Sport.
Key measures in the draft proposal include:
- Requiring messaging services, app stores and internet providers to detect and remove both known and new CSAM, such as images and videos, as well as grooming attempts. This expands on current voluntary systems.
- Services catering to children must require user consent before unsolicited messages can be received, offer robust blocking and muting options, and provide improved parental controls to prevent minors from being solicited.
- Providers must assess if their platforms carry significant CSAM risks and then implement targeted, effective measures to mitigate those risks. They can choose which methods to deploy.
- Pornography sites must implement adequate age verification, CSAM flagging tools and human content moderation to process reports.
No Mass Surveillance or Encryption Backdoors
The draft explicitly excludes end-to-end encryption from the scope of detection orders to prevent backdoor access to private communications. It also has safeguards against mass surveillance of internet users.
Judicial authorities can authorize time-limited detection orders only as a last resort, when a provider's own mitigation measures fail to curb the spread of CSAM. The detection technology used would also be subject to independent, public audits.
New EU Centre to Streamline Takedown Efforts
In addition, the proposed legislation calls for an EU Centre for Child Protection to facilitate smoother implementation of the regulations across member states.
The Centre would help companies detect and take down CSAM by collecting reports and distributing them to relevant national agencies and Europol. It would also offer guidance to providers on fulfilling legal requirements.
Tightening Content Moderation Standards
The draft law is the EU’s latest attempt at updating decades-old regulations for the digital age. Experts have stressed the urgent need for policies that compel platforms to radically strengthen content moderation, especially for protecting minors.
“Tech firms have long argued that self-regulation works well enough. But the prevalence of child abuse networks on encrypted apps and livestreaming sites proves otherwise. External pressure is clearly necessary,” said online safety researcher Clara Thomas.
The proposal will now undergo final negotiations between EU bodies before it can be enacted into law, potentially in 2024. Children's welfare advocates were nonetheless optimistic that concrete progress was underway.
“This is a vital step to creating an internet that nurtures our young generations. There is much work ahead, but the EU deserves credit for making online child safety a top priority,” said Lucy Davis, Director of the NGO Protect Children Online.