Providers of messengers and cloud services are currently allowed to voluntarily screen for criminal child abuse content; this screening is set to become mandatory across the EU. The Council and Commission are pushing for an extension to other areas of crime. Next week, the EU interior ministers will publish a declaration on the matter.
The European Commission had planned to present its proposal for a regulation on the “detection, removal and reporting of illegal content online” in the area of child sexual abuse on 1 December. It would require providers of messenger services and chat programmes to automatically scan private communications for such material.
But the already delayed bill is now being postponed again. This emerges from a comparison of the Commission’s agendas: the latest version, dated 26 October, no longer includes the legislative proposal. Originally, the Commission wanted to present the EU regulation as early as spring. So far, there is no new date.
Interim solution until 2022
Internet providers are already allowed to voluntarily screen communications until the end of 2022. The Commission had drafted an interim regulation for this purpose, which was first approved by the Council and then by the Parliament before the summer break. This concerns only unencrypted communications or platforms where providers have access to content.
Large companies such as Apple, Google and Microsoft have already been making use of this for years. However, the voluntary use of automated scanners violates the General Data Protection Regulation and the e-Privacy regulation, which is currently being negotiated. The hastily pushed-through exemption regulation was therefore intended to accommodate the companies.
The now planned obligation to scan communications is also to cover encrypted content, including services such as Signal, Threema or WhatsApp. The Commission recently confirmed this to MEP Patrick Breyer. Breyer has coined the word “chat control” for the initially voluntary and soon enforced screening of internet content.
Unprecedented mass surveillance
It is not known why the Commission is delaying the proposal for the follow-up regulation. However, since its announcement last year, there has been much criticism from civil rights organisations, cryptography experts and MEPs. There are fears of false positives, especially when young people discuss sexual topics among themselves.
As with data retention, the planned obligation to monitor content would be an unprecedented attack on the confidentiality of communication, which is protected by fundamental rights. The plans amount to mass surveillance that for the most part affects innocent people. Critics also object that the checking of possibly criminal content is being outsourced to private actors, who would then automatically forward suspicious cases to the investigating authorities.
This summer, Apple announced that it would detect “Child Sexual Abuse Material” (CSAM) directly on its devices. This procedure, developed explicitly for encrypted communication and cloud data, is called “Client-Side Scanning” (CSS). Once a certain number of matching files is found, the police would be informed. The company described this as a balancing act between the legitimate needs of law enforcement and the privacy of telecommunications. The EU Commission also had similar technical procedures investigated. After fierce criticism, Apple initially withdrew the plan.
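The reporting mechanism described above, where a device only flags a user once the number of matches against a database of known material passes a threshold, can be illustrated with a deliberately simplified sketch. Real CSS systems rely on perceptual hashes (Apple proposed NeuralHash) and cryptographic threshold schemes so the provider learns nothing below the threshold; the plain SHA-256 matching, hash list and function names here are illustrative assumptions, not Apple’s actual implementation.

```python
import hashlib

# Illustrative stand-in for a distributed database of hashes of known
# abuse material (assumption: real systems use perceptual hashing,
# not exact SHA-256 matching, to survive re-encoding of images).
KNOWN_HASHES = {hashlib.sha256(b"known-sample").hexdigest()}

# The device stays silent until this many files match (assumed value).
REPORT_THRESHOLD = 3

def scan_files(files: list[bytes]) -> bool:
    """Return True if enough files match the database to trigger a report."""
    matches = sum(
        1 for data in files
        if hashlib.sha256(data).hexdigest() in KNOWN_HASHES
    )
    return matches >= REPORT_THRESHOLD
```

The threshold is the “balancing act” element: a single match, which could be a false positive, triggers nothing, while repeated matches would cause the device itself to report its owner.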
Extension to “public security” and “terrorism”
It is foreseeable that a regulation to combat the sexual abuse of children will be extended to other criminal phenomena. In numerous conclusions or other statements on access to encrypted content, the Council and the Commission mentioned “terrorism” and “internal security”.
In a week’s time, EU interior ministers will meet in Brdo under the Slovenian Presidency for a “Conference on the Prevention and Investigation of Child Sexual Abuse”. The British civil rights organisation Statewatch has received the planned final statement, which has already been edited in some places. According to the statement, the associated Schengen states, the Western Balkan states and the USA, on whose territory most of the major providers of internet services are based, have also been invited.
In their paper, the ministers want to call for the development of “necessary tools, mechanisms and legislative solutions [to prevent, detect and investigate child abuse], and implementing them at national level”. The vague formulation is fleshed out a few paragraphs later: necessary “solutions” would have to concern encryption, but also data retention and the handing over of “digital evidence”. Once again, the use of technology for “ensuring public security” is brought into play.
Image: Charles Deluvio on Unsplash.