Providers of messengers and cloud services are currently permitted to voluntarily screen for prosecutable child abuse content; across the EU, this screening is to become mandatory. The Council and the Commission are pushing to extend it to other areas of crime. Next week, the EU interior ministers will publish a declaration to this effect.
The European Commission had planned to present its proposal for a regulation on the „detection, removal and reporting of illegal content online“ in the area of child sexual abuse on 1 December. It would require providers of messenger services or chat programmes to automatically scan private communications for such material.
But the already delayed bill is now being postponed again. This emerges from a comparison of the Commission’s agendas: the latest version, dated 26 October, no longer includes the legislative proposal. Originally, the Commission had intended to present the EU regulation in spring. So far, there is no new date. Continue reading „Planned regulation: EU Commission postpones mandatory screening of encrypted chats“
The Hamburg police have been researching facial analysis software for several years; after the G20 summit, it was used for the first time. The technology accesses the nationwide INPOL file on criminal offenders maintained by the Federal Criminal Police Office. The detection rates are meagre, but the system is nevertheless to be used permanently in Hamburg for the „processing of major events“.
The facial analysis software used by the Hamburg Special Commission „Schwarzer Block“ („Black Block“) has led to the identification of only three people. This is stated in the Hamburg Senate’s answer to a parliamentary question by Christiane Schneider. The Special Commission, which was set up after the G20 summit, uses the face recognition software „Videmo360“ from the company Videmo, which processes all common image and video formats. Continue reading „G20 in Hamburg: Data protection commissioner considers face recognition illegal“
Material uploaded onto the Web could soon be scanned for extremist or radicalising content with an upload filter produced by Microsoft. The filter would be installed in the systems of Internet service providers (ISPs), while the necessary databases could be held by police authorities.
Two weeks ago in Washington, the international Counter Extremism Project presented a software solution that is said to detect extremist content on upload. The process is based on PhotoDNA, an application originally developed by Microsoft to combat child pornography; it is said to be able to detect video and audio content as well. The recognition rate is reportedly around 98%.
PhotoDNA operates on the principle known as ‘robust hashing’: it extracts a distinctive digital signature from a file. Using this signature, the software can recognise images even if they have been distorted or post-edited. The comparison is made against a hash database, administered either by ISPs alone or jointly by ISPs and public authorities. In the United States, for example, PhotoDNA draws on the database of the National Center for Missing & Exploited Children; Interpol, the International Criminal Police Organization, also maintains a Child Sexual Exploitation Image Database. Continue reading „First child pornography, now extremism: Internet providers and police investigation authorities to use Microsoft upload filters“
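The matching principle described above can be sketched in a few lines. PhotoDNA’s actual algorithm is proprietary, so the following uses a much simpler perceptual ‘average hash’ as a stand-in; the function names and the toy 8×8 pixel grid are illustrative assumptions, not part of any real system:

```python
# Illustrative sketch of robust hashing: derive a compact signature from
# image content, then match it against a database of known hashes by
# Hamming distance, so that small edits to the image still match.

def average_hash(pixels):
    """Compute a 64-bit hash from an 8x8 grayscale pixel grid (values 0-255):
    each bit records whether that pixel is at or above the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches(candidate, database, threshold=5):
    """True if the candidate hash is within the threshold of any known hash."""
    return any(hamming(candidate, known) <= threshold for known in database)

# A toy 8x8 image and a slightly "post-edited" copy (one pixel brightened).
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
edited = [row[:] for row in original]
edited[0][0] = min(255, edited[0][0] + 40)

db = {average_hash(original)}           # the "hash database" of known material
print(matches(average_hash(edited), db))  # → True: the edited copy still matches
```

Because the hash reflects coarse image structure rather than exact bytes, resizing, recompression, or small edits change only a few bits, which is what distinguishes this approach from matching ordinary cryptographic checksums.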