Thirteen Internet companies operate a database of videos and images whose upload is to be prevented. The information on the files comes from police authorities. Many companies respond promptly to removal notifications, but the European Commission is nevertheless threatening to impose a legal framework: “illegal” content is in future to be detected and removed “proactively”.
An upload filter against the distribution of “terrorist content” currently contains 80,000 image files and 8,000 video files. This is according to the latest “Progress Report towards an effective and genuine Security Union”, which the European Commission now publishes regularly. The content to be removed is stored in a “Database of Hashes” currently operated by 13 Internet companies, including Facebook, Google and YouTube. The number of entries stored there has more than doubled in six months.
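The report does not say which hash algorithm the shared database uses, so the following is only a minimal sketch of how such hash-based upload filtering works in principle: flagged files are stored as hashes, and each new upload is hashed and checked against that set before publication. SHA-256 and all class and function names here are illustrative assumptions, not details from the report.

```python
import hashlib

def file_hash(data: bytes) -> str:
    # Illustrative choice: a plain cryptographic hash of the raw bytes.
    return hashlib.sha256(data).hexdigest()

class UploadFilter:
    """Toy model of a shared 'Database of Hashes': referrals add
    hashes of known files, and uploads are checked for exact matches."""

    def __init__(self) -> None:
        self.blocked_hashes: set[str] = set()

    def add_flagged_file(self, data: bytes) -> None:
        # A police or Europol referral adds the file's hash to the database.
        self.blocked_hashes.add(file_hash(data))

    def allows(self, data: bytes) -> bool:
        # An exact hash match blocks the upload.
        return file_hash(data) not in self.blocked_hashes
```

Note that an exact cryptographic hash only matches byte-identical files; a single changed byte evades it. Industry filters are widely reported to rely instead on perceptual hashes that survive re-encoding, but the report itself does not specify this.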
Smaller companies are reluctant to delete
The removal of Internet content is known at EU level as “countering radicalisation”. All Internet companies that offer services in the European Union are called upon to comply. They report regularly to the Commission and Europol on the effectiveness of the upload filter, using indicators agreed within the framework of the “EU Internet Forum”. According to the “progress report”, however, the required transparency regarding the censorship measures is only mediocre.
The database of hash values maintained by the companies is based, among other things, on police reports from the Member States. Most requests for deletion, however, come from Europol’s “Internet Referral Unit”, which itself actively searches the Internet for criminal or extremist content. Within two years of its establishment in 2016, the “Internet Referral Unit” had reported around 41,000 items of “terrorist” or “extremist” content to a total of 80 Internet platforms for removal. In the first quarter of 2018 alone, a further 5,708 reports were sent “to an increased number of smaller, less known, companies”.
Criteria are unknown
The “Internet Referral Unit” is now also dedicated to pursuing irregular migration. According to the Commission, however, the upload filter is only used in an Islamist context. Around 90% of the content reported to the major Internet companies is actually deleted; for the smaller companies, only 61% of the forwarded reports lead to removal. It remains unknown by which criteria files are classified and removed. Nevertheless, according to the Commission, the filter “continues to expand, both in terms of members and in terms of the amount of terrorist content captured in the database”.
According to the current “progress report”, one deficit is the frequent lack of feedback on the notifications received for deletion. Only one (unnamed) company provides “full information on receipt, timing and action”. For incoming reports on “terrorism-related” content, Europol’s “Referral Unit” operates an “Internet Referral Management Application” (IRMA). It records which files are already known, to which companies they have been reported for removal, and which were ultimately deleted. According to the Commission, only three Member States, apart from Europol, participate in the IRMA platform.
Reaction time should be reduced to one hour
There are also differences in the speed at which companies react, which according to the Commission ranges “from under an hour to days”. The companies are therefore called upon to reduce the response time to one hour. The Commission has threatened a legislative proposal in this regard. An impact assessment is now to clarify “whether additional measures are needed, in order to ensure the swift and proactive detection and removal of illegal content online”.
In none of the “progress reports” has the Commission ever explained what it means by “proactive detection”. This probably refers to algorithms that use artificial intelligence to search for suspected “extremist” or “terrorist” content. According to the report, more and more companies are taking such “proactive measures” to identify corresponding images or videos. A “higher volume of such content” is therefore being removed. According to the Commission, companies that have developed “automated tools” for searching content already available online have significantly increased the speed of identification and deletion.
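The report does not name the “automated tools” either. One technique commonly described for this kind of image matching is perceptual hashing, where visually similar files produce similar hashes, so matches survive re-encoding and compression. The sketch below is a deliberately simplified average-hash on a grid of grayscale pixel values; it is an assumption about the general approach, not the Commission's or any company's actual method.

```python
def average_hash(pixels: list[list[int]]) -> int:
    """Toy perceptual hash: one bit per pixel, set if the pixel is
    brighter than the image mean. Similar images yield hashes with a
    small Hamming distance, unlike cryptographic hashes."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; a small distance suggests a near-duplicate.
    return bin(a ^ b).count("1")
```

In a real system, images would first be decoded and downscaled to a fixed grid (e.g. 8x8) before hashing, and a distance threshold would decide whether an upload counts as a match against the database.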