German Constitutional Court hears case on automated data analysis by the police
Is a company allowed to use technical tools on behalf of the police to create profiles of people and assess their supposed dangerousness? Does this amount to “data mining”, and is it compatible with fundamental rights when used to prevent dangers? On Tuesday, the Federal Constitutional Court in Karlsruhe heard a case on these questions. The hearing follows two constitutional complaints filed by the Society for Civil Liberties (Gesellschaft für Freiheitsrechte e.V., GFF) and other civil rights organisations against police laws in Hesse and Hamburg.
In Hesse, the authorities have purchased the “AI-enabled” software “Gotham” from the US company Palantir. Branded “Hessendata”, it is meant to determine whether a person has connections to so-called dangerous persons and to reveal links between persons, objects and previous investigations. This approach is known as “predictive policing”.
In Hamburg, such applications are not yet in use, the police confirmed to “nd”. Besides Hesse, North Rhine-Westphalia also uses the Palantir software for “cross-database analysis and search”, but the police law there, which was amended accordingly in 2022, is not the subject of the complaint in Karlsruhe.
To create personality profiles, “Gotham” can link information already held by the police with data from registration authorities, vehicle registers, and health or social welfare offices. Part of Palantir’s successful business model is the integration of so-called unstructured data: text files, emails, address books, even photos and audio files. Files downloaded from the internet and social media can also be integrated.
Such files inevitably contain data on many innocent, unsuspected persons as well as on victims and witnesses of crimes. If these groups of people become “by-catch”, their fundamental right to informational self-determination could be violated, the GFF argues in Karlsruhe. Moreover, “Gotham” could focus particularly on groups and persons the police already have their eyes on, fears Lea Beckmann, lawyer and litigation coordinator at the GFF. This would entrench existing prejudices.
The ten complainants include journalists, lawyers and activists, among them the defence lawyer Seda Başay-Yıldız and Silvia Gingold, who is active in the peace movement. “I do research on the extreme right and sometimes attend right-wing demonstrations and events undercover,” journalist Sebastian Friedrich, who is also involved, told “nd”. He fears that an automated data analysis could place him under suspicion of belonging to the extreme right himself.
Bavaria will also soon decide to what extent its State Criminal Police Office will use Palantir software. A Fraunhofer Institute is currently reviewing the programme’s source code, the Bavarian police confirmed to “nd”. This is intended to dispel fears that sensitive data could leak to US secret services: Palantir’s American parent company was financed by the foreign intelligence service CIA, and the agency itself was one of the company’s customers. To rule out possible data outflows from “Gotham” altogether, the software is not connected to the internet in Bavaria.
Bavaria is leading the nationwide project “Procedural Search and Analysis” (VeRA), under which Palantir software could subsequently be procured for all federal states. The GFF therefore hopes the Constitutional Court will deliver a landmark decision imposing stricter rules on such analysis software.
One of the GFF’s lawyers is Sebastian Golla, a junior professor at Ruhr University Bochum. He stresses to “nd” that the data sources for “predictive policing” must be limited. “A basic problem is that such software practically invites police users to play around and encourages them to look for connections.”
Thomas Petri, the Bavarian State Commissioner for Data Protection, takes a similarly critical view. “The automated evaluation undermines the principle of purpose limitation,” Petri told “nd”. Data available to the police should not be compiled arbitrarily. “Because in some context or other, almost the entire population is recorded in police databases and could end up in the dragnet. That would be a considerable encroachment on fundamental rights.”
The many detailed questions from the bench showed that the judges were also “critical of the vague norms on automated data analysis”, the GFF commented after the hearing. The judgement from Karlsruhe is expected in a few weeks.
Published in German in “nd”.
Image: The Court in Karlsruhe today (Maria Scharlau/GFF).