The Lower Saxony State Criminal Police Office has analysed digital evidence from the alleged RAF militant Daniela Klette using artificial intelligence. This disruptive technology is reminiscent of BKA pioneer Horst Herold.
In the 1970s, the then-President of the German Federal Criminal Police Office (BKA), Horst Herold, introduced electronic data processing. Since then, Germany has maintained the extensive Inpol database for all German police authorities, as well as the method of dragnet investigation, which Herold sought to use to digitally optimise the pursuit of members of the militant and armed Red Army Faction (RAF). Criminal geography was also employed, which he integrated into daily police work to calculate crime probabilities. Using these methods, the BKA chief, known as “Commissioner Computer,” was quite successful – although he was already facing criticism at the time due to data protection concerns.
Fifty years later, artificial intelligence (AI) is now set to revolutionise computer-assisted investigative work in German criminal offices – as the ever-growing proliferation of digital devices has provided investigators with such a vast amount of information that manual analysis is practically impossible.
The special commission “Triangel” has experience dealing with such extensive data sets. This unit within the State Criminal Police Office (LKA) in Lower Saxony is investigating the robbery offences and attempted murder charges against Daniela Klette and her fellow suspects, Ernst-Volker Staub and Burkhard Garweg, who are also sought as RAF members. The trial against Klette begins on 25 March in Verden. According to information from “nd,” the responsible police in Hanover, which is conducting the investigation for the Verden Public Prosecutor’s Office, has deployed a powerful AI tool: Three months after Klette’s arrest in Berlin-Kreuzberg, the authority procured the “Pathfinder” software from the Israeli company Cellebrite and used it to search through the digital evidence of the accused.
Alongside handwritten notes and documents, around 300 data carriers were found in Klette’s possession – the LKA estimates that the information stored on them amounts to up to 30 terabytes, which, according to an official note, is equivalent to “6 million mobile phone photos or a text containing 18 trillion characters.” The purpose of “Pathfinder” was to “pre-select” this data set, filtering out icons or emojis, excluding irrelevant photos, and simultaneously searching for images of interest. Specifically, investigators searched for “depictions of people, vehicles, and, if possible, weapons and money.” It is likely that the software was also intended to recognise known crime scenes.
AI eliminates police department
The significance of AI for modern police work was explained by the current BKA President, Holger Münch, at the agency’s annual autumn conference in November in Wiesbaden under the title “How We Ride the Wave.” According to Münch, the technology proved itself following the dismantling of the encrypted communication service EncroChat, when over 115 million messages from suspected criminals were cracked and analysed by European police authorities. Another example cited by Münch was the improvement in the accuracy of facial recognition in Germany, which has advanced so significantly that an entire BKA department with almost 50 employees has become redundant.
The arrest of Klette in February 2024 had already sparked interest in expanding facial recognition. During the autumn conference, Münch recalled investigative podcaster Khesrau Behroz, who uploaded old wanted photos of the suspects to the website of the US company Pimeyes and, using their AI facial recognition technology, found pictures of Klette on Facebook. Her arrest followed shortly thereafter. The journalist had “exposed deficiencies in police powers,” said the BKA chief.
So far, German police authorities may only use facial recognition retroactively – that is, comparing images of suspects and criminals stored in the Inpol database with other photographs. Real-time analysis of video surveillance footage is currently not permitted, nor is comparison with images on the internet. However, following the Islamist-motivated attack in Solingen in the summer of the previous year, the then-governing coalition proposed a “security package” that would allow police to search for suspects or victims online using photos or voice samples. The opposition Christian Democratic Union (CDU) in the Bundestag deemed the bill, which was later blocked in the Bundesrat, insufficient. With a CDU-led government, expanded permission for digital biometric investigations is expected in the near future.
Dragnet software
A research laboratory at the BKA is exploring which additional AI technologies could be used for police purposes. These include decrypting data and recovering deleted or damaged storage media. Information found in police databases, financial transactions, or social media can be linked together and analysed for suspicious patterns. AI then assists with automatically generated summaries, reports, and translations.
Cellebrite already offers solutions for many of these requirements. The company’s products are widely used for IT forensics within German police forces, although its analytical software has not been adopted. Instead, most federal states use the case management system “RS-Case” from Rola Security Solutions, now owned by Deutsche Telekom. This software can visualise relationships between individuals, objects, and incidents on a timeline, helping to identify investigative leads. Meanwhile, North Rhine-Westphalia, Hesse, and Bavaria rely on software from the notorious US company Palantir, which can also access online data. However, this function is supposedly disabled in Germany.
Like competing products, “Pathfinder” can process text-based messages, including metadata from various email or messaging services, which can be cross-referenced: information about chat participants, timestamps and locations of network connections, users’ online statuses, device details, and, when GPS is enabled, their location data.
“Triangel” searched Klette’s data for keywords
The investigators’ work with Cellebrite’s “Pathfinder” software also involves searching for specific terms in digital evidence. The AI tool can recognise keywords from text files, emails, chat messages, and even images using optical character recognition (OCR). According to internal documents, the LKA Lower Saxony used predefined search queries to identify potentially relevant content in Daniela Klette’s data trove. These apparently refer to already known findings and read, for example, ‘money messenger’, ‘bomb’, ‘meat counter’, ‘McDonalds’, ‘stun gun’. Place names of several crime scenes and vehicle licence plate numbers also serve as keywords.
At the beginning of May 2024, the BKA, which is investigating the terrorist offences of which Klette is accused on behalf of the Federal Public Prosecutor’s Office, sent the data intended for analysis with ‘Pathfinder’ to the evidence centre of the LKA Lower Saxony on a hard drive. Technicians converted the data into a file format developed by Cellebrite, a process that allegedly required overtime. The files were then copied directly to the LKA server, because importing them through the analysis software would have taken a whole week.
A ‘pre-selection in stages’ then began with the help of ‘Pathfinder’. The investigators were not asked to wait until all the data had been processed by the AI, but to start viewing the results while the analysis was still ongoing.
AI applications classified as high-risk systems
As is typical with such disruptive technologies, the use of AI in police investigations raises fundamental legal questions. The EU’s AI Regulation, passed by the European Parliament in cooperation with the Commission and member state governments over a year ago, sets guidelines for its use. It stipulates that high-risk AI systems may only be introduced into police work with increased reporting and oversight requirements. Applications that generate predictions or recommendations – which is precisely the stated purpose of “Pathfinder” – fall into this high-risk category, argue Klette’s lawyers, Undine Weyers, Ulrich von Klinggräff, and Lukas Theune.
According to the defence, the use of such an AI system lacks a legal basis in Germany. In February 2023, the Federal Constitutional Court ruled that the automated data analysis provisions in the police laws of Hesse and Hamburg were unconstitutional because they were insufficiently specific: the laws did not adequately define the analysis methods, the court found, while the software generates new insights using algorithms and incorporates data from the suspects’ environment. The Hesse judgement explicitly concerned Palantir’s analysis software already in use, while the Hamburg judgement concerned a possible acquisition.
Klette’s lawyers argue that ‘Pathfinder’ likewise creates new grounds for suspicion. The technology also undermines central procedural principles: because the LKA can now search through previously unmanageable amounts of data in a short time, the principle of ‘equality of arms’ between the parties in court proceedings is violated, as the defendant and the defence have no access to comparable technology.
The defence is also unable to see which evidence the Public Prosecutor’s Office obtained through automated analysis and which it found ‘manually’. Nor do the case files reveal which algorithms ‘Pathfinder’ used to analyse the evidence. These form part of the software’s source code, which manufacturers usually guard as a trade secret. Only if its workings are known can it be determined whether erroneous ‘hits’ lead the investigation in the wrong direction, or whether exculpatory evidence is overlooked because of one-sided programming or police bias in the software’s application.
Non-governmental organisation demands impact assessment
The non-governmental organisation Algorithm Watch is fundamentally sceptical about police use of AI systems and demands transparency. “It must be clear who within the police and prosecution is responsible for errors or distortions in such systems,” says Kilian Vieth-Ditlmann from Algorithm Watch. AI tools like “Pathfinder” could also create false links where none exist, he warns. “AI-generated recommendations can quickly lead to what is known as automation bias,” influencing the investigators’ approach in a specific direction. Algorithm Watch therefore calls for a dedicated legal framework for AI-based data processing in law enforcement, including an impact assessment that involves defence lawyers or victim advocacy groups in the evaluation process.
In the LKA Lower Saxony, “Pathfinder” has been used since 2022 for the evaluation of evidence and is operated on a police-owned server, as the LKA informed “nd.” Only employees of the state office have access to it – in addition to a “single-digit number of Pathfinder admins and two IT specialists,” according to an LKA memo. There was no dedicated impact assessment for “Pathfinder”; before procurement, only a “consideration of the overall circumstances” was carried out to determine whether the software “meets legal requirements,” the authority explained to “nd.” There is also no specific legal basis for AI in the Lower Saxony Police Act; according to its own statement, the LKA justifies its use based on the general powers for police investigations from the Code of Criminal Procedure.
The police in Lower Saxony chose Cellebrite over applications from the company Palantir due to “police needs in IT forensics,” according to the spokesperson, without providing further details upon request. However, “Pathfinder” is not linked to other police databases. It also remains unclear how many people worked on the Klette case using Cellebrite’s digital investigation technology. However, according to a memo, a senior investigator “approached the management level” and proposed the purchase of two “costly licenses” for the processing of seized evidence. The LKA does not comment on the fees paid for this.
Police not satisfied with “Pathfinder”
Apparently, “Pathfinder” did not meet the needs of the special commission “Triangel”: one of the memos states that the Cellebrite software was used “only at the beginning of the investigation.” It was “quickly determined” that the technology was “not suitable for investigating the present offences.” Neither the LKA nor the Federal Public Prosecutor General at the Federal Court of Justice wishes to give reasons for this, referring instead to ongoing investigations.
Unlike Klette’s defence, the BKA considers analysis software like “Pathfinder” to be a low-risk AI application. This is evident from a presentation given by a BKA employee at the agency’s most recent autumn conference, which “nd” has in full. The presentation includes a fictional scenario in which AI is used to analyse a darknet forum containing 1.5 million seized chat messages; thousands of images or texts “related to weapons” were identified in this scenario. However, the AI findings are not legally admissible and must always be verified by investigators, according to the presentation’s conclusion.
The LKA Lower Saxony had also mandated manual review for all “Pathfinder” hits. Nevertheless, Klette’s attorney, Lukas Theune, has significant concerns regarding the use of “Commissioner Computer” against his client: “We urgently need a societal debate on how far we want to relinquish human control and allow AI to take over police work. Should AI also be delivering verdicts in the future?”
Published in German in “nd”.
Image: Forensic institute of the Brandenburg State Police (symbolic image).