A UN convention on lethal autonomous weapons systems is not in sight
For more than seven years, several dozen states have been discussing a possible ban on autonomous killer robots at the United Nations (UN). So far, the talks have not led to any concrete negotiations; above all, the major military powers – the USA, China, Israel and Russia – are putting on the brakes. It is also unclear how a joint treaty within the framework of the Convention on Certain Conventional Weapons (CCW) could be structured.
Lethal autonomous weapons are systems that pursue a target and – with the help of artificial intelligence, for example – decide on the best moment to attack. They can be drones operating in the air, on land or at sea. Such systems can also identify the people to be killed on the basis of their appearance, stature or biometric data.
At the beginning of December, governments met again in Geneva in the format of an expert group. Once again, there was no agreement on which path the states could take towards a common convention. The group had in fact been tasked with preparing recommendations for the so-called Review Conference of the CCW, which also began today in Geneva; its 125 participating states now have to decide on the future of the expert group. Critics such as the international Campaign to Stop Killer Robots are demanding that the group finally be given a mandate to negotiate a legally binding instrument.
First, however, the participants in the negotiations would have to agree on a common definition of lethal autonomous weapons systems. The position of the German government and other states is that a human being must always make the decision to kill, i.e. the systems must not attack “fully automatically”. “We reject lethal autonomous weapon systems that are entirely beyond human control,” states the coalition agreement of Germany's new Social Democratic-Green-Liberal government.
However, this is where the ambiguity begins: if a soldier orders a flying drone to destroy a vehicle and to check the faces of its occupants beforehand, is that no longer a fully automatic attack? It still is, says Marius Pletsch of the German Peace Society – United Opponents of War, which takes part in the campaign against killer robots from Germany. Once the drone has received the kill order, the soldier is no longer involved in the process: he can still abort the attack, but otherwise the machine carries out its mission on its own.
The technology described is known as loitering munition. The Bundeswehr is now making a new attempt to procure such a weapon: the Ministry of Defence has commissioned the German armaments service provider AMDC to prepare a study surveying the kamikaze drones available on the market. One of the potential suppliers is Rheinmetall, whose killer weapons would launch from a newly developed drone tank. If the new governing coalition were serious, it would have to stop this initiative, begun under the previous government, and come out in favour of outlawing such systems.
If the CCW conference fails this week, a binding instrument under international law against lethal autonomous weapon systems could instead be pursued in the UN General Assembly or even outside the UN. There, the major military powers would not be involved, says Pletsch. “But a strong new norm could be created that can have a long-term effect on these states as well.”
Image: Campaign to Stop Killer Robots (Twitter).