Ban on ‘killer robots’ faces showdown in Geneva
Switzerland is a world leader in robotics and artificial intelligence (AI), fields where discoveries can be used for both civilian and military purposes. They are also areas lacking international regulation. What’s more, international talks in Geneva are on the verge of collapse.
In March 2020 the Libyan government used a Kargu-2 quadcopter in the civil war, according to a United Nations report. This drone “hunted” a human target without being instructed to do so. It was the first time in history that an autonomous lethal weapon – also known as a killer robot – had been deployed.
These weapons systems, developed using robotics and artificial intelligence, do not require human operation. Autonomous drones, for example, are programmed to approach a specific position, select an object and kill the target without any contact with a controlling human. As the incident in Libya shows, killer robots can also act independently. Unlike weapons of mass destruction, there are no specific treaties or regimes that condemn or prohibit these weapons and technologies internationally.
Weapons of mass destruction (WMD), such as nuclear, biological and chemical arms (ABC weapons), have greater destructive power than conventional arms. They can kill a large number of people and destroy the environment within a very short time.
WMD are governed by disarmament and non-proliferation treaties that are binding under international law, such as the Treaty on the Non-Proliferation of Nuclear Weapons, the Biological Weapons Convention or the Chemical Weapons Convention. They are intended to prevent the proliferation of nuclear weapons and to outlaw biological and chemical weapons worldwide.
In addition, there are four politically binding regimes under which participating states expand and harmonise their export controls: the Nuclear Suppliers Group, the Australia Group, the Missile Technology Control Regime and the Wassenaar Arrangement. Switzerland participates in all four.
Source: SECO
Opinions differ as to whether this lack of specific rules should be viewed as an omission. In response to a question from SWI swissinfo.ch, the Swiss foreign ministry wrote that international humanitarian law applied to all weapons and technologies, including new ones such as autonomous weapons systems. “So there is no vacuum for the use of robotics, artificial intelligence and other digital technologies in armed conflicts,” it said.
Not everyone in the international community agrees. “Some states think existing legislation is insufficient,” said Laura Bruun, an expert on new military and security technologies at the Stockholm International Peace Research Institute. While international humanitarian law covers all types of weapons, the use of AI-controlled military technologies is not explicitly regulated, she said. “This creates a normative vacuum, depending on how the law is interpreted.”
EU or UNESCO rules on the ethical use of AI refer to civilian applications, not military ones, Bruun said. As new technologies such as artificial intelligence advance, it becomes increasingly difficult to differentiate between the civilian and military potential of a development. The fact that these technologies are very easy to disseminate – AI software can even be distributed via email or open source – complicates regulatory and monitoring procedures.
“Of course, international humanitarian law applies to the use of such weapons, but international regulations that take into account new types of technology are needed,” said Elisabeth Hoffberger-Pippan, a security researcher and international law expert at the German Institute for International and Security Affairs in Berlin.
Dim prospects for Geneva talks
The UN has been negotiating a ban on autonomous weapons systems in Geneva since 2017. Switzerland supports these negotiations in principle, because although it rejects a complete ban, it is in favour of regulations, controls and restrictions. Last year the Swiss mission to the UN formulated a proposal to regulate lethal autonomous weapons, joining the group of countries pushing for legally binding measures.
But there has been no progress, since Russia rejects almost every proposal for regulation. Russia even boycotted the most recent round of negotiations in March because of the ongoing war in Ukraine. But Israel, the US, Turkey, the UK and South Korea also oppose any binding regulation of autonomous weapons systems, as they believe that international humanitarian law provides an adequate framework for a responsible approach to these weapons.
The last meeting of the panel will take place in July. Experts do not expect much progress. Behind closed doors, states are already talking about the failure of the Geneva negotiations. The foreign ministry told SWI swissinfo.ch that for the time being there was no agreement among the states on an international instrument.
“It’s likely that not all states will want to continue supporting the Geneva process because it’s simply not worth it,” Hoffberger-Pippan said. She expects an alternative forum will be sought for negotiations on rules for autonomous weapons systems.
Why states oppose a ban
The reasons Switzerland, like most states, is not seeking a complete ban on autonomous weapons are both economic and diplomatic, according to Stephen Herzog of the Center for Security Studies. Switzerland, one of the world’s leading countries in robotics and artificial intelligence, fears the impact on its exports.
Hoffberger-Pippan said this fear is only partially justified. At the moment, she said, discussions are primarily focused on international law to regulate the use of autonomous weapons systems and not, so far, on export controls. On the other hand, she pointed out, many countries fear a complete ban could make research in this area difficult.
“Investors would wonder why they should allocate funding if potential inventions are not allowed to be deployed anyway,” she said. This is a challenge for the US in particular, but also for many other major military powers.
The US’s view is that autonomous weapons should be tested before they are banned. That would make it possible to ascertain whether they can be put to worthwhile use. Some states believe autonomous weapons even bring advantages – at least to the party using them. For example, deaths can be avoided and personnel costs saved.
Civilian drones in Ukraine
In 2017 the Swiss government decided to oppose a complete ban for similar reasons, stating that it could mean prohibiting potentially useful systems that could, for example, avoid collateral damage to the population. For this reason, Laura Bruun said, discussions about regulating civilian and military applications should be conducted in parallel.
“Recognising that the distinction between the two uses is becoming increasingly blurred would be a first step towards controlling the technology,” she said.
Hoffberger-Pippan observes a paradigm shift with drones: while they used to be viewed very critically, they are increasingly gaining acceptance internationally, even among the population. In the war in Ukraine, for example, Ukrainian troops have used civilian drones on a large scale in addition to military drones, gaining an unexpected advantage over Russia.
Although the use of drones in combating terrorism remains a highly problematic legal and ethical issue, and the Russian war of aggression in Ukraine is only partially comparable, this example shows that drone use is not universally condemned. Perhaps there are also sensible uses for weapons systems that function with a high degree of autonomy.
“Our changing times are ushering in a modernisation of the military and, with that, more acceptance for technological innovation,” Hoffberger-Pippan said. So it is quite possible that public opinion towards autonomous weapons will also change.
Products or technologies that can be used for both civilian and military purposes are known as dual-use goods. The problem is that an invention like nuclear technology can bring civilian benefits to humanity in the form of nuclear power plants (although these are also controversial) or medical treatments, but it can also be life-destroying in the form of bombs. It therefore makes little sense to ban the corresponding technology and research outright, but they need to be handled responsibly.
The Wassenaar Arrangement is an alliance of countries with the shared goal of preventing destabilising accumulations of conventional weapons and dual-use goods.
Source: SECO