
Swiss researchers warn about autonomous weapons 

Could science fiction one day become a reality? Richard Jones/Science Photo Library

Switzerland is positioning itself as a hub for the development of AI technology for drones. Scientists are flagging the risk that these algorithms could be used for military purposes.

A bombarded town appears on the screen. As troops come under attack, a soldier calls for reinforcement: “Send the Lanius!” Eight small drones take off, approach enemy forces and kill them, one by one. 

This is not a war movie but a scene from a promotional clip for the Israeli arms company Elbit Systems. Lanius, Latin for “butcher”, is a kamikaze drone system equipped with artificial intelligence (AI). It can navigate autonomously, find its way through narrow gaps in buildings, and recognise human targets. The weapon still needs human intervention to attack but technically it already has the capability to make the decision to kill by itself.   

Sitting in his office at the University of Zurich’s campus in Oerlikon, Davide Scaramuzza acknowledges: “When my team and I saw the Elbit video, we were shocked.” The similarity of his own research to the technology shown in Elbit’s videos is hard to miss. In a video produced by his university, a drone darts around in a forest, buzzes between buildings and flies through a window. The professor heads the Robotics and Perception Group, and he and his team are world leaders in the field. Their work includes AI applications that make drones more agile in GPS-deprived and complex environments.


Need for control 

For years, leading scientists from across the globe have been calling for more stringent regulation of research that could lead to the development of systems such as Lanius. Their aim is to stop autonomous weapons from becoming the Kalashnikovs of tomorrow.

Davide Scaramuzza is Professor of Robotics at the University of Zurich. Davide Scaramuzza

In November 2021, eight Swiss researchers addressed an urgent appeal to the government. In their letter, only now made public and seen by the Swiss weekly SonntagsBlick, they urged Foreign Minister Ignazio Cassis to take action to ensure that algorithms are never allowed to make decisions about the life or death of a human. Without regulation, lethal autonomous weapons would likely become a reality within a decade, the letter states. Cassis’s response: “The Federal Council shares many of the legal, ethical and security concerns expressed by scientists and researchers with regard to such weapons.”

Switzerland supports international efforts to establish appropriate controls, similar to existing international agreements on chemical or nuclear weapons. The problem is that a treaty on autonomous weapons is out of reach, at least in the short term, as explained by the Swiss government’s expert on the matter, Reto Wollenmann (read the interview below). Or, as US military expert Zachary Kallenborn puts it: “The major military powers don’t want to give up weapons that could be useful to them.” 

‘We don’t want killer robots’

Reto Wollenmann, a senior Swiss foreign ministry official, explains why an international killer robots treaty is still a long way off.

Read more: ‘We don’t want killer robots’

Swiss cutting-edge research raises ethical questions  

In Professor Scaramuzza’s laboratory, drones are displayed in glass cabinets that stretch along the hallway. Part of the space is fitted out as a flight hangar, with obstacles on the floor and nets hanging from the ceiling – a safety precaution in case a drone spins off course.

Scaramuzza was one of the people who signed the letter addressed to Cassis. He does not avoid tough questions. On the contrary, the research director spends two hours explaining how his laboratory operates and how his team came to develop new, AI-powered drones, including ones that seem almost identical to those promoted by the Israeli arms company.

As early as 2009, Scaramuzza’s team managed to build a camera-equipped drone that could fly autonomously, without the aid of GPS. This early breakthrough was rapidly followed by a whole series of others. The professor headed a European project that developed an autopilot system, a patented device now used by millions. One of his team colleagues went to the United States National Aeronautics and Space Administration (NASA), taking Swiss-developed technology to Mars. The first entrepreneurial project to come out of Scaramuzza’s laboratory was bought by Facebook in 2016 and went on to help develop Oculus Quest, the leading virtual-reality headset.

+ Swiss army uses drone technology. Should we worry?

Scaramuzza is enthusiastic when he talks about his work. His team are now experimenting with new sensors that would allow drones to fly through smoke and in the dark. They are also developing AI algorithms to assist robots in undertaking human tasks.

“Of course, this raises a lot of ethical questions,” explains the professor. “Anything that can be used for good can also be used for bad.” That has always been a challenge in robotics, he says, and then immediately adds a clarification: “The same algorithms we use to fly these drones were used for breast cancer screenings. They’ve saved millions of people. Should we ban them? No.” 

AI-guided drones are still too inaccurate to be used in a war, he says. But research is making progress. That’s why Scaramuzza is convinced that now is clearly the time to ask: “How do you make sure the technology isn’t misused?” 

Algorithms with a clear military potential 

Scaramuzza himself worked on a project financed by the US Defense Advanced Research Projects Agency (DARPA). Purely basic research, he stresses. He also took part in a drone race organised by the US arms manufacturer Lockheed Martin, and in 2021, in Dübendorf in the canton of Zurich, he demonstrated that AI-guided drones can fly faster than a human pilot. The US military did not receive any software and was only informed of the results of the project just prior to publication of his research, Scaramuzza says. Nor did his team deliver any code to the defence contractor, according to the professor.

But the kind of visual navigation he researches plays a key role in military applications. So what is the connection between the arms company Elbit’s new weapons systems and his research at the University of Zurich? “They use similar algorithms,” he explains.

There is no direct contact or technology transfer between his lab and Elbit, says the researcher. “I condemn any military application of our technology.” Each joint project with an external company is vetted by the university. In the case of dual-use goods — items that can be used for both civilian and military purposes — permission has to be granted by both the university and the State Secretariat for Economic Affairs (SECO). The University of Zurich’s media office confirms this.

However, indirect connections exist, as Scaramuzza readily admits. Academic publications are mostly freely available, and members of staff take their know-how with them when they change jobs – including when they move on to an arms company.

It is a delicate balance: progress depends on the free sharing of knowledge. The researcher stresses that censorship of any kind is dangerous. However, there are ways to minimise the risks, Scaramuzza adds: “Researchers can withhold parts of a code or pass them on only under licence.” This already happens today, he says, for ethical or commercial reasons.

But structured processes governing these arrangements don’t exist in Switzerland. Abroad, some universities have started asking about risks, says the professor: “That’s good because it opens researchers’ eyes.” 

This investigation was supported by a grant from the JournaFONDS. It first appeared in SonntagsBlick on 15 January 2023.  
