Robotics pioneer Roland Siegwart and philosopher Nadia Mazouz discuss dual-use technologies, drone warfare in Ukraine and whether universities should one day engage in military research.

Many technologies developed at universities can serve both civilian and military purposes. How do researchers deal with this? (Image generated with AI: Intach / Adobe Stock)

Roland Siegwart, imagine a scenario where, three years from now, Switzerland is attacked by a swarm of enemy drones. Would you be prepared to work on measures designed to repel such an attack?
Roland Siegwart: We’ve actually debated that very question in my lab. For a long time, the idea of developing military technology was unthinkable. But the world changed with Russia’s invasion of Ukraine. We’re now facing the kind of threat that has long been alien to us – at least here in Europe. If that danger were to become more concrete for Switzerland, I would certainly be willing to help. And I also believe that we could make a meaningful contribution with our expertise.

This text appeared in the 25/04 issue of the ETH magazine Globe.

Would you develop such technologies at ETH Zurich itself?
Siegwart: No. What we do at ETH is basic research, and our results must remain accessible to everyone. For specific military applications, you would need to get other entities involved, such as specialised research centres, start-up firms or established companies.

Nadia Mazouz: I agree. Science is by nature open and transparent. It only works if new findings can circulate freely and be critically evaluated. Military work, by contrast, depends on secrecy. So these two approaches are fundamentally at odds. You can’t conduct genuine research if nobody can access or scrutinise the results. Openness is essential to science.

“We must make sure that the knowledge generated at universities doesn’t fall into the wrong hands.”
Roland Siegwart

Many technologies developed at universities are what we call “dual-use technologies” and can serve both civilian and military purposes. How do you deal with that as researchers?
Siegwart: Virtually every technological development has dual-use potential. Drones can’t fly without GPS chips – the very same ones used in smartphones. The point is that you can never know in advance how basic research will be applied. The task of universities is to produce and share new knowledge. Any other approach would impose constraints that would make research impossible. That said, we obviously have to make sure that the knowledge generated at universities doesn’t fall into the wrong hands. That’s why ETH vets applications from high-risk countries.

Mazouz: Even a pencil is dual-use, in the sense that you can use it to write a poem or to injure someone. There are clearly technologies that are inherently problematic, such as nuclear weapons. But for most dual-use technologies, the only recourse is careful scrutiny: What exactly is the risk? Who might misuse the technology? And how great is the potential harm? We also need forums where researchers developing new technologies can discuss dual-use concerns together with political scientists, legal experts and philosophers. And that’s exactly what our new Albert Einstein School of Public Policy aims to achieve.

About

Nadia Mazouz is Professor of Practical Philosophy in the Department of Humanities, Social and Political Sciences at ETH Zurich.

Roland Siegwart is Professor of Autonomous Systems in the Department of Mechanical and Process Engineering at ETH Zurich and Co-Director of the Wyss Zurich centre.

We’ve also seen civilian technologies being used for military purposes in the war in Ukraine – from consumer drones and unmanned ground vehicles to underwater vehicles. Does that surprise you?
Siegwart: No, it’s been on the horizon for some time. The real shock for many militaries is just how cheap and effective these systems are. Drones basically consist of components you can find in any smartphone. That makes them far less expensive than conventional military hardware – and thus suitable for mass production. Traditional air defence systems, where each missile costs around a million dollars, are virtually helpless against swarms of small, inexpensive drones.

Switzerland would also be overwhelmed by such a scenario. It’s safe to say that aerial warfare will change fundamentally in the future. Given the threat of a drone attack, it’s difficult these days to find anyone willing to sit in a tank. And ten years from now, people may well be reluctant to fly a fighter jet, because they would stand almost no chance against a swarm of autonomous drones.

Mazouz: What struck me most was how improvised everything was at the outset. Russian tanks were being destroyed by cheap, home-made drones. This led to a war of attrition reminiscent of the First World War. Meanwhile, that initial phase of improvisation has given way to a professional manufacturing operation with supply chains. Today, Ukraine has an advanced drone industry – and so does Russia. That marks a fundamental shift in how wars are fought – and it also means that weaker actors can hold their own through the use of drones.

The next step would be autonomous weapons systems, which some say the Russians already have. How realistic is that?
Siegwart: “Autonomous” is an elastic term. Many current drones can already be said to be autonomous in that they rely on GPS to reach their target. The next step would be drones navigating entirely independently, using cameras and radar, without the need for GPS or a radio link. Technically, that’s perfectly feasible. And when that happens, defending against drones will become even harder, because many drone defence systems work by jamming radio connections or GPS signals. Computer-vision technology has already advanced to the point where a drone could autonomously identify a military target such as a tank. But that’s easier than navigating long distances without any assistance at all.

Could machines one day make war more “humane”, because they make more rational decisions and fewer mistakes than humans?
Mazouz: A “humane war” is a contradiction in terms. War is always inhumane. But I understand your point. It’s based on the idea that machines are not emotional – but the problem is that they’re not rational either! Ideally, they act exactly as we programme them. If we’ve done our job well, they will behave proportionately and attack only those targets permitted under international law. Humans, by contrast, are emotional beings, capable of both rage and empathy – and that cuts both ways. If a fellow combatant were killed the day before, a soldier might be more inclined to disregard international law. But there have also been cases where a helicopter pilot aborted an attack on a motorcycle because he suddenly saw a child sitting behind the person riding it. A machine wouldn’t be capable of that kind of empathy.

Siegwart: In the long run, autonomous systems could nevertheless behave more precisely and consistently than humans – just think of self-driving cars. But until we get to that point, there will be mistakes, because these systems don’t yet have that level of reliability. Countries like Russia exploit that uncertainty and claim, for instance, that their drones have violated another nation’s airspace “by accident”. That’s why the risk of escalation in this transition phase is so high.

“Is a home where people are assembling combat drones a legitimate military target?”
Nadia Mazouz

What impact does drone warfare have on international law?
Mazouz: New technologies have always posed major challenges to international law. It’s a fragile institution – states bind themselves to it voluntarily, and it only works if enough of them uphold these self-imposed obligations. But to answer your question: international law draws a clear distinction between civilians and combatants. The only legitimate direct targets in war are combatants and individuals who produce military equipment at their workplace. But what about a person building a combat drone at home? Is their residence a legitimate military target? These are the kinds of highly complex questions that new warfare technologies raise, and they demand both moral and legal answers.

Let’s finish with some predictions. In what ways will warfare change in the coming ten years?
Siegwart: I’m confident that autonomous systems will have a major impact on the battlefield of the future – in the air, as well as on the ground and in the water. Traditional tanks and fighter jets will hardly stand a chance. Air defence systems are already overwhelmed today, so new solutions are urgently needed.

Mazouz: We may also see “transhuman” scenarios: people technologically augmented to interact more effectively with machines. It sounds like science fiction, but it’s being seriously discussed. At the same time, we mustn’t forget that we live in an age of profound uncertainty. Disruption looms everywhere – from the climate to geopolitics to technology. That makes it impossible to simply extrapolate in a linear fashion. We must remain vigilant and open-minded.
