
Security Index: Unrecognized threats of military robots

28.02.2017

MOSCOW, FEBRUARY 28, 2017. PIR PRESS – “One of the most important questions concerning the use of lethal autonomous weapons systems (LAWS) is what the potential loss of human control over the use of force would mean and what its consequences would be,” said Gilles Giacca, Legal Adviser of the Legal Division of the International Committee of the Red Cross (ICRC).

The evolution of warfare is tied to technological development as closely as any other domain of state or social life. Visions of future war have long featured robots, provoking fierce debate among politicians, military officials and the leadership of humanitarian organizations. The most fervent discussions concern the prospect of using LAWS on the battlefield: armed systems equipped with artificial intelligence that may, in the future, be able to make decisions autonomously. The introduction of this new form of weapon is a relatively recent item on the international agenda and stands at the crossroads of political, legal, technological, and moral concerns, which makes it difficult to work out a single approach.

What criteria define LAWS? Will existing international norms be enough to control these robots on the battlefield? What threats arise from using such armed machines in war zones? These and other questions were discussed during the round table organized by PIR Center. Vadim Kozyulin gathered some of the most prominent experts in the field for the discussion: Research Fellow at the Lauterpacht Centre for International Law, University of Cambridge Tom Grant; First Secretary of the Department for Nonproliferation and Arms Control of the Ministry of Foreign Affairs of the Russian Federation Andrei Grebenshchikov; Legal Adviser of the International Committee of the Red Cross (ICRC) Gilles Giacca; Head of the Robotics Center of the Skolkovo Foundation Albert Efimov; Professor of the Xi'an Political Academy Xinping Song; and Coordinator of the Campaign to Stop Killer Robots Mary Wareham.

Despite the view of Albert Efimov that “given the present state of artificial intelligence, it is hard even to imagine killer robots able to choose their own targets, which means that experts’ current fears are somewhat exaggerated,” most participants agreed that all problems related to the deployment of LAWS need to be discussed even at the present stage. Mary Wareham believes that “states should start discussing a preventive ban on the development, production and use of fully autonomous weapons systems. This can be done only once the necessary level of meaningful human control over critical combat functions has been defined; specifically, this concerns the decision to kill in each particular case.” According to Gilles Giacca, the ICRC is concerned that “at the present stage of technological development, autonomous systems are being ignored in the discussions. Instead of the question of full control over robots’ autonomy, the question of remote control by a human operator is being discussed.”

During the discussion, the experts examined the legal aspects of deploying war robots in conflicts. Xinping Song emphasized two main directions of international lawmaking that should be prioritized before LAWS are introduced. “First, it is necessary to highlight the principle of distinction between military targets and the civilian population in wartime. Some LAWS do not comply with this principle, owing to the lethality of their destructive effect and other factors. The second important principle is that of proportionality, which holds that the scale of damage should correspond to the importance of the military task. Some LAWS can cause damage on such a massive scale that the principle of proportionality would be violated.”

Andrei Grebenshchikov conveyed the Russian Ministry of Foreign Affairs’ doubts about the idea that existing international norms are insufficient to regulate LAWS. “In our view, international humanitarian law, provided it is properly observed, already places certain limits on the development and use of such systems. In particular, this concerns Additional Protocol I of 1977 to the Geneva Conventions of 1949, which established the principles of distinction and proportionality and the requirement to take precautionary measures to protect civilians.”

The full text of the round table “Military robots: expected and unexpected threats” is available in Russian in the new issue of the Security Index journal (No. 118-119, 2016).

You can send your feedback on the issue, questions about publishing in Security Index, and advertising inquiries by e-mail: editor@pircenter.org


 
