• Position : Head of Robotic Center
  • Affiliation : Skolkovo Foundation
  • Position : Consultant, "Global & Regional Security: New ideas for Russia" Program
  • Affiliation : PIR Center

The Security Index Journal: Vadim Kozyulin and Albert Efimov on lethal autonomous weapons systems


MOSCOW, MAY 19, 2016. PIR PRESS – "Modern machines that lay claim to being called 'robots' surpass humans in quantitative analysis, in performing routine and repetitive tasks, and in processing large amounts of information. Today, however, even a child can easily beat a robot in qualitative analysis or logical thinking," notes Albert Efimov, Head of the Centre for Robotics Studies at the Skolkovo Foundation.

The development of weapons systems based on the principles of autonomy has been under way for many years. Today a number of countries have such systems, with differing levels of human control, in service with their armed forces. And the use of lethal autonomous weapons systems (LAWS) in recent conflicts is impossible to hide. Experts within some military establishments argue that talk of truly autonomous lethal weapons systems is premature, but NGOs are raising the alarm and demanding a ban on "killer robots".

Attempts to start a discussion on this topic at various international venues run up against the inability to agree on the very definition of the term. A further problem then arises: how to determine the degree of autonomy of a particular type of weapon, given that many long-existing weapons systems already have autonomous functions.

In their article “The New James Bond – the Machine with a License to Kill” (available in Russian), Vadim Kozyulin, Senior Researcher at the PIR Center, and Albert Efimov, Head of the Centre for Robotics Studies at the Skolkovo Foundation, examine the main areas of LAWS development in Russia and the United States, the legislative framework governing such development, and the major organizations carrying out R&D in this field.

Experts point to the shortcomings of today's military robots: "...Unlike humans, they are not capable of making complex decisions that take into account multiple circumstances; similarly, they are unable to fully comprehend their surroundings or to adapt to unforeseen situations". However, the authors note that their presence can be very useful on the battlefield: "...They save soldiers' lives, create new opportunities for reconnaissance, and potentially reduce collateral damage". Despite the protests of NGOs and several governments advocating a comprehensive preventive ban on this new kind of weapon, LAWS development is well under way. That is why this problem requires serious consideration among politicians and experts alike.

The article will be published in the next issue of the Security Index Journal, No. 1 (116), 2016.

For all questions related to the Security Index Journal, Editor-in-Chief Olga Mostinskaya is available at +7 (495) 987 19 15 or via email at mostinskaya at