Chapter 7. Lethal Autonomous Weapons Systems in the Context of Multinational Disarmament

April 15, 2024

For the last decade – since 2013 – the 1980 United Nations Convention on Certain Conventional Weapons (CCW) has hosted a number of informal meetings of experts, as well as formal meetings in the format of a Group of Governmental Experts (GGE), to discuss issues related to lethal autonomous weapons systems (LAWS)[1]. From 2013 to 2016 the discussions were held at the informal level. Three informal meetings of experts were convened to discuss the problems related to emerging technologies in the area of lethal autonomous weapons systems in the context of the objectives and purposes of the Convention. The mandate of these meetings gave no indication of what the outcome of the discussion should be, beyond creating an opportunity to deepen the understanding of these weapon systems.

Building on two informal meetings in 2014 and 2015 that had already underscored the importance of the technical, ethical, legal and strategic questions raised in connection with autonomous weapons systems, the objective of the third – in fact the last – informal meeting of experts in April 2016 was “to discuss further the questions related to emerging technologies in the area of lethal autonomous weapons systems (LAWS), in the context of the objectives and purposes of the Convention of Certain Conventional Weapons”[2]. At this informal meeting of experts, the participating states agreed to recommend to the CCW’s Fifth Review Conference in December 2016 that the deliberations continue through the establishment of an Open-ended Group of Governmental Experts. The main purpose of the GGE was to consider options relating to LAWS, such as regulation, prohibition, or taking no further action. The GGE was duly established by a decision of the CCW’s Fifth Review Conference in December 2016[3]. Since then, six consecutive GGE sessions have been held, along with a number of intersessional discussions.

The most common types of weapons with autonomous functions are defensive systems. These include antivehicle and antipersonnel mines, which, once activated, operate autonomously based on trigger mechanisms.

Newer systems employing increasingly sophisticated technology include missile defense systems and sentry systems, which can autonomously detect and engage targets and issue warnings. Other examples include loitering munitions (also known as suicide, kamikaze or exploding drones), which contain a built-in warhead (munition) and wait (loiter) around a predefined area until a target is located – by an operator on the ground or by automated sensors onboard – and then attack it. These systems first emerged in the 1980s; their functionality has since become increasingly sophisticated, allowing for, among other things, longer ranges, heavier payloads and the potential incorporation of artificial intelligence technologies.

Source: United Nations Office for Disarmament Affairs
(https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/background-on-laws-in-the-ccw/)
 

As the founder of the World Economic Forum Klaus Schwab and other prominent analysts and researchers note, the history of warfare and international security is the history of technological innovation, and today is no exception. Modern conflicts involving states are becoming increasingly hybrid in nature, combining traditional battlefield techniques with elements previously associated with non-state actors. The distinction between war and peace, combatant and non-combatant, is becoming uncomfortably blurry, and this is profoundly impacting the nature of national and international security regimes, affecting both the probability and the nature of conflict[4]. In this context autonomous weapons systems are widely considered a transformational technology that will have far-reaching implications at all levels of strategic and military operational decision-making.

Why the CCW?

The topic of LAWS has been discussed at other, mostly informal, levels, but the CCW has been chosen as the primary focus of this research because it offers a far more complete and balanced multilateral picture of the main results around the key discussion points of the LAWS issue. The answer lies in the nature of the Convention, which uniquely combines the humanitarian dimension of international relations with disarmament and arms control. Since 2013 the CCW format has grappled with a broad range of concerns raised by the prospect of weapons that, once activated, would select and engage targets without human control – autonomous weapons systems. A rather radical opinion, widespread among some states and numerous nongovernmental organizations (NGOs), holds that the deployment of autonomous weapons systems – even if such systems could be designed to fully comply with relevant legal requirements – raises fundamental ethical issues.

Should machines ever be given the power to decide over life and death, or to inflict severe injury, autonomously and without any direct human control or intervention, would such a development towards mathematically calculated life-and-death decisions – and the lack of empathy in such machines – not undermine the principle of human dignity and challenge the principle of humanity as well as the dictates of public conscience? According to the shared position expressed by some international players (a number of European, African and Asian states, the majority of Latin American states, and numerous influential NGOs), the focus area of the CCW meetings should be the legal dimension of autonomous weapons systems use, with particular emphasis on the laws of armed conflict, international human rights law and accountability issues. Their main argument is that, beyond any doubt, autonomous weapons systems do not comply with the laws of armed conflict and therefore cannot be fielded.

“LAWS could have far-reaching effects on societal values, including fundamentally on the protection and the value of life”.  

Report of the Special Rapporteur on Extrajudicial,
Summary or Arbitrary Executions
Christof Heyns
2014
Source: United Nations Digital Library
(https://digitallibrary.un.org/record/771922?ln=en)

A 2012 Human Rights Watch report finds that “autonomous weapons systems requiring no meaningful human control should be prohibited”[5]. What is the gist of the positions of those international players which have been driving the international fora and the multilateral disarmament mechanism towards accepting the idea of seriously exploring the issue of LAWS with a view to agreeing on concrete diplomatic measures? These positions, in fact, remain mostly unchanged.

The 2013 and 2015 statements endorsed by more than 270 engineers, computing and artificial intelligence experts, roboticists, and professionals from related disciplines in 37 countries asked how devices made and deployed by opposing forces and controlled by complex algorithms would interact, warning that they could “create unstable and unpredictable behavior that could initiate or escalate conflicts or cause unjustifiable harm to civilian populations”[6]. They further elaborated: “We are familiar with human error in the handling of technology with catastrophic consequences from tragedies such as Three Mile Island, Chernobyl, Fukushima, and Bhopal. Nuclear warheads accidentally have dropped off planes of the developed countries with the most sophisticated safety systems in place. Thus, accidents from fully autonomous weapons cannot be ruled out”[7].

A report by the Center for a New American Security on the operational risks associated with autonomous weapons looks at potential types of failures that might occur in completely automated systems, as opposed to the way such weapons are intended to work[8]. In 2014, more than 20 Nobel Peace Laureates issued a joint statement calling for a preemptive ban on fully autonomous weapons, expressing concern that “leaving the killing to machines might make going to war easier and shift the burden of armed conflict onto civilians”[9]. A number of countries have expressed particular concern at the possibility that lethal autonomous weapons systems could negatively impact peace and destabilize regional and global security.

LAWS are considered to have the potential to:

  • escalate the pace of warfare and increase the likelihood of resorting to war, in large part due to the promise of significantly reduced military casualties;
  • ignite and foster arms races, since their possession by some states may make all other states feel compelled to acquire them, raising the prospect of asymmetric warfare resulting from the discrepancies between the technological haves and have-nots;
  • be acquired and used by non-state armed groups, including terrorist organizations;
  • undermine existing law, controls, and regulations.

One reason for military interest in autonomous functions is the increasing speed or pace of warfare, despite the warnings coming from scientists and others about the risks posed by autonomous systems interacting at speeds beyond human capacities. Human Rights Watch and others have described insurmountable legal and practical obstacles that would likely interfere with holding someone accountable for unforeseeable, unlawful acts committed by a fully autonomous weapon. Quite a number of delegations to the CCW expert meetings, both informal and formal, have stressed that the use of lethal autonomous weapons could change not only the way war is fought, but how wars end. If one resorts to lethal autonomous weapons systems, how is the conflict to be terminated? When do you stop? When is a political objective achieved?

The longer lethal autonomous weapons systems go unregulated, the greater the risk of their proliferation, especially to non-state actors. This problem is aggravated further when and if such weapons are considered to have essential military benefits. According to informed opinion, the Middle East, Asia, and other regions are experiencing growing political extremism propagated by nationalist groups, ethno-religious movements, and other non-state actors, for whom international norms are irrelevant. The technology is increasingly within reach, as ever-more advanced drones capable of carrying sophisticated imaging equipment and significant payloads become readily available on the civilian market.

Some stakeholders of the CCW have stressed that the prevention principle is also relevant to these deliberations, as is the fundamental Martens Clause, which mandates that the “principles of humanity” and “dictates of public conscience” be factored into an analysis of the legality of such weapons[10]. The debate and opinions about autonomous weapons systems continue to evolve. Discussions at the CCW GGE meetings of 2016-2023 have demonstrated a widely shared assumption that decisions to kill may not be delegated to autonomous systems. Future discussions about these systems should therefore proceed from the understanding that where weapons systems are at issue, human control must be retained.

Autonomy: a general picture

Discussions on autonomy in weapon systems are currently underway at a variety of national and international fora. Descriptions of autonomy in weapon systems often start with the concept of a technological spectrum running from remotely controlled systems on one side to autonomous weapon systems on the other. Autonomy increases as one moves along the spectrum from objects remotely controlled by human operators (such as remotely piloted unmanned aerial vehicles, UAVs), to automatic and automated systems, to fully autonomous ones[11].

Modern armed forces currently employ the whole range of systems across this spectrum: surveillance devices, range-finding and targeting devices, land-based transport vehicles, aerial vehicles, and robots designed for high-threat tasks such as explosive ordnance disposal. Today, these systems are clustered at the lower end of the spectrum (remote-controlled and automatic/automated ones). Some states have expressed interest in moving further along this spectrum towards greater autonomy, perhaps as far as fully autonomous lethal weapons. Some functions, such as navigation and transportation, are considered generally acceptable when made more autonomous. Some stakeholders are less at ease with applying ever greater autonomy to other functions, such as target selection. Still other functions – such as the decision to use force and weapons release – are considered by some to be of great concern when made autonomous. Explicitly naming, and then reaching a shared understanding among states about, which of these functions are of the greatest concern or raise uncertainties has proved contentious at different fora, primarily the CCW.

Autonomy: pro and contra

Autonomy has become the key issue in international discussions on LAWS. Autonomy is a characteristic of a technology, attached to a function or functions, not an object in itself. A system, for example, might be able to navigate autonomously yet not be autonomous in selecting its targets. It is the function that can be turned on or off in particular circumstances. The general conclusion from the substance of the discussions is that in some environments autonomy is a more beneficial or less risky feature than in others. Not all autonomous functions raise similar concern: some might be uncontroversial while others may invite significant legal, ethical, and strategic questions.

Current military interest focuses on increasingly autonomous systems with a limited range of missions, for example force protection, demining, and surveillance of dangerous environments. There is also interest in defensive use – protecting borders or military installations with systems capable of attack. However, as some countries within the CCW format have stated, as autonomy increases, the advantage of removing humans from decision-making becomes less clear. Some delegations to the CCW claim that superhuman response times will be a must in future conflicts and that therefore only machines will be able to take such decisions. Some disagree with this commonly held opinion. Others insist that the issue requires much deeper examination.

Some military strategists question the necessity or even desirability of delegating responsibility for a decision to launch an attack to autonomous systems. In many militaries, decision-making is becoming more centralized, facilitated by modern communications and made necessary by the political calculations essential to ensuring mission success and avoiding unacceptable political costs for the action taken. Systems that would have the authority to take attack decisions appear to run counter to that trend. From the perspective of centralized command and control, allowing a machine to make decisions on the use of force could appear to be both risky and unnecessary.

A significant amount of what is really known about military interest in autonomy comes from only a few countries. Only a few states are materially involved in the research and development (R&D), production and fielding of highly automated military systems, to say nothing of their actual military use. Other states’ perceptions of utility, necessity, desirability, and consequences have very often been theoretical rather than practical in nature. But it must be admitted that the meetings on this issue within the framework of the Convention on Certain Conventional Weapons offered states an opportunity to voice their concerns and convictions, thereby broadening and deepening the discussion. Despite the serious disagreements over autonomy expressed during the CCW meetings, the general conclusion is that the trend towards gradually increasing autonomy in military systems in general, and in weapons systems in particular, will continue. With the possibility of autonomous ground, air, surface, subsea, and space vehicles looming large, it is likely to affect all the domains of warfare.

Some key definitions

The balance of power between the most prominent states or groups of states with regard to the fundamental substantive issues of the topic is of key importance for understanding the intermediate results of the consideration of LAWS at the CCW platform and the prospects for its further discussion. One of the first and, in fact, the principal questions that the GGE began to consider, in a more focused way than during the informal discussions of 2013-2016, is the definition of an autonomous weapons system. Without clarifying the subject of the discussions, they are doomed to remain largely speculative. Even now, to say nothing of the initial stage of the work, there is still a good deal of disagreement among the states about this definition.

An overly broad definition could lay the ground for politicized and thus counterproductive consequences – an artificial division of weapon systems into good and bad ones from the point of view of their compliance with international humanitarian law (IHL). At the other end of the spectrum are those states that seek to define autonomous weapons in such a way as to narrow down the kinds of systems that would qualify.

The International Committee of the Red Cross (ICRC) takes a critical-functions approach to autonomous weapons, where the concern is over the functions of a system that enable it to select and attack targets without human intervention. The ICRC, moreover, is urging states to “set limits on autonomy in weapons systems to ensure that they can be used in accordance with international humanitarian law and within the bounds of what is acceptable under principles of humanity and the dictates of public conscience”[12]. The ICRC has defined autonomous weapon systems as follows: “Any weapon system with autonomy in its critical functions. That is, a weapon system that can select (i.e., search for or detect, identify, track, select) and attack (i.e., use force against, neutralize, damage, or destroy) targets without human intervention”[13]. Advocates of this functional approach stress that the advantage of this broad definition, which encompasses some existing weapon systems, is that it enables real-world consideration of weapons technology: an assessment of what may make certain existing weapon systems acceptable – legally and ethically – and of which emerging technological developments may raise concerns under international humanitarian law and under the principles of humanity and the dictates of public conscience.

As the United Nations Institute for Disarmament Research (UNIDIR) eloquently reminds us, “ultimately the autonomy question is really about what control/oversight […] we expect humans to maintain over the tools of violence that we employ”[14]. Some states prefer a narrow definition more closely linked to identifying the type of weapon systems of the greatest concern. In addition, some narrow definitions distinguish between highly automated and autonomous weapons. Many experts emphasize, however, that there is no such clear difference from a technical perspective, and the core legal and ethical questions remain the same.

From the very beginning of the discussions, in both unofficial and official formats, Russia has proceeded and continues to proceed from the premise that the subject of the discussion in the GGE on LAWS should be limited to fully autonomous military and dual-use systems, which are proposed to be understood as “unmanned technical means that are not munitions and are designed to perform combat and support tasks without any participation of the operator”[15]. The Russian representatives who proposed this definition suggested that it be considered more carefully and serve as guidance in the further consideration of LAWS issues. The main argument in favor of the proposed definition is precisely that a common understanding of the subject of the discussion would introduce more specificity into the debate.

Defining what constitutes an autonomous weapons system and how it differs from presently fielded automated systems is not merely a technological concern; it is a political one. As such, it will be up to states to decide. Nevertheless, this is not to say there is no sense of agreement amongst states at present. Indeed, what has emerged over the past ten years is a consensus that states bear an obligation to respect and ensure respect for IHL in all circumstances. This basic respect for IHL means that belligerents should take the necessary steps to ensure compliance with it. This could bring hope – very faint though it is – that this fragile consensus will help remove some of the worries over autonomy in weapons systems.

Applicability of current IHL rules to LAWS: Russia’s position

In the course of discussions on LAWS, a number of delegations, including the Russian Federation, have expressed the opinion that current international law, including international humanitarian law, contains a number of important restrictions that can be applied to all types of weapons without exception, including potential lethal autonomous weapons systems, and does not need to be modernized or adapted to the specifics of these weapons. These positions of Russia were recorded in a number of working documents of the Russian Federation on the legal aspects of the LAWS issues, above all in the document submitted in March 2023[16].

In particular, the document states that these restrictions include: (1) prohibitions on the indiscriminate or disproportionate use of LAWS, on their use against the civilian population, and on their use without precautions to protect civilians; (2) the requirement that any military use of LAWS be carried out in accordance with the principle of proportionality between military necessity and the damage caused; (3) the provision that the decision on the expediency, forms and methods of LAWS use is made by the person planning the military operation and forming the scenarios for the use (mission) of these systems. Quite a number of states have emphasized that the responsibility for the use of LAWS lies with the official who sets the task for such weapons systems and gives the order to use them. At the same time, he or she must have the appropriate knowledge and skills necessary to operate these systems.

An analysis of the experts’ positions allows us to conclude that these states proceed from the inexpediency of developing additional principles and norms to regulate or limit the operation of LAWS. At the same time, it is impossible not to agree with the key argument on this matter: there is no reason to assert that the principles of humanity per se, the requirements of public conscience (for example, the famous clause formulated by Fyodor Martens, professor at Saint Petersburg University – the Martens Clause), or the human rights component can be considered an absolute and solely sufficient condition for the introduction of restrictive and prohibitive regimes on specific types of weapons.

In this context, the representatives of some CCW states (Russia, China, India), both during the informal discussions and within the framework of the GGE, have jointly opposed until the last moment the development of any legally binding international instrument in relation to LAWS and the introduction of a moratorium on the development and use of such systems and the technologies for their creation. In addition, Russia’s position proceeds from the premise that discussions on agreeing on certain rules of conduct with regard to LAWS are premature. This provision appears to be crucial for understanding possible further international action with regard to LAWS.

Concerning the possibility of triggering the development of new treaty restrictions on specific types of weapons, a key prerequisite is the existence of clear evidence that their use is so destructive that it can under no circumstances comply with the core principles of IHL – for example, if it is indiscriminate, causes unnecessary destruction and suffering, or has an extensive and long-term impact on the environment. This approach is applicable to all types of weapons, means and methods of warfare. In line with this logic, some experts in the CCW format have repeatedly pointed out that a weapons system cannot be expected to itself observe the moral principles on which IHL is based, nor can it be endowed with legal personality, since it is developed by man. Moreover, a preemptive prohibition of LAWS justified by the moral postulates of IHL is fraught with unnecessary fragmentation of the existing international legal regulation and, in this case, the inevitable artificial division of weapons into good and bad.

Artificial intelligence (AI) and the growing autonomy of weapons. The notion of meaningful human control

Starting in 2018, amid consideration of recent scientific and technological achievements in weapons development, the man-machine issue has been put high on the agenda in the discussion of LAWS within the CCW framework. In numerous statements and working papers on this crucial topic, many states, including the Russian Federation, while holding different basic views on the principal issues relating to LAWS, have reaffirmed their commitment to a truly key element – the need to preserve human control over the machine in order to ensure compliance with the existing norms of international law. Their stand on the issue is reflected in the 2018-2023 GGE reports, including the guidelines for emerging technologies in the field of LAWS[17].

A distinctive feature of the Russian approach, though, was that effective human control over the machine could be ensured not only by direct control. In addition, the imposition of limits on the objectives, duration and scope of LAWS could increase their predictability and thus facilitate compliance with IHL. The concept of meaningful, appropriate, or adequate human control over the critical targeting and attack functions in weapons systems, and over their use in individual attacks, has quickly gained wide currency within this international debate. Many states and NGOs have affirmed the need to retain human control over autonomous weapons and want to explore this concept further as an approach to tackling future weapons systems[18].

In this context, it is appropriate to emphasize that the concepts of meaningful human control, the form and degree of human involvement, and other categories promoted by a number of countries, which are generally unrelated to law, risk only politicizing the discussion. After all, determining the degree of significance of such control would be a value judgment and purely arbitrary. Thus, this concept could become a tool for the pursuit of vested national interests. At the same time – and this distinguishes the position of other key players, such as the USA, China, India and Russia, from the positions of a number of radically minded countries – the specific forms and methods of control should remain at the discretion of states and should not be mandatory and universal in nature.

Yet states and individuals shall at all times be responsible under international law and national legislation for their decisions on the creation and application of new technologies in the field of LAWS. The responsibility for the application of LAWS lies with the person who assigns them the task and orders their use. This understanding, as an alternative opinion, is recorded in the reports of the Group of Governmental Experts on the results of its work in 2018-2023, including in the guiding principles for new technologies in the field of LAWS[19]. In this context, Russian representatives, as well as some other states, have repeatedly stressed that they consider Additional Protocol I (AP-I) of 1977 to the Geneva Conventions of 1949 an effective preemptive mechanism against the misuse of potential LAWS. Moreover, the obligations listed above are considered by a number of countries (including those that have not acceded to this instrument) to be rules of customary law, applicable regardless of whether a state is a party to AP-I or not. It is impossible not to agree with the repeated calls on a number of states to withdraw the reservations made when ratifying this IHL instrument, and on those that have not yet acceded to these documents to become full-fledged parties to them as soon as possible.

Article 36

The importance of Article 36 of the 1977 Additional Protocol I to the 1949 Geneva Conventions has been repeatedly stressed in the discussions. It requires states to conduct a legal review of all new weapons, means and methods of warfare in order to determine whether their employment is prohibited by international law[20]. According to the opinions expressed within the CCW and other formats, autonomous weapon systems should, like any other weapon systems, be subject to such a legal review[21].

Article 36 reviews, however, are national procedures beyond any kind of international oversight, and there are no established standards for how they should be conducted. Some states may be less willing or less able than others to review the lawfulness of weapons that contain autonomous features, and the vast majority do not have a weapon review procedure at all (despite the fact that this is a requirement as a matter of law) and would have to develop one from scratch.

Some CCW member states have been trying to promote a practical obligation on states to prevent the use of weapons that violate international law by employing a mechanism – colloquially referred to as a weapon review, a legal review or simply an Article 36 review – that can determine the lawfulness of any new weapon or means or method of warfare before it is used in an armed conflict. This obligation derives from the basic rules set out in Article 35 of AP-I, which state that the right of states to choose means and methods of warfare is not unlimited. International law includes both general rules and treaty law that prohibit or restrict specific effects or types of weapons, means and methods of warfare[22].

As a general rule, international humanitarian law prohibits the use of weapons, means or methods of warfare that cause superfluous injury or unnecessary suffering, or that strike military objectives and civilians or civilian objects indiscriminately. There are also a number of rules under treaty and customary law that ban specific types of weapons, such as biological and chemical weapons or blinding laser weapons. These prohibitions and restrictions set a minimum standard. The main aim of current international law in this field is to minimize the negative consequences of an armed conflict by limiting its parties in the choice of means and methods of conducting military operations. The relevant IHL principles are enshrined in Article 35 of Additional Protocol I to the Geneva Conventions of 1949, which creates, in fact, a legal prerequisite for the development of international legal restrictions on the use of certain types of weapons. They are also recorded in a number of other authoritative sources, including the CCW and the Saint Petersburg Declaration of 1868[23]. In this context it is hard to disagree with the assessments of Russian experts, shared by experts from the USA, China and India, that the appeals of a number of radical states for a universal mandatory mechanism for conducting such legal reviews specifically designed for LAWS are redundant.

It is quite appropriate to assume that general efforts should be refocused on further universalization of AP-I to the Geneva Conventions and the withdrawal by states of the reservations made when ratifying this IHL instrument. Another idea expressed by a number of states at the GGE meetings is the voluntary exchange of best practices for ensuring human control over a machine, as well as for fulfilling the obligations under Article 36, taking into account, of course, considerations of national security and trade secrets. There is no reason to assert that the principles of humanity per se, the requirements of public conscience (for example, the Martens Clause) or the human rights component can be considered an absolute and solely sufficient condition for the introduction of restrictive and prohibitive regimes with respect to specific types of weapons.

Controversies over the harmfulness and usefulness of LAWS

Proponents of bans on LAWS have emphasized the immorality inherent in LAWS and their inability to conform to the norms and principles of IHL, especially in light of the development of AI capabilities and the possibility of a human losing control of the machine. In contrast, opponents of radical bans on LAWS, including countries such as Russia and China, have promoted the idea that LAWS can be more effective than a human operator in solving some key problems, for example by reducing the likelihood of errors. In addition to their technological advantages (accuracy, speed, efficiency), such weapons neutralize the risks caused by the human factor (operator errors related to mental and physiological state, religious and moral attitudes).

The sum and substance of their arguments is that LAWS in the proper sense of the term do not yet in fact exist. As for the highly automated technologies which are being rapidly developed and introduced, they can increase the accuracy of targeting weapons directed against military targets and help reduce the likelihood of unintended strikes on civilians and civilian objects. They are not subject to inherent human flaws: they cannot act out of vengeance, panic, anger, prejudice, or fear. It is fair to say that it is on this fundamental issue that a rather deep rift has emerged in the positions of the GGE participants.

A number of presentations were made at the sessions of the CCW GGE (by such states as Russia, China, Israel and South Korea) which demonstrated the advantages of highly autonomous weapons systems in the context of their military use, including compliance with international humanitarian law. Specific examples showed that such means play an important role in solving the problems of defense, combating the terrorist threat and the threat of mines, destroying military facilities, protecting strategically important objects (nuclear power plants, dams, bridges, etc.), eliminating terrorist groups, and protecting civilians.
As for Russia’s position, it proceeds from the premise that the Martens Clause, by virtue of its nature, can only be used to assess human behavior in the course of military operations, including decision-making in the programming and use of the weapons in question, but not the actions of the weapons themselves. It is impossible to demand from a machine compliance with the principles of humanity and the requirements of public conscience. There is no denying that, under international law, governments and individuals (including development engineers and manufacturers) are responsible for the creation and use of LAWS at all stages of their life cycle. Thus, officials who set the task for these weapons systems and give the order for their use should be responsible for compliance with IHL. A number of states have taken note of this provision. The key point in this broad range of international legal issues, around which the discussions are flaring up, is the position that the existing international legal regulation is sufficient. It is clear that the restrictions and principles arising from IHL apply to all types of weapons without exception, including potential LAWS. This means that the above-mentioned systems, their technical characteristics and features derived from the presence of autonomy, as well as their use in combat operations, must comply with the principle of proportionality included in the preamble to the CCW.

The character of the discussion at the GGE meetings demonstrates that there is no strong case to date for the introduction of any new restrictions or prohibitions on LAWS, or for the modernization or adaptation of IHL on account of the development of autonomous technologies. Moreover, it has been noted that it is precisely a high level of autonomy that would allow eventual LAWS to operate in a dynamic combat situation and in a variety of environments, while ensuring the appropriate degree of selectivity and accuracy and, as a result, their compliance with IHL principles and regulations. In this regard, Russia and like-minded states propose to focus on the analysis of the existing international legal norms in the context of LAWS, and oppose both the development within the GGE framework of any legally binding international instrument in relation to LAWS and the introduction of a moratorium on the development and use of such systems and the technologies used to create them.

What to do with LAWS? The way ahead

The main political and diplomatic dilemma closely linked to the work on this issue within the framework of the CCW GGE has been, and still is, what the output of the informal and later formal discussions on LAWS should be from the point of view of international legal norms and standards. In other words, should formal negotiations on LAWS be started at the CCW? And if yes, what should the expected formal outcomes be?

The Campaign to Stop Killer Robots and the NGO coalition have been pushing states parties to negotiate and adopt a preemptive ban on the development, production and use of LAWS, and so far only some states have expressed their readiness to discuss this possibility. Most states are still in the process of analyzing the issues at stake and determining their positions[24]. At the CCW GGE meetings a core of radically minded states, including Austria, Norway, Sweden, Ireland, Switzerland, the Holy See, quite a number of African states and most Latin American countries, as well as numerous NGOs, are in favor of transitioning the GGE to a negotiation mode with an eye to developing certain bans on LAWS within the CCW framework. Also indicative are the recent proposals and signals of a number of countries, mostly those in favor of bans and deep restrictions on LAWS, to transfer the discussion of LAWS to new platforms or other specially formed formats in case the CCW platform does not live up to radical hopes. As an alternative, it is proposed, in particular, to launch a mechanism of diplomatic conferences analogous to the process that produced the Convention on Cluster Munitions (CCM). In accordance with the positions recorded both in their statements and in their working documents, Russia and like-minded states such as China and India proceed from the premise that the relevant GGE created within the framework of the Convention on Certain Conventional Weapons is the optimal platform for discussing the topic of lethal autonomous weapons systems. This position is shared by a number of other players.

Another point dividing the countries participating in the GGE, almost since the 2016 CCW Review Conference when the mandate for the establishment of the Group was adopted, is the question of the format of the work – discussions or negotiations. Notably, a number of states that do not consider themselves supporters of radical bans on LAWS were among the supporters of negotiations (for example, most EU members, primarily Germany, Italy, Belgium, the Netherlands, France and Greece). Russia’s position on this issue remains unchanged: it is expedient to continue the discussion of LAWS in the GGE on the basis of an agreed discussion mandate and in full compliance with the goals and objectives of the Convention, without going beyond its scope. The added value of the GGE will depend on the practical applicability of its outputs, primarily for CCW purposes.

An international document on LAWS: to be or not to be

In this regard, it is hard not to agree with the position expressed by a number of CCW member states that current international law, including IHL, already contains a number of important restrictions that fully apply to weapons systems with a high degree of automation, as well as to potential LAWS, and does not need to be modernized or adapted to the specifics of these means. When considering the use of potential LAWS, it is important to proceed from the need to maintain the balance between the legitimate defense interests of states and humanitarian concerns, as recorded by the CCW, as well as from the inadmissibility of indiscriminate and disproportionate use of LAWS, of their use against the civilian population, and of their use without precautions to protect civilians.

Decisions on war and peace are political, and the security issues involved require serious debate and discussion. It seems that the CCW is an optimal forum for addressing the issue of LAWS, given the unique nature of the Convention. In this regard, it is counterproductive to transfer this topic to other international fora. Doing so would only duplicate the controversial experience of the CCM ban, further fragment the UN disarmament machinery, and take the expert work out of the approved UN formats.

It is also important to draw attention to the fact that the implementation of the existing obligations under export control regimes in the context of LAWS is best discussed in the appropriate export control formats. The discussions of LAWS should be fully consistent with the aims and objectives of the CCW and should not go beyond its scope. The Convention proved its relevance by swiftly adopting a discussion mandate on this topic, but little progress has been made apart from carefully and objectively building a base of common knowledge. That is because the discussions largely lack a goal to work towards, and because of the great variety of contradictory positions, approaches and opinions. If this challenge persists, CCW states will feel compelled to take reasonable and cautious steps and to avoid giving a disappointing and inadequate response.

Expectations are running high. No one wants a long, drawn-out and inconclusive process. So, in order to avoid disappointing results, policymakers should focus on wise measures that can take the CCW swiftly towards a balanced, solid but realistic outcome. One such result could be the adoption of a set of best practices or recommendations with regard to potential LAWS.


[1] The Convention on Certain Conventional Weapons // United Nations. URL: https://disarmament.unoda.org/the-convention-on-certain-conventional-weapons/.

[2] Developments in the Field of Information and Telecommunications in the Context of International Security // UNODA Fact Sheet, 2015. URL: https://unoda-web.s3-accelerate.amazonaws.com/wp-content/uploads/2015/07/Information-Security-Fact-Sheet-July2015.pdf.

[3] Fifth Review Conference of the High Contracting Parties to the Convention on Prohibitions or Restrictions on the use of Certain Conventional Weapons Which May be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects // United Nations Digital Library. URL: https://digitallibrary.un.org/record/3856242; Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) // UNODA. URL: https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-_Informal_Meeting_of_Experts_(2016)/ReportLAWS_2016_AdvancedVersion.pdf.

[4] Schwab K. The Fourth Industrial Revolution: What it Means, How to Respond // World Economic Forum, January 14, 2016. URL: https://www.weforum.org/agenda/2016/01/the-fourth-industrial-revolution-what-it-means-and-how-to-respond/; Edgerton D. The Shock of the Old: Technology and Global History since 1900 // Profile Books, 2006; Singer P. The Robotics Revolution // Brookings, December 11, 2012. URL: https://www.brookings.edu/articles/the-robotics-revolution/.

[5] Losing Humanity. The Case against Killer Robots // Human Rights Watch, November 19, 2012.
URL: https://www.hrw.org/report/2012/11/19/losing-humanity/case-against-killer-robots.

[6] Scientists Call for a Ban // Stop Killer Robots, October 16, 2013. URL: https://www.stopkillerrobots.org/news/scientists-call/.

[7] Johnson N. et al., Abrupt Rise of New Machine Ecology beyond Human Response Time // Scientific Reports, 2013. Article number: 2627. URL: www.nature.com/articles/srep02627; Top Teams’ Automated Cybersecurity Systems Preparing for Final Face-off // Defense Advanced Research Projects Agency, July 13, 2016. URL: https://www.darpa.mil/news-events/2016-07-13.

[8] Scharre P. Autonomous Weapons and Operational Risk // Centre for a New American Security, February 29, 2016. P. 12. URL: https://www.cnas.org/publications/reports/autonomous-weapons-and-operational-risk.

[9] Nobel Laureates Call for Killer Robots Ban // Stop Killer Robots, June 12, 2014. URL: https://www.stopkillerrobots.org/news/nobelpeace/.

[10] Ticehurst R. The Martens Clause and the Laws of Armed Conflict // International Review of the Red Cross, Vol. 37, № 317. Pp. 125-134. URL: https://www.icrc.org/en/doc/resources/documents/article/other/57jnhy.htm.

[11] Bills G. LAWS unto Themselves: Controlling the Development and Use of Lethal Autonomous Weapons Systems // George Washington Law Review, 2015. Issue 83:1, Note, Vol. 83. URL: https://www.gwlr.org/laws-unto-themselves/.

[12] Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects. 2014 Report of an Expert Meeting // The International Committee of the Red Cross, November 1, 2014. URL: https://www.icrc.org/en/document/report-icrc-meeting-autonomous-weapon-systems-26-28-march-2014.

[13] Ibid.

[14] Framing Discussions on the Weaponization of Increasingly Autonomous Technologies // UNIDIR Resources, 2014. № 1. URL: https://unidir.org/files/publication/pdfs/framing-discussions-on-the-weaponization-of-increasingly-autonomous-technologies-en-606.pdf.

[15] “On the Concept of the Activities of the Armed Forces of the Russian Federation in the Field of the Development and Use of Weapons Systems Employing Artificial Intelligence Technologies” / Working Paper of the Russian Federation / CCW/GGE.1/2023/WP.5 // Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 6-10 March and 15-19 May 2023. URL: https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2023)/CCW_GGE1_2023_WP.5_0.pdf.

[16] Report of the 2019 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems. CCW/GGE.1/2019/3. Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 25-29 March and 20-21 August 2019 // UNODA. URL: https://documents.unoda.org/wp-content/uploads/2020/09/CCW_GGE.1_2019_3_E.pdf.

[17] Report of the 2023 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems. CCW/GGE.1/2023/2. Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 6-10 March and 15-19 May 2023 // UNODA. URL: https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2023)/CCW_GGE1_2023_2_Advance_version.pdf.

[18] Roff H.M., Moyes R. Meaningful Human Control, Artificial Intelligence and Autonomous Weapons / Briefing Paper Prepared for the Informal Meeting of Experts on Lethal Autonomous Weapons Systems, UN Convention on Certain Conventional Weapons, April 2016 // Article 36, 2016. URL: https://article36.org/wp-content/uploads/2016/04/MHC-AI-and-AWS-FINAL.pdf.

[19] Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons which may be deemed to be Excessively Injurious or to have Indiscriminate Effects (with Protocols I, II and III) // United Nations Treaty Collection, October 10, 1980. URL: https://treaties.un.org/Pages/ViewDetails.aspx?src=TREATY&mtdsg_no=XXVI-2&chapter=26&clang=_en.

[20] A Guide to the Legal Review of New Weapons, Means and Methods of Warfare. Measures to Implement Article 36 of Additional Protocol I of 1977 // The International Committee of the Red Cross, June 10, 2020. URL: https://www.icrc.org/en/publication/0902-guide-legal-review-new-weapons-means-and-methods-warfare-measures-implement-article; Report of the 2023 Session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems. CCW/GGE.1/2023/CRP.2. Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, Geneva, 6-10 March and 15-19 May 2023 // UNODA. URL: https://docs-library.unoda.org/Convention_on_Certain_Conventional_Weapons_-Group_of_Governmental_Experts_on_Lethal_Autonomous_Weapons_Systems_(2023)/CCW_GGE1_2023_CRP.2_12_May.pdf.

[21] Structuring Debate on Autonomous Weapon Systems. Memorandum for Delegates to the Convention on Certain Conventional Weapons (CCW) // Article 36 Briefing Paper, 2013. URL: https://www.article36.org/wp-content/uploads/2013/11/Autonomous-weapons-memo-for-CCW.pdf.

[22] Sandoz Y. et al., Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949. International Committee of the Red Cross (1986) // ICC Legal Tools Database. URL: https://www.legal-tools.org/doc/6d222c/.

[23] Declaration Renouncing the Use, in Time of War, of Explosive Projectiles Under 400 Grammes Weight. Saint Petersburg, 29 November / 11 December 1868 // International Humanitarian Law Database. URL: https://ihl-databases.icrc.org/en/ihl-treaties/st-petersburg-decl-1868.

[24] Marsh N. Defining the Scope of Autonomy // PRIO Policy Brief, 2014. № 2. Oslo: PRIO.
URL: https://www.prio.org/publications/7390.