Autonomous weapon systems: the need for meaningful human control

March 2, 2016 - nr.97
Summary

Summary, conclusions and recommendations


1.   Summary and conclusions

Definition
Autonomy has been a feature of offensive weapons (e.g. fire-and-forget missiles) and
defensive weapons (such as Patriot surface-to-air missiles) for decades. As yet, however,
there is no internationally agreed definition of an autonomous weapon. Any workable
definition will have to make a clear distinction between existing weapons with autonomous
functions and future autonomous weapons.

For the purpose of this report, an autonomous weapon is defined as:
A weapon that, without human intervention, selects and attacks targets matching certain
predefined characteristics, following a human decision to deploy the weapon on the
understanding that an attack, once launched, cannot be stopped by human intervention.

The person operating the weapon does not know which specific target will be attacked, but
the type of target is pre-programmed. A weapon is only autonomous if the critical functions
for using potentially lethal force – namely ‘target selection’ and ‘target engagement’ – are
performed autonomously, keeping humans out of the loop. The term ‘loop’ refers to the
decision-making process for selecting and attacking targets. This may cover only the critical
processes (target selection and engagement) carried out autonomously by the weapon (the
narrow loop) or the entire targeting process in which humans play a decisive role (the wider
loop). At present, there are only a few weapon systems that leave humans out of the narrow
loop. These include the Israeli Harpy unmanned combat aerial vehicle, which is designed to
attack enemy radar systems.

The AIV/CAVV believes that the term ‘loop’ should be interpreted in its wider sense. After
all, prior to the process whereby it selects and attacks a specific target, a weapon is
deployed and programmed by humans. This involves decisions on target selection, which
are part of a wider targeting process that includes such tasks as formulating objectives,
target selection, weapon selection and implementation planning. NATO has a standard
procedure for this purpose that also takes account of the potential consequences for
civilian populations. Over the next few decades, humans will remain responsible for the
decision whether or not to deploy weapons.

Current and future deployment of autonomous weapons
Deploying autonomous weapons can provide key advantages. For example, computers
collect and process data faster than humans, thus facilitating effective defence against
incoming missiles. In addition, autonomous weapons can to some degree replace humans
on the battlefield, thereby reducing the risk to friendly troops. They can also operate in
environments where humans cannot survive, for example due to high pressure, extreme
temperatures or lack of oxygen. Autonomous weapons can also help limit the number of
casualties among civilians and friendly military personnel. Over the next few decades,
these weapons will most likely be developed and deployed to attack specific types of
targets or carry out defensive tasks.

It is highly unlikely that autonomous weapon systems will entirely or substantially take over
the role of humans on the battlefield. Rather, it is thought they will be deployed alongside
troops and existing weapon systems and in coordination with other military and civilian
technologies. This is because the nature of modern conflicts complicates the deployment
of such systems. One characteristic of these conflicts is that military targets are
increasingly located in predominantly civilian areas. In many cases, moreover, the parties
to a conflict deliberately do not distinguish themselves clearly from non-combatants. This
often makes it difficult to deploy autonomous weapons. A second characteristic of modern
conflicts is the importance of winning the hearts and minds of the local population. That
is another reason why autonomous weapons are expected to play a limited role in modern
conflicts, while humans will continue to play a crucial one.

Long-term developments regarding autonomous weapons are largely dependent on
advances in the field of artificial intelligence. A weapon system that has the capacity
to learn, formulate its own rules of conduct and independently adjust to changes in
its environment would be fully autonomous and place humans beyond the wider loop.
Such ‘self-aware’ systems, which do not exist at present, would effectively be beyond
human control. The AIV/CAVV considers it unlikely that fully autonomous weapons that
are designed to function without any human control will be developed within the next
few decades. If that were to happen, these weapons would be programmed to perform
the entire targeting process autonomously, from formulating the military objective to
determining the time and place of deployment. Setting aside the question of technological
feasibility, the AIV/CAVV fails to see why a state would want to develop or commission
such a weapon.

The concept of meaningful human control has attracted a lot of attention in recent
years as a result of factors including fears aroused by the idea of fully autonomous
weapon systems. The increasing complexity of autonomous systems may ultimately
lead to a partial or near-complete loss of human control. Because this possibility cannot
be excluded, the AIV/CAVV believes that it needs to be taken seriously. It is therefore
important to keep a close eye on developments in the fields of artificial intelligence and
robotics.

Legal framework governing the admissibility of autonomous weapons and their
deployment

International law prohibits the use of force between states, except in cases listed in the
UN Charter. States may use force (1) for the purpose of self-defence, (2) under a UN
Security Council mandate or (3) with the permission of the state where force is being
used. Whether or not the use of force involves the deployment of autonomous weapons
makes no difference in this context.

International humanitarian law prohibits the use of weapons if it is impossible, when
deploying them, to distinguish between military targets on the one hand and civilians
and civilian objects on the other; if they cause superfluous injury or unnecessary
suffering among enemy combatants; or if the effects of their deployment cannot be
controlled in a manner prescribed by international humanitarian law, resulting in
indiscriminate harm to military personnel and civilians. There is no reason to assume that
autonomous weapons by definition fall under any one of these categories. Under article 36
of the First Additional Protocol to the Geneva Conventions, states are obliged to determine
whether new weapons are compatible with the requirements of international humanitarian
law. The question whether a specific autonomous weapon falls under one of the categories
of prohibited weapons therefore needs to be assessed on a case-by-case basis.

Apart from certain specific arms control treaties, there are two legal regimes that regulate
the use of force: international humanitarian law and human rights law. Armed conflicts are
regulated by international humanitarian law, which imposes certain requirements for the
deployment of weapons: application of the principles of distinction (between military and
civilian objects), proportionality (weighing military advantage against expected collateral
damage) and precaution (protecting civilians and civilian objects as much as possible).
In certain specific situations, such as the deployment of autonomous weapons on the
high seas, under water, in the air or in sparsely populated areas, the requirements of
international humanitarian law will generally be satisfied. However, in many other cases, at
least for the next decade, the deployment of autonomous weapons may be complicated
by a lack of prior certainty as to whether the requirements of distinction, proportionality
and precaution can be met. The question whether autonomous weapons can be
deployed without violating international humanitarian law is therefore highly dependent
on context. During the targeting process, military personnel in the wider loop will have to
determine whether deploying autonomous weapons in a specific context can be justified in
accordance with the requirements of international humanitarian law.

The above-mentioned legal regimes apply to the use of all types of force, and there is no
reason to assume that this would be any different for autonomous and fully autonomous
weapons. When deploying such weapons, therefore, states and individuals are obliged
to ensure compliance with these rules of law. The AIV/CAVV believes that discussing
whether autonomous weapons might be able to perform this task themselves one day is
a hypothetical exercise. From an international humanitarian law perspective, it makes no
difference whether or not they would be able to do so, since the same legal requirements
continue to apply to the deployment of all weapons.

Accountability
The AIV/CAVV believes that the existing legal regime, as described above, is an adequate
formal legal framework for holding offenders accountable. There is no accountability gap
as regards the deployment of autonomous weapons, as long as the decision to deploy,
taken in the framework of the targeting process, remains with humans. At any rate, there
is no reason to assume that there will be any erosion of the liability under criminal law of
commanders, subordinates or those in positions of political or administrative responsibility
during the next decade. They are responsible for deciding whether deploying and
activating autonomous weapons in a given context is consistent with the requirements of
international humanitarian law and ethically justified. Likewise, there are no gaps in state
responsibility as regards the deployment of autonomous weapons.

However, compared to the deployment of weapons that require continuous human
operation, such as those employed by a rifleman or by a fighter pilot during aerial combat,
there is a shift in accountability in the case of autonomous weapons. This is because
the deployment of autonomous weapons does not involve a decision to attack a specific
target; rather, that decision is implicit in the decisions to deploy and activate them. As a
result, accountability lies primarily with the commander who decides to deploy the weapon
and the soldier who activates it, as opposed to a soldier who selects and attacks specific
targets. This means that commanders and soldiers who are involved in the deployment
of autonomous weapons must be well trained and well informed as regards their
potential effects. They are required to make judicious decisions concerning distinction,
proportionality and precaution without knowing which specific targets will be attacked. In
other words, there has to be meaningful human control.

The basic norms of international humanitarian law strictly regulate the deployment
of autonomous weapons. Any deployment that does not comply with these norms
is therefore unlawful. As a result, commanders can actually be held accountable for
reckless deployment of autonomous weapons that results in violations of international
humanitarian law. Factors such as the interval between the weapon’s activation (i.e. the
last moment at which distinction, proportionality and precaution can be considered) and
the actual attack on a target, as well as the complex nature of autonomous weapons,
give rise to a need for greater restraint in their deployment. At the same time, these
factors cannot be invoked to evade accountability by arguing that certain consequences
were unforeseeable.

Meaningful human control
The AIV/CAVV prefers the concept of meaningful human control to the terms ‘judgment’
and ‘predictability’. International consensus also seems to be emerging on the usefulness
of this concept. Although there is no general agreement on its definition, it is widely
acknowledged that the concept can serve as a criterion for distinguishing between
acceptable and unacceptable types of autonomous weapons and deployment.

Despite the lack of an internationally agreed definition of the concept of meaningful
human control, it already plays a key role in public acceptance of weapon systems that
independently select and attack targets. The AIV/CAVV adheres to the principle that
humans should be responsible for all decisions concerning the use of lethal force.
Meaningful human control implies that humans make informed, conscious choices
regarding the use of weapons, based on adequate information about the target, the
weapon in question and the context in which it is to be deployed. In addition, the design of
the weapon, its testing in a realistic operational environment and the training of those who
operate it should all be geared to ensuring meaningful human control. Incidentally, these
requirements apply to all weapons.

The AIV/CAVV relates meaningful human control to the entire targeting process (the wider
loop), as decisions concerning the selection and engagement of targets are taken at
various stages of this process, even in cases involving the deployment of autonomous
weapons. Meaningful human control is intended to guarantee well-founded ethical and
legal decisions concerning the use of potentially lethal force. Moreover, it is
possible in principle to attribute responsibility and accountability to individuals if humans
have control over autonomous weapons. Meaningful human control is thus instrumental to
compliance with the requirements of international humanitarian law and ethical principles
and the attribution of responsibility and accountability.

The AIV/CAVV believes that the concept of meaningful human control should be regarded
as a standard deriving from existing legislation and practices (such as the targeting
process), which means that there is no need for new or supplementary legislation. It does
not have to become a new norm within international law. The concept of meaningful human
control can serve as a benchmark when assessing compatibility with article 36 of the First
Additional Protocol to the Geneva Conventions. In addition, it can be useful in identifying
potential violations of international humanitarian law as a result of the deployment of
such weapons. The procedure for assessing the compatibility of autonomous weapons
with article 36 should also examine whether the degree to which human control has
been incorporated into the design of the weapon in question offers adequate guarantees
of compliance with international law. It is therefore important to achieve international
consensus on the precise definition and meaning of the concept of meaningful human
control.

An interpretative guide could clarify the current legal landscape with regard to the
deployment of autonomous weapons. The process leading to such a document might also
promote consensus on the concept of meaningful human control. For example, it could
list best practices – insofar as the classification levels of national systems and procedures
permit – on such issues as the role of meaningful human control in the article 36 procedure and
in relation to the deployment of autonomous weapons. Such a guide, which would be
informative and educational, could conceivably be produced within the framework of the
Convention on Certain Conventional Weapons (CCW).

Ethics and autonomous weapons
National and international law are based on ethical principles, which are broader in scope
than the law. The AIV/CAVV believes that as long as the deployment of autonomous
weapons is subject to meaningful human control, ethical issues (such as human dignity)
will not give rise to any problems. Within the wider loop, humans are responsible for
making a balanced decision to deploy autonomous weapons for the purpose of eliminating
enemy units and objects. The use of potentially lethal force is intentional in such cases,
even if the targets are selected and attacked by an autonomous weapon. Deploying
autonomous weapons with meaningful human control can spare military lives on the
battlefield and help prevent or limit civilian casualties. Nevertheless, the number of
situations in which such weapons can be deployed in a responsible manner is expected to
be limited.

In the future, advances in the field of artificial intelligence could undermine human control
over autonomous weapons. This might happen, for example, if self-learning systems were
able to modify their own rules of conduct. The AIV/CAVV believes that this will not happen
within the next few decades. It also believes that autonomous weapons should not be
used if humans no longer have meaningful control over them. The AIV/CAVV therefore
attaches great importance to the discussions that are currently taking place within the
CCW framework on the legal, ethical, technological and policy implications of long-term
developments in the field of autonomous and fully autonomous weapons. The issue is
also being discussed within NATO, and the Netherlands should actively contribute to this
debate.

A moratorium?
In April 2013, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions,
Professor Christof Heyns, called for a moratorium on ‘at least the testing, production,
assembly, transfer, acquisition, deployment and use of LARs [lethal autonomous robots]
until such time as an internationally agreed upon framework on the future of LARs has
been established’. During the CCW’s informal meeting of experts on lethal autonomous
weapon systems in April 2015, he highlighted the importance of meaningful human
control: ‘As long as they [autonomous weapon systems] are good tools, in the sense that
humans exercise meaningful control over them, they can and should be used in an armed
conflict situation. […] If they are no longer tools in the hands of humans, they should not
be used.’

Over the next ten years and, in all likelihood, the next few decades, autonomous weapon
systems will probably not fall under any of the categories of prohibited weapons,
which means their use can and must comply with the existing legal framework and the
relevant ethical principles (such as those that have been recognised and enshrined in
international humanitarian law and rules of engagement). The technology in question is
therefore neither unlawful nor unethical. Its use, however, may be either, but this applies
to all weapons. The AIV/CAVV anticipates that autonomous weapons will remain under
meaningful human control for the next ten years at least. This provides ample opportunity
to ensure compliance with international law and respect for human dignity. The AIV/
CAVV believes it is important to continue investing in research in the field of autonomous
weapons. In order to gain proper insight into their ethical, legal and technological aspects,
a thorough understanding of these systems and their development is crucial.

The AIV/CAVV believes that there are various practical objections to a moratorium or a
ban. Much of the relevant technology is being developed for peaceful purposes in the
civilian sector and has both civilian and military (dual-use) applications. It is therefore
difficult to draw a clear distinction between permitted and prohibited technologies. In
addition, there is no international consensus on the definition of the relevant concepts.
The question thus becomes: a moratorium on what? A non-proliferation regime would
also be hard to enforce, as it would be difficult to establish the existence of ‘weapons’ in
the case of dual-use technology and readily available programming languages. Countries
would not be able to trust that other countries were respecting the agreement. During
the CCW’s informal meetings of experts in April 2015, it became apparent that there was
no support among states for a moratorium or a ban. Only five countries (Cuba, Ecuador,
Egypt, the Holy See and Pakistan) indicated that they would support such an initiative. A
treaty establishing a moratorium or a ban is not viable without widespread support. For
these reasons, the AIV/CAVV currently regards this option as inexpedient and unfeasible.
However, it cannot rule out that advances in the field of artificial intelligence and robotics
might necessitate revision of this position in the future.

2.    Recommendations

  1. The AIV/CAVV believes that if the Dutch armed forces are to remain technologically advanced, autonomous weapons will have a role to play, now and in the future. However, as explained in this report, the deployment of such weapons must always involve meaningful human control.
     
  2. The AIV/CAVV considers it important to distinguish between autonomous weapon systems (in which humans play a crucial role in the wider loop) and fully autonomous weapon systems (in which humans are beyond the wider loop and there is no longer any human control).
     
  3. The AIV/CAVV believes that the Netherlands should remain actively involved in discussions within the CCW framework on the legal, ethical and policy implications of developments in the field of autonomous weapon systems. It also stresses the importance of conducting a public debate on new technologies and advises the government to maintain close contacts with NGOs, the scientific community and other interested parties regarding this issue.
     
  4. The AIV/CAVV is of the opinion that participants in the upcoming CCW meetings should reach agreement on the definition of autonomous weapons and the concept of meaningful human control as quickly as possible. NATO members should also seek to coordinate their positions on this issue. The AIV/CAVV believes it is important that for the purpose of these discussions the decision-making ‘loop’ be interpreted as relating to the entire targeting process in which humans play a decisive role and not merely to the narrow loop of critical processes – target selection and engagement – that autonomous weapons perform independently.
     
  5. The AIV/CAVV advises the government to use the upcoming CCW meetings to advocate a more widespread implementation of the article 36 procedures at national level, greater transparency concerning the outcomes of these procedures and more international information sharing.
     
  6. The AIV/CAVV believes that, when procuring autonomous weapons, the government should strictly apply the procedure relating to article 36 of the First Additional Protocol to the Geneva Conventions. It further believes that the concept of meaningful human control should serve as a benchmark for this purpose. In the AIV/CAVV’s opinion, the Dutch Advisory Committee on International Law and the Use of Conventional Weapons should play a key role in advising the Dutch government on the compatibility of specific autonomous weapons with existing and emerging rules of international law, in particular international humanitarian law.
     
  7. In light of the importance of attributing responsibility and accountability, the AIV/CAVV believes that, when procuring autonomous weapons, the government should ensure that the concept of morally responsible engineering is applied during the design stage.
     
  8. The AIV/CAVV believes that, when procuring autonomous weapons, the government should ensure that they are extensively tested under realistic conditions.
     
  9. The AIV/CAVV advises the government to ensure that ethics training programmes for military personnel, in particular commanders, devote attention to ethical issues relating to the deployment of autonomous weapons.
     
  10. The AIV/CAVV advises the government to push internationally (especially within the CCW framework) for a process that will lead to the formulation of an interpretative guide that clarifies the current legal landscape with regard to the deployment of autonomous weapons. Such a document, which would be informative and educational, could list best practices on such issues as the role of meaningful human control in the article 36 procedure and in relation to the deployment of autonomous weapons.
     
  11. In light of the rapid advances in the fields of robotics and artificial intelligence and the ongoing international debate (especially within the CCW framework) on the legal, ethical and policy implications of autonomous weapon systems, the AIV/CAVV advises the government to review the relevance of this advisory report in five years’ time.
Advice request

Professor J.G. de Hoop Scheffer
Chairman of the Advisory Council on International Affairs
Bezuidenhoutseweg 67
P.O. Box 20061
2500 EB The Hague

Professor W.G. Werner
Chairman of the Advisory Committee on Issues of Public International Law
Bezuidenhoutseweg 67
P.O. Box 20061
2500 EB The Hague

Date            7 April 2015
Re               Request for advice on autonomous weapons systems

Dear Professor De Hoop Scheffer and Professor Werner,

In order to be able to respond to current and future threats, the armed forces must continue to innovate. They must therefore make use of the latest technologies, including robotics and information technology. It is often the civilian sector that is at the forefront of new advances in this area.

For some time now, the armed forces have been using systems that can to a large extent operate automatically, such as the ship-based Goalkeeper close-in weapons system (CIWS) and Patriot surface-to-air missiles. The degree to which these systems are set to ‘automatic’ by their operators depends on the security environment and the threat situation. The greater the threat and the shorter the response time, the more automatically these systems need to operate in order to be effective, though they are continuously monitored by their operators.

Rapid technological advances are reinforcing the trend towards computerised – or in some cases autonomous – functions in a wide range of products, including weapons systems. The future development of fully autonomous weapons systems with artificial intelligence that are capable of selecting targets and applying potentially lethal force without human intervention is no longer a fanciful idea.

Although fully autonomous weapons systems do not yet exist, a debate about their legal, ethical and policy implications has arisen in the international arena. In 2013, the UN Human Rights Council's Special Rapporteur on extrajudicial, summary or arbitrary executions, Christof Heyns, published a report on lethal autonomous robots (LARs) that addresses these issues. In addition, several NGOs have joined forces in the international Campaign to Stop Killer Robots to draw attention to the potential consequences of developing autonomous weapons systems.

The government articulated its position on autonomous weapons systems in its letter to parliament of 26 November 2013 (Parliamentary Papers 33 750 X, no. 37), in which it stated that the Dutch armed forces are not developing such systems and that they have no plans to do so. It reiterated the guiding principle that all weapons systems and their deployment in armed conflicts have to comply with all the relevant requirements of international law. Under article 36 of the First Additional Protocol to the Geneva Conventions, the government is obliged to determine whether new weapons and new methods of warfare are compatible with international law. For this purpose, it created the Advisory Committee on International Law and the Use of Conventional Weapons (AIRCW).

In other words, the acquisition or deployment of autonomous weapons systems is prohibited if the relevant requirements of international law cannot be met. In a previous advisory report, the Advisory Committee on Issues of Public International Law (CAVV) stated that ‘the deployment of any weapons system, whether or not it is wholly or partly autonomous, remains subject to the same legal framework’ (Advisory report no. 23: Armed Drones, July 2013, p. 9).

The government wants to encourage debate on autonomous weapons systems. For example, the Netherlands is funding research on the issues raised by these systems. This month, moreover, it is attending the second Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS) under the UN Convention on Certain Conventional Weapons (CCW). At the first meeting of experts in May 2014, consensus seemed to be emerging on the introduction of the concept of ‘meaningful human control’ as a factor in determining whether or not an autonomous weapons system complies with ethical norms. Another issue discussed was whether the fact that a weapons system is potentially incompatible with ethical norms automatically means that it is in violation of international law.

However, opinions on what constitutes ‘meaningful human control’ differ widely, and further investigation is required to clarify this concept. In addition, it is worth investigating whether other concepts might be helpful in examining the compatibility of autonomous weapons systems with ethical norms.

In light of the above, the government has formulated the following questions for the AIV and the CAVV:

  1. What role can autonomous weapons systems (and autonomous functions within weapons systems) fulfil in the context of military action now and in the future?
     
  2. What changes might occur in the accountability mechanism for the use of fully or semi-autonomous weapons systems in the light of associated ethical issues? What role could the concept of ‘meaningful human control’ play in this regard, and what other concepts, if any, might be helpful here?
     
  3. In its previous advisory report, the CAVV states that the deployment of any weapons system, whether or not it is wholly or partly autonomous, remains subject to the same legal framework. As far as the CAVV is concerned, there is no reason to assume that the existing international legal framework is inadequate to regulate the deployment of armed drones. Does the debate on fully or semi-autonomous weapons systems give cause to augment or amend this position?
     
  4. How do the AIV and the CAVV view the UN Special Rapporteur’s call for a moratorium on the development of fully autonomous weapons systems?
     
  5. How can the Netherlands best contribute to the international debate on this issue?

The government would appreciate receiving the report in time for the parliamentary budget debate this autumn.

We look forward to receiving your report.

Bert Koenders
Minister of Foreign Affairs

Jeanine Hennis-Plasschaert
Minister of Defence

Government reactions

Government response to AIV/CAVV advisory report no. 97, Autonomous weapon systems: the need for meaningful human control

Introduction

The government thanks the Joint Committee of the Advisory Council on International Affairs (AIV) and the Advisory Committee on Issues of Public International Law (CAVV) (‘the advisory committee’) for its timely and thorough advisory report on the legal, ethical and policy issues raised by the increase in autonomous functions in weapon systems. The government believes the conclusions and recommendations support and complement its current policy. The main conclusion of the report is that meaningful human control is required in the deployment of autonomous weapon systems. The government concurs with this view. The following is a detailed commentary on the advisory report.

Definitions: autonomy and meaningful human control

The government’s guiding principle is that all weapon systems, and their deployment in armed conflict, must comply with the requirements set by international law. As the government said in its letter to parliament of 26 November 2013, this of course also applies to autonomous weapon systems.1 There is as yet no internationally agreed definition of an autonomous weapon system. The advisory committee has adopted the following working definition:

A weapon that, without human intervention, selects and attacks targets matching certain predefined characteristics, following a human decision to deploy the weapon on the understanding that an attack, once launched, cannot be stopped by human intervention.2

This definition indicates that, although humans can no longer intervene once the weapon has been deployed, they do play a prominent role in programming the characteristics of the targets that are to be engaged and in the decision to deploy the weapon. Humans are thus involved in the ‘wider loop’ of the decision-making process. This means that humans continue to play a crucial role in the wider targeting process. An autonomous weapon as defined above is therefore only deployed after human consideration of aspects such as target selection, weapon selection and implementation planning, including an assessment of potential collateral damage. In addition, the autonomous weapon is programmed to perform specific functions within pre-programmed conditions and parameters. Its deployment is followed by a human assessment of the effects. Assessments of potential collateral damage (proportionality) and accountability under international humanitarian law are of key importance in this respect.

Meaningful human control
The advisory committee states that if the deployment of an autonomous weapon system takes place in accordance with the process described above, there is meaningful human control. In such cases, humans make informed, conscious choices regarding the use of weapons, based on adequate information about the target, the weapon in question and the context in which it is to be deployed. The committee also points out that there is increasing recognition of this concept as a criterion by which to distinguish between acceptable and unacceptable autonomous weapon systems and deployment.

The advisory committee sees no immediate reason to draft new or additional legislation for the concept of meaningful human control. The concept should be regarded as a standard deriving from existing legislation and practices (such as the targeting process). The government supports the definition given above of an autonomous weapon system, including the concept of meaningful human control, and agrees that no new legislation is required.

The advisory committee does, however, recommend that multilateral efforts be made to develop an internationally agreed definition (recommendation no. 4):

The AIV/CAVV is of the opinion that participants in the upcoming CCW meetings should reach agreement on the definition of autonomous weapons and the concept of meaningful human control as quickly as possible. NATO members should also seek to coordinate their positions on this issue. The AIV/CAVV believes it is important that for the purpose of these discussions the decision-making ‘loop’ be interpreted as relating to the entire targeting process in which humans play a decisive role and not merely to the narrow loop of critical processes – target selection and engagement – that autonomous weapons perform independently.

The government agrees that definitions should be agreed on (in accordance with recommendation no. 4). To this end, at the annual meeting of the Convention on Certain Conventional Weapons (CCW) on 12 November 2015 the Netherlands called for the establishment of a Group of Governmental Experts (GGE). One of the tasks of the GGE would be to further study the concept of meaningful human control. There was not enough support for the establishment of a GGE at the meeting, as a number of countries felt the subject was not yet ripe for such a step. The CCW did, however, decide to hold an expert meeting on autonomous weapon systems for the third consecutive year, in April 2016. The Netherlands will participate again and will work to achieve further clarity and consensus on the matter of definitions and on the concept of meaningful human control. In view of the importance the government attaches to the concept of meaningful human control, the Ministry of Foreign Affairs and the Ministry of Defence are funding a doctoral research project on this subject. The project began in 2015, under the auspices of VU University Amsterdam.

Future deployment of autonomous weapon systems under human control

General
The advisory committee points out that there may be key military advantages to autonomous weapon systems, as long as there is meaningful human control in the wider loop of the decision-making process. For example, computers often respond faster and more accurately than humans, which may reduce the risk to friendly units and the civilian population. These systems are often also able to operate in environments that are dangerous to humans, or difficult to reach. It is therefore to be expected that such weapon systems will be developed around the world over the next few decades and deployed for offensive and defensive tasks.

The advisory committee does not expect, however, that autonomous weapon systems will entirely or substantially take over the role of humans on the battlefield. The nature of modern conflicts, which often take place in predominantly civilian areas, complicates the deployment of these weapon systems, in the committee’s view. It is likely that autonomous weapon systems will be deployed for specific tasks alongside military personnel and existing weapon systems and other military and civilian technology (in a complementary capacity). Recommendation no. 1 in the report touches on this subject:

The AIV/CAVV believes that if the Dutch armed forces are to remain technologically advanced, autonomous weapons will have a role to play, now and in the future. However, as explained in this report, the deployment of such weapons must always involve meaningful human control.

The government shares this view. The Dutch armed forces must also take due account of the possibility that opponents, including non-state actors, will deploy autonomous weapon systems against them, and they will need to be able to respond adequately. More generally, the armed forces must therefore know what is being developed and what is on the market in terms of autonomous weapon systems. When the Ministry of Defence updates its Strategic Knowledge and Innovation Agenda (SKIA), it will of course devote attention to autonomous systems and the relationship between humans and machines.

Design phase
The advisory committee advocates taking the interaction between humans and machines into account sufficiently in the design phase of autonomous weapon systems (recommendation no. 7). The design must be such that relevant information required by humans in order to exercise meaningful human control over the deployment of the weapon system is provided clearly and in a timely fashion.

In light of the importance of attributing responsibility and accountability, the AIV/CAVV believes that, when procuring autonomous weapons, the government should ensure that the concept of morally responsible engineering is applied during the design stage.

The government considers this recommendation to be an affirmation of existing policy. The above-mentioned update of the SKIA will identify the relationship between humans and machines as a key theme. The government and several of its knowledge partners are studying this theme. For instance, a financial contribution has been made to the research programme of the Netherlands Organisation for Scientific Research (NWO) on corporate social responsibility, entitled Responsible Innovation. The project entitled Military Human Enhancement: Design for Responsibility and Combat Systems, carried out by Delft University of Technology, was part of this research programme.3 Its researchers likewise concluded that the need to attribute responsibility and accountability must be taken into account in the design stage of weapon systems.

Test phase
On a similar note, the advisory committee points out the importance of testing autonomous weapon systems extensively in a realistic environment, to ensure that the weapon will always remain under meaningful human control (recommendation no. 8).

The AIV/CAVV believes that, when procuring autonomous weapons, the government should ensure that they are extensively tested under realistic conditions.

This is standard policy. In view of the increasing complexity of weapon systems, among other factors, considerable attention is already devoted to such matters during procurement, by both defence personnel and the civilian parties involved (businesses).

Aspects of international law

The advisory committee concludes that there is no reason to assume that autonomous weapon systems under meaningful human control by definition fall into one of the categories of weapons that are banned under international humanitarian law. It rightly points out that those systems are under human control, placing the ultimate responsibility for their deployment with humans.

Article 36
The advisory committee believes that, in assessing whether autonomous weapons are under meaningful human control, there is an important role for the article 36 procedure. This is a procedure that is based on article 36 of the First Additional Protocol to the Geneva Conventions, which obliges States Parties involved in the development or acquisition of new means or methods of warfare to determine whether they are permitted under international law. The Ministry of Defence has established an Advisory Committee on International Law and the Use of Conventional Weapons (AIRCW) for this purpose, whose task it is to advise the Minister of Defence on such matters. Recommendation no. 6 in the report is devoted to this subject:

The AIV/CAVV believes that, when procuring autonomous weapons, the government should strictly apply the procedure relating to article 36 of the First Additional Protocol to the Geneva Conventions. It further believes that the concept of meaningful human control should serve as a benchmark for this purpose. In the AIV/CAVV’s opinion, the Dutch Advisory Committee on International Law and the Use of Conventional Weapons should play a key role in advising the Dutch government on the compatibility of specific autonomous weapons with existing and emerging rules of international law, in particular international humanitarian law.

The government concurs with these views. When weapon systems with autonomous functions are procured it will be made explicitly clear that the AIRCW has assessed them in the context of international law.

The advisory committee makes the following recommendations with regard to article 36 (recommendation no. 5) and the application of existing law in the deployment of autonomous weapons (recommendation no. 10):

The AIV/CAVV advises the government to use the upcoming CCW meetings to advocate a more widespread implementation of the article 36 procedures at national level, greater transparency concerning the outcomes of these procedures and more international information sharing.

The AIV/CAVV advises the government to push internationally (especially within the CCW framework) for a process that will lead to the formulation of an interpretative guide that clarifies the current legal landscape with regard to the deployment of autonomous weapons. Such a document, which would be informative and educational, could list best practices on such issues as the role of meaningful human control in the article 36 procedure and in relation to the deployment of autonomous weapons.

In 2014 and 2015 the government drew attention to the importance of applying article 36, in the CCW framework and elsewhere. The Netherlands’ efforts contributed to international information-sharing and greater transparency concerning the application of article 36, and the Netherlands will continue to focus strongly on this subject.

At the next CCW expert meeting on autonomous weapon systems in April 2016, the Netherlands will propose the formulation of an interpretative guide in the CCW framework.

Legal accountability

The advisory committee states that there is no accountability gap if humans exercise meaningful human control in the wider loop of the decision-making process for deploying autonomous weapon systems. In that case the existing legal regime is adequate to hold offenders accountable, as there is no change in the accountability of commanders, subordinates or those in positions of political or administrative responsibility who make the decisions. Likewise, state responsibility remains unchanged in the event of deployment of autonomous weapon systems under human control, according to the advisory committee.

The government agrees with this conclusion and emphasises the importance of training and education for military personnel who are responsible for the deployment of autonomous weapon systems. The Ministry of Defence already sets such training and education as a precondition for the operational deployment of weapon systems that operate with a high degree of autonomy, such as the shipborne Goalkeeper system and the Patriot surface-to-air missiles. The same will apply to future weapon systems.

Ethical principles

The international debate on autonomous weapon systems raises the question whether human dignity is violated when people are killed by fully autonomous machines. The report states that, as long as autonomous weapon systems are under human control, ethical issues (such as human dignity) will not give rise to serious problems, as they are no different from other weapon systems in this respect. In order to fully understand the ethical aspects of greater autonomy in weapon systems, the advisory committee considers ongoing public debate on the matter to be of great importance (recommendation no. 9):

The AIV/CAVV advises the government to ensure that ethics training programmes for military personnel, in particular commanders, devote attention to ethical issues relating to the deployment of autonomous weapons.

In current practice, military personnel are taught about the ethical issues that arise from the deployment of weapons. Many ethical issues and humanitarian considerations have already been codified in law, including in important sections of international humanitarian law and various human rights treaties. In addition, the specific Rules of Engagement (ROE) that are drawn up for every mission serve as guidelines for the deployment of weapons. The ROE are part of the wider loop of the targeting process (decision-making process). If there are ethical issues that are not covered by the ROE but are the responsibility of individual soldiers, the Ministry of Defence incorporates those issues into the ethics curriculums of the various training programmes.

Autonomous weapon systems must remain under human control

Weapons systems without meaningful human control in the wider loop of the targeting process (the decision-making process) are sometimes referred to as ‘fully autonomous weapon systems’. Such systems do not yet exist. The advisory committee considers it unlikely that states would consciously choose to develop or commission fully autonomous weapon systems in the next few decades. Even if it were to become technologically feasible, the committee sees no reason why a state would have that ambition. It recommends making a distinction between autonomous weapon systems under meaningful human control and fully autonomous weapon systems beyond meaningful human control (recommendation no. 2):

The AIV/CAVV considers it important to distinguish between autonomous weapon systems (in which humans play a crucial role in the wider loop) and fully autonomous weapon systems (in which humans are beyond the wider loop and there is no longer any human control).

The government strongly emphasises the importance of this distinction and rejects outright the possibility of developing and deploying fully autonomous weapons. It is the state’s responsibility to ensure that the deployment of any weapon system complies with the requirements of international law. With fully autonomous weapons, this would not be possible.

Moratorium

The UN Special Rapporteur on extrajudicial, summary or arbitrary executions, Professor Christof Heyns, concluded in his report of 2013 on Lethal Autonomous Robots (LARs) that these robots prompt many questions. He advised states to set up an international framework and, in the meantime, to impose a moratorium on LARs.4 At the CCW’s expert meeting on autonomous weapon systems in April 2015 he altered his advice somewhat. He concluded that there appears to be consensus on the importance of meaningful human control as a criterion. The Special Rapporteur believes states should be allowed to develop autonomous weapon systems that are under meaningful human control.5 However, he calls for a ban on fully autonomous weapon systems. The NGOs united in the international Campaign to Stop Killer Robots want a moratorium on fully autonomous weapons.6 In an open letter published in July 2015, over a thousand prominent scientists and entrepreneurs called for a ban on offensive autonomous weapon systems beyond meaningful human control.7

The advisory committee concludes too that autonomous weapon systems must always be under meaningful human control. However, it currently regards a moratorium on the development and deployment of fully autonomous weapons as inexpedient and unfeasible, for reasons explained below.

Expediency
The advisory committee anticipates that autonomous weapon systems will remain under meaningful human control for the next 10 years at least. According to the report, this provides ample opportunity to ensure compliance with international law and respect for human dignity. Weapon systems based on current technology, and on that expected over the next decade, will therefore be neither unlawful nor unethical. The committee also points out the possibility of increasing human control over autonomous weapons through technological or other means.

Feasibility
According to the advisory committee, a moratorium on the development and deployment of fully autonomous weapon systems is also impracticable. Much of the relevant technology has both civilian and military (dual-use) applications. It is therefore difficult to draw a clear distinction between permitted and prohibited technologies. The question thus becomes: a moratorium on what?

The report also rightly points out that, in part due to these issues, there is no support within the CCW for a moratorium. At the most recent CCW meeting only five out of the 121 members were in favour.

However, the advisory committee cannot rule out the possibility that advances in the field of artificial intelligence and robotics might necessitate a revision of its position in the future. Recommendation no. 11 deals with this subject:

In light of the rapid advances in the fields of robotics and artificial intelligence and the ongoing international debate (especially within the CCW framework) on the legal, ethical and policy implications of autonomous weapon systems, the AIV/CAVV advises the government to review the relevance of this advisory report in five years’ time.

As indicated in its response to the motion submitted by member of Parliament Rik Grashoff8 during the debate on the foreign affairs budget on 19 November 2015, the government understands the serious public concern regarding developments in robotics and artificial intelligence in relation to weapon systems. That is why the government sets great store by a careful assessment of this subject and why it submitted its request for advice to the AIV and the CAVV.

As indicated before, the government rejects outright the possibility of developing and deploying fully autonomous weapons, because they would be beyond meaningful human control.

However, the government shares the advisory committee’s view that a moratorium on fully autonomous weapon systems is currently unfeasible, a view shared by the International Committee of the Red Cross.9 The government therefore concurs with the arguments put forward by the advisory committee. It remains important, however, to continue to monitor this issue and, in view of the rapid technological developments, to review the advisory report in accordance with the recommendation in five years’ time.

The way ahead

Recommendation no. 3 in the report concerns the Netherlands’ role in the public debate on autonomous weapon systems:

The AIV/CAVV believes that the Netherlands should remain actively involved in discussions within the CCW framework on the legal, ethical and policy implications of developments in the field of autonomous weapon systems. It also stresses the importance of conducting a public debate on new technologies and advises the government to maintain close contacts with NGOs, the scientific community and other interested parties regarding this issue.

The government is implementing this recommendation. It has commissioned or funded various studies on the subject of autonomous weapon systems. The government also supports the study on autonomous weapon systems carried out by the UN Institute for Disarmament Research (UNIDIR). Funded by the Netherlands and others, UNIDIR has already produced four publications on the subject.10 The Netherlands will also continue to contribute to the international debate on this issue, acting along the lines set out in this response.

Response to the report ‘Mind the Gap: The Lack of Accountability for Killer Robots’

As promised to the House of Representatives,11 this government response will also discuss the report ‘Mind the Gap: The Lack of Accountability for Killer Robots’ by Human Rights Watch and Harvard Law School’s International Human Rights Clinic of April 2015. In their report the two NGOs warn of an accountability gap in the deployment of fully autonomous weapons. They use the following definition in this respect: the term ‘fully autonomous weapon’ is used to refer to both ‘out-of-the-loop’ weapons and weapons that allow a human on the loop, but with supervision that is so limited that the weapons are effectively ‘out of the loop’.12

The government agrees with Human Rights Watch and the International Human Rights Clinic that autonomous weapons must at all times remain under human control. In this respect the government uses the same definition as the advisory committee. When autonomous weapons are deployed, there must therefore be meaningful human control in the wider loop of decision-making. In such cases there is no accountability gap, as mentioned above in this government response. For the record, the government would point out once again that it rejects outright the development of fully autonomous weapons.

 
1 Letter to Parliament of 26 November 2013 (Parliamentary Paper 33 750X, no. 37).
2 In the opinion of the AIV/CAVV this reflects the definitions used by various international organisations. See footnote 12 of the report.
3 http://www.nwo.nl/onderzoek-en-resultaten/onderzoeksprojecten/i/67/6467.html.
4 Lethal Autonomous Robotics http://www.ohchr.org/Documents/HRBodies/HRCouncil/RegularSession/Session23/A-HRC-23-47_en.pdf.
5 http://www.ohchr.org/Documents/Issues/Executions/CCWApril2015.doc.
6 Campaign to Stop Killer Robots, see http://www.stopkillerrobots.org.
7 http://futureoflife.org/open-letter-autonomous-weapons/.
8 Grashoff motion on a moratorium on the development of fully autonomous weapon systems (34 300-V, no 34).
9 https://www.icrc.org/en/document/lethal-autonomous-weapons-systems-LAWS.
10 http://www.unidir.org/publications/emerging-security-threats.
11 House of Representatives reference no. 2015Z07211/ 2015D15399, 23 April 2015.
12 https://www.hrw.org/report/2015/04/09/mind-gap/lack-accountability-killer-robots (p. 6).
Press releases