
PIR Center Experts on the Prohibition of Military Robots

06.09.2018

MOSCOW, 11 SEPTEMBER 2018. PIR-PRESS. – “Autonomous weapons turn military men into romantics: they look forward to another revolution in military affairs. This means a new arms race, the temptation to dominate globally, the dismantling of deterrence theory, and instability in international relations,” says Dr. Vadim Kozyulin, Director of the PIR Center Project on New Technologies and International Security.

From August 27 through September 1, 2018, the Group of Governmental Experts established under the UN Convention on Certain Conventional Weapons held negotiations in Geneva on developing regulations to ban lethal autonomous weapon systems (LAWS). Delegations from 88 countries, including Russia, attended the meeting. During the negotiations, 26 states, including Austria, Argentina, Brazil, Bolivia, and the Vatican, supported the initiative to completely ban autonomous killer robots. This class of weapons spans a variety of types, from unmanned aircraft to tanks and air defense systems. Russia did not support the initiative. As explained by PIR Center Executive Board member and Deputy Head of the Kommersant newspaper foreign policy department Dr. Elena Chernenko, supporters of the ban on killer robots hoped that the meeting in Geneva would give the Group of Governmental Experts a mandate to develop a legally binding international treaty. However, skeptical states blocked the initiative. Mary Wareham, coordinator of the global Campaign to Stop Killer Robots and a representative of the human rights organization Human Rights Watch, names among them Russia, the United States, Australia, Israel, South Korea, and Japan.

The Russian delegation made it clear in advance that it would block this initiative. "The problem of lethal autonomous weapons systems [...] is rather raw, and many aspects of it still remain unclear. De facto, such systems do not exist yet; physically they are not yet in place. There is no universally accepted working definition of LAWS. Different countries, international organizations, and non-governmental organizations put their own understanding into this term, and for the moment the picture is quite chaotic," explained former advisor to the Department of Non-proliferation and Arms Control of the Russian MFA and PIR Center Advisory Board member Andrey Malov in his interview with the Kommersant newspaper.

According to Bakhtiyar Tuzmukhamedov, a professor of international law and a PIR Center Advisory Board member, the attempt to invoke the Martens Clause during the Geneva negotiations was particularly remarkable. In the absence of an agreed treaty, this clause could have, if not prohibited, at least limited the prospects of developing, deploying, and using autonomous weapons. The expert believes that international customs directly applicable to LAWS can hardly be found in contemporary international law. The Martens Clause, in fact, does not contain any specific normative content: it is essentially a call to start negotiations that could lead to the development of specific norms regulating concrete subject matter and the legal relations connected with it.

Dr. Vadim Kozyulin, Director of the PIR Center Project on New Technologies and International Security, emphasizes that the talks under the auspices of the UN addressed only one group of threats associated with the use of LAWS. These threats relate to the so-called problem of meaningful human control – the risks that arise once a human is excluded from the decision-making cycle on the use of weapons. However, the remaining two groups are equally important. “The second group of threats is associated with the risk of undermining strategic stability. Autonomous weapons turn military men into romantics: they look forward to another revolution in military affairs. This means a new arms race, the temptation to dominate globally, the dismantling of deterrence theory, and instability in international relations,” the expert points out.

The third group relates to the fact that with the introduction of systems based on artificial intelligence, the use of weapons will also be automated and outsourced to command-and-analysis systems themselves based on artificial intelligence. The speed of machine decision-making will leave the human with only the function of approving measures already conceived by the robot.

A decision on the future of the Group of Governmental Experts will be taken on November 23 at a meeting of the states parties to the UN Convention on Certain Conventional Weapons. According to Elena Chernenko, the decision may be a compromise: the group will continue to function but will not have a mandate to negotiate.

For questions related to the PIR Center Project on New Technologies and International Security, please contact Project Director Vadim Kozyulin by phone at +7 (499) 940 09 83 or by e-mail at kozyulin@pircenter.org.
