The Holy See is urging the international community to step up efforts for a legally binding instrument to address the legal and ethical issues raised by Lethal Autonomous Weapons Systems (LAWS), while promoting peaceful uses of AI for the benefit of the common good of humanity.
By Lisa Zengarini
The Holy See has expressed its disappointment over the minimal progress made by the Group of Governmental Experts (GGE) of the Convention on Conventional Weapons (CCW) to start negotiations for a legally binding instrument on Lethal Autonomous Weapons Systems (LAWS).
LAWS, also known as ‘killer robots’, are high-tech weapons powered by advanced artificial intelligence and programmed to attack targets without human assistance or control. A number of States, including the US and Russia, are investing significant sums in this new technology, which could shape the wars of the future.
Ethical and legal issues
Over the past years, discussions have taken place on starting negotiations for an international framework addressing the moral, ethical, humanitarian, and security issues posed by LAWS. Although a majority of States, including the Holy See, favour drawing a legal and moral line around autonomy in weapons, progress towards regulation has been slow due to resistance from some countries, including the United States, the United Kingdom, and Russia. These same countries constrained the modest outcome of the latest GGE meeting, which was unable to agree on a way forward last week ahead of the Sixth Review Conference of the CCW, taking place in Geneva from 13-17 December.
Speaking on Wednesday at the Conference, Msgr. John Putzer, chargé d’affaires of the Vatican’s Permanent Mission to the UN and other international organizations in Geneva, highlighted the Holy See’s disappointment with the results, reiterating the urgent need for an international framework to address the issues raised by LAWS.
The need for human supervision over weapons
“In the view of the Holy See it is imperative to ensure adequate, meaningful, and consistent human supervision over weapon systems”, Msgr. Putzer said, since “only humans are able to see the results of their actions and understand the connections between cause and effect”. This, he added, “would not be the case with LAWS which could never ‘understand’ the meaning of their actions”. The Vatican representative drew attention in particular to three considerations.
The first consideration is that only “adequate human supervision” can preserve ethical principles and ensure compliance with international humanitarian law in the management of weapons systems.
“Meaningful human supervision”, Msgr. Putzer further explained, “also implies that, ultimately, there is always the reference to the human person that must guide the research, development, and use of weapons systems, even in the absence of specific legal regulations, as implied by the ‘Martens Clause’”, which is aimed at offering some protection to individuals caught up in armed conflict even when there is no specific applicable rule of international humanitarian law [ed.].
Thirdly, Msgr. Putzer highlighted that “consistent human supervision entails that at no time the weapons systems would have the capacity to contradict what the human authority has prescribed as the main purpose or result of its intervention”.
The Holy See’s proposal
While reiterating its call to convene negotiations and, in the meantime, to establish a moratorium on the development and use of LAWS, the Holy See also proposes the establishment of an international organization for artificial intelligence, “to facilitate, and ensure the right of all States to participate in, the fullest possible exchange of scientific and technological information for peaceful uses and towards the common good of all the human family”.
“In the midst of the global pandemic, it is important to place emerging technologies at the service of humanity for peaceful uses and integral human development”, Msgr. Putzer concluded.