VATICAN CITY - At an open debate of the UN Security Council in New York, the Vatican's Secretary for Relations with States and International Organizations stressed the "urgent reality" of the risks associated with the development of artificial intelligence in the military sphere. He reiterated the Holy See's call for an "immediate moratorium" on lethal autonomous weapons and pointed to the dangers of using artificial intelligence in nuclear command and control systems.
Isabella H. de Carvalho - Vatican City
A "human-centred approach to the development and use of new technologies" must be applied, especially in the military field, because these technologies cannot replace "human judgement in matters of life and death", as they would cross boundaries "that must never be crossed". This is the appeal made by Archbishop Paul Richard Gallagher, Secretary for Relations with States and International Organizations, in his speech at an open debate held on 24 September at the UN headquarters in New York.
The meeting, organised by the UN Security Council, was dedicated to the topic of artificial intelligence and international peace and security. Archbishop Gallagher stressed that "unless the development and use of artificial intelligence are firmly rooted in respect for human dignity and the search for the common good", they risk becoming "instruments of division and aggression" and "fuelling further conflict". This is not an "abstract or remote concern" but "an urgent reality, given the current global instability and the rapid integration of artificial intelligence into conventional and nuclear weapons systems".
Moratorium on lethal autonomous weapons
As has often been stressed in recent years, the Secretary for Relations with States reaffirmed that the Holy See "strongly supports the adoption of an immediate moratorium on the development" of lethal autonomous weapon systems (LAWS). The implementation of these technologies "raises serious concerns" for the international community at the legal, humanitarian, ethical and security levels, because they are systems "without the human capacity" for moral judgment and ethical discernment. The Holy See, Archbishop Gallagher continued, is calling for the creation of "a legally binding instrument to ensure that life and death decisions remain under significant human control."
Archbishop Gallagher also described as "equally disturbing" the threat of "a new arms race characterised by the integration of artificial intelligence into military systems", including space technology and missile defence. According to the Vatican diplomat, this context threatens to "change the nature of weapons and war" and create "an unprecedented level of uncertainty." In particular, he continued, the use of "artificial intelligence in nuclear command and control systems" could lead to "new unknown risks that go far beyond the already fragile and morally questionable logic of deterrence."
Role of the Security Council
Finally, Archbishop Gallagher emphasized that the Security Council has "primary responsibility for the maintenance of international peace and security" and must therefore "pay particular attention to the scientific and technological advances" taking place in the world today. He then thanked South Korea for convening a debate on this timely topic, explaining that artificial intelligence is already having a "profound impact" on areas such as education, the world of work, communications, healthcare and more. This technology, he noted, "has the potential to help realise" the aspirations "that led to the creation of the United Nations eighty years ago", such as peace, security and the guarantee of freedom and human rights.
vaticannews.va/gnews.cz-jav