Autonomous weapons employed in the war in Ukraine

According to several unconfirmed sources, recently developed robotic weapons (weapon systems controlled by artificial intelligence and coordinated with other weapons and command structures) are being used in the war in Ukraine.
Autonomous weapon systems are robots equipped with lethal weapons that can operate independently, selecting and attacking targets without a human decision, under the control of artificial intelligence. Around the world, armed forces are investing heavily in the research and development of autonomous weapons; the United States earmarked $18 billion for autonomous weapons between 2016 and 2020. Many other nations, including China and Turkey, are also producing autonomous weapons.
Founded in 2012, the Stop Killer Robots coalition aims to ban autonomous weapons. Their approach is: “Less autonomy. More humanity.” Their description of the problem posed by autonomous weapons is broad:

“Technology should be used to empower all people, not to reduce us – to stereotypes, labels, objects, or just a pattern of 1’s and 0’s. With growing digital dehumanisation, the Stop Killer Robots coalition works to ensure human control in the use of force. Our campaign calls for new international law on autonomy in weapons systems.”

As early as 2002, the roboticist Gianmarco Veruggio, honorary president of Scuola di Robotica and founder of Roboethics, identified the use of military robots as a very serious violation of human rights and called for a ban on granting a ‘licence to kill’ to autonomous weapons. During the First International Symposium on Roboethics in January 2004, Veruggio made an appeal:

“Military robots are already being used in combat, and billions of dollars are being spent by more than forty nations around the world on the development of increasingly deadly war machines. This is a phenomenon of enormous proportions, which is taking place without the knowledge of the general public and which is at most described as a normal, even beneficial, technological development, despite the fact that many experts warn of the problems implicit in military robots and of the fact that they may violate the Geneva Conventions and the laws of war in force. Above all these technical and legal issues, however, there is an inescapable ethical question of principle: whether it is humanly permissible to grant an autonomous non-human entity the licence to kill a human being. I believe that mankind should be informed and put in a position to make an informed decision on issues that so profoundly involve fundamental aspects of the survival of our species. It would be foolish not to have learnt anything from the experience of nuclear weapons or planetary environmental issues”.

Discussions under the UN Convention on Certain Conventional Weapons halted
Following several campaigns by Stop Killer Robots and the intervention of UN commissions on the use of robots in theatres of war, UN delegates met again in Geneva in late 2021 under the UN Convention on Certain Conventional Weapons to discuss the issues raised by autonomous weapons and to propose new binding treaties. The process of approving amendments or introducing new conventions requires all member states to reach consensus on each point. Several nations did not approve a ban on autonomous weapons, and discussions were suspended last week. The Washington Post of 11 March 2022 reports that the Russian delegate, citing procedural reasons, asked for the meeting to be dissolved, prompting complaints from Ukraine and many others. Obviously, the war in Ukraine entered the Convention debate and disrupted the discussions.

Autonomous weapons in Ukraine
On 12 March, images appeared on the messaging app Telegram from Ukraine showing the crumpled structure of an aircraft that several analysts have identified as a new Russian-made drone, the KUB-BLA produced by ZALA Aero, a Kalashnikov Group company (see image). According to the Dutch organisation PAX for Peace, which in turn cites the specialist magazine Jane’s International Defence Review, the drone can, thanks to AI, autonomously identify the coordinates of a target indicated through an uploaded image. It is not known whether the drone was actually deployed in Ukraine and, if so, whether it operated autonomously.
However, the KUB-BLA would not be the first AI-powered autonomous weapon to be deployed in combat. In 2020, during the conflict in Libya, the UN Panel of Experts established pursuant to Resolution 1973 (2011) reported that the Turkish-made Kargu-2 drone had targeted logistics convoys and retreating forces from a distance. The Turkish government has denied that the Kargu-2 was used autonomously.

At present, the news from Ukraine is not only fragmentary but often distorted by the large-scale disinformation campaigns carried out by all sides.
Even if the news is not confirmed, the terrible logic of the autonomous arms race teaches us that if one or more sides in a war possess a superior weapon, they will use it.
In this regard, Veruggio has from the outset warned against the illusion that the behaviour of autonomous killer robots could be controlled through appropriate programming, preventing uses contrary, for example, to the rules of engagement permitted by the Geneva Conventions. This is an illusion borrowed from science fiction, from Asimov’s famous Three Laws of Robotics. It fails first for purely technological reasons: it is impossible to develop AI systems evolved enough to guarantee adequate functioning in an unstructured and hostile environment, and impossible to predict the behaviour of robots endowed with learning capability and autonomy. It also fails because, in a life-or-death conflict, anyone on the brink of defeat would deactivate every limitation in an attempt to overturn the outcome of the clash.

As would be the case in a nuclear conflict.


Resources

Stop Killer Robots: https://www.stopkillerrobots.org/

Report of the 2018 session of the Group of Governmental Experts on Emerging Technologies in the Area of Lethal Autonomous Weapons Systems: https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/documents/final-report.pdf

Wired: https://www.wired.com/story/ai-drones-russia-ukraine/

Cover picture: a moment from one of the Stop Killer Robots coalition demonstrations

