As autonomous weapons see increasing use, scientists and human rights advocates continue to call for stronger regulations governing these systems.
A soldier from Australia's 1st Armoured Regiment remotely operating a weapon system (Photo: Australian Defence Force).
Militaries around the world are moving toward AI-enabled autonomous weapons in armed conflict. Meanwhile, scientists and human rights advocates are urging stronger regulations to govern how these systems are used.
Last month, the Australian Defence Force unveiled a range of new weapon systems, including drones, robotic combat vehicles, unmanned tanks, and robotic mine-clearing dogs. All are semi-autonomous weapons that still require human involvement to operate.
Lieutenant Colonel Jakes Penley (left) and Lieutenant Colonel Adam Hepworth (right) at Puckapunyal Military Base, Australia (Photo: ABC Shepparton: Charmaine Manuel).
Lieutenant Colonel Adam Hepworth, Director of Automated and Robotic Systems, said the Australian military recognizes the potential of artificial intelligence (AI) across military operations, including administrative and combat functions, but stressed that human oversight of decision-making must be maintained.
He emphasized that all systems used by the Australian military must undergo verification processes in accordance with domestic obligations and international law.
Nevertheless, some leading scientists argue that even stronger regulatory frameworks are needed to control these military systems.
Professor Toby Walsh, an AI expert at the AI Institute, University of New South Wales, Australia, states that artificial intelligence is a double-edged sword (Photo: TU Berlin/Press/Christian Kielmann).
According to Professor Toby Walsh from the AI Institute at the University of New South Wales, the use of AI and automated tools is a double-edged sword.
On the positive side, some applications offer clear benefits, such as mine-clearing robots. “No one should have to risk their life or limbs clearing mines anymore,” he said. “This is a task perfectly suited to robots.”
Robotic dogs can be used for mine-clearing tasks (Photo: ABC Shepparton: Charmaine Manuel).
“At the same time, though, I believe there are cases where delegating decision-making to algorithms could take us to extremely dark and dangerous places,” Professor Walsh added. He noted that the software used in weapons is highly vulnerable to theft, copying, or manipulation.
Machines cannot understand the value of human life
Australian Human Rights Commissioner Lorraine Finlay said the use of autonomous weapons challenges the principles established by international human rights law. She noted that the oversight mechanisms of the Geneva Conventions have gaps: they were not designed for weapons that learn from each mission, and the technology is continuously evolving.
There are concerns about whether machines can genuinely consider ethical implications, as they do not comprehend the value of human life.
“It is not enough to simply say there is a human involved somewhere in the process of using these weapons; we need to clearly define what their role is, what authority they have, and whether the critical decisions are made by them or delegated to machines,” Ms. Finlay said.
Drones being demonstrated at Puckapunyal (Photo: Australian Defence Force).
No regulations currently govern lethal autonomous weapons specifically, and many scientists and human rights advocates argue that such rules are needed.
In November 2023, Australia voted in favor of a United Nations resolution urging the international community to address the challenges posed by autonomous weapon systems. “Now is the time to tackle these issues and ensure protective measures are put in place immediately,” Ms. Finlay said.