Australian Quakers – in collaboration with Safe Ground, Religions for Peace Australia, United Religions Initiative, Pax Christi, the Multifaith Association of South Australia and the Canberra Interfaith Forum – will host an event on 22 September 2022 (in person and online) presenting the Australian Interfaith Response to the use of autonomous weapons – otherwise known as Killer Robots.
What Are Killer Robots?
Killer robots are also known as lethal autonomous weapons systems (LAWS). These are weapons that would select and attack targets without meaningful human control. They would make decisions about taking lives, whilst lacking the critical human characteristics necessary for such a complex choice: wisdom, judgement, responsibility, empathy, moral conscience, and compassion.
Do they already exist? Armed drones do exist and are in use, but these still have a human operator controlling the weapons system – usually from a distance – who is responsible for selecting and identifying targets as well as pulling the trigger.
Are killer robots currently being developed? Systems do exist – and are under further development – that could be adapted to remove meaningful human control from the selection and attacking of targets. Examples include:
a) a stationary robot in operation along the border between North and South Korea that is armed with a machine gun and a grenade launcher, and can detect human beings using infrared sensors and pattern recognition software, with the possibility of firing at them; and
b) a 40-metre-long, 135-ton, self-navigating warship under development in the United States of America that is designed to hunt for enemy submarines and can operate without contact with a human operator for two to three months at a time. It is currently unarmed, but US representatives have said the goal is to arm such warships within a few years. Other examples can be drawn from technologies developed in France, the United Kingdom, Israel, Russia, and China that would need little adaptation to become fully autonomous.
Would killer robots be legal under international law? Because killer robots would operate without meaningful human control, they would face particular difficulty in complying with two fundamental rules of international humanitarian law: a) distinction and b) proportionality.
a) Warring parties must be able to distinguish between civilians and soldiers, and between civilian objects (such as homes or schools) and military targets. Killer robots would have difficulty in doing so.
b) The laws of war also require warring parties to weigh the proportionality of an attack. Will the expected harm to civilians and civilian objects be excessive in relation to the expected military advantage? Would a “reasonable military commander” have decided it was lawful to launch the attack? In judgements like these, and many more, killer robots could not replace human judgement.
Fully autonomous weapons would also violate three foundational elements of human rights law: the right to life, the principle of human dignity, and the requirement of accountability. Human rights law – which is based on principles of Christian ethics – applies during times of peace as well as armed conflict. It is important to note this because it is likely that fully autonomous weapons would be used beyond the battlefield in law enforcement situations.
Download Killer Robots — A Campaign Guide for Churches
Download a Flyer for this event
Program: Religion, Peace and the Moral Issues of Fully Autonomous Weapons
When: Thursday 22 September 2022 at 7pm AEST
Where: The Victorian Quaker Centre, West Melbourne / Online
RSVP: Online at Google Docs
Link: Those attending online will receive a confirmation email with the event details.