Imagine a soldier that doesn’t need to eat or sleep, doesn’t get angry and always follows orders.
It sounds like something out of a science-fiction novel, but technological advances mean the potential for creating a fully autonomous weapons system is very close.
Countries including the US, China, Israel, Russia, South Korea and the UK are known to be developing weapons systems with significant autonomy in how targets are selected and attacked.
Six years ago Human Rights Watch launched a campaign to secure a preemptive ban on the development, production and use of so-called killer robots.
The campaign is spearheaded by New Zealander Mary Wareham, who says she's disappointed in New Zealand's stance and lack of political leadership on the issue.
She says that at the moment these robots are being used for dull, dirty and dangerous tasks, and that while there is not yet full autonomy in the critical functions of weapons systems, it is on the horizon.
Wareham says the selection of a target and the use of force against it need to stay in the control of humans, not be delegated to machines.
“I think we’ll see advances in computer programming to the extent that it will become possible to programme a computer to go out there and select and identify and attack targets without any meaningful control.”
This is why Human Rights Watch is calling for regulation, she says.
“Most countries will say ‘we do not have lethal autonomous systems, we do not have plans to acquire or develop them’ but that doesn’t gel with what we see in terms of the money that is being sunk into all kinds of weapons platforms with autonomy in them; air-based systems, on the ground, stationary robotic systems, in the sea, under the sea. There are many different types of autonomy that are starting to come to the forefront but the fully autonomous weapons are not here yet.”
Israel is probably one of the most pro-robot nations, she says.
“There are autonomous systems that… are patrolling Israel’s various different borders. Some of them are weaponised; all of them, as we understand it, either have a human in or on the loop. There’s not a human out of the loop in terms of the decision-making of that weapon, but Israel is definitely one of the biggest investors in fully autonomous weapons technology.”
There are very small autonomous systems that are not yet weaponised but can be deployed in a swarm, an example of technology that may be weaponised in the future, she says.
“The United States military invited 60 Minutes to come out to the desert and watch them disperse a whole swarm of hundreds of… drones out of the back of an aircraft, and they floated to earth and started to move into formation as they came down. They were not weaponised, but if this is being developed by the military, this is the concern that we have.
“Today’s armed drones are just the Model T, the very first version that you can see.”
An international agreement could take a prohibition approach, or enshrine the principle of human control in international law, she says.
More than 80 countries have been debating the issue since the campaign was launched in 2013, she says.
“The international community needs to decide if it’s acceptable to permit machines to take human life on the battlefield or in policing or in border control - these are some of the scenarios in which fully autonomous weapons could be used.”