Meet the ‘killer robots’ of modern warfare, from AI-powered suicide drones to machine guns that choose their own targets
AS a soldier runs from the battlefield, he's spotted by a suicide drone loitering in the sky overhead.
The killing machine dives down on its target and explodes in a kamikaze attack – without anyone telling it to.
It might sound like the stuff of science fiction but killer robots, sometimes called "slaughterbots", are already a reality.
Last year, STM Kargu-2 drones hunted down targets in Libya in what might be the first instance of artificial intelligence killing on its own initiative.
The revelation came after years of experts warning about the dangers of letting machines decide who lives and dies in combat.
Now a UN conference on so-called Lethal Autonomous Weapons Systems (LAWS) is taking place in Geneva to discuss new international rules that would stop killer robots from making their own decisions.
The Russian delegate argued no new rules are needed, but others disagree, saying machines shouldn't be given the power to choose targets themselves.
“Humans must apply the rules of international humanitarian law in carrying out attacks, so weapons that function in this way complicate that,” Dr Neil Davison of the International Committee of the Red Cross told BBC Radio's Today programme.
“Our view is that an algorithm shouldn't decide who lives or dies.”
Smart sentry guns
Used by: South Korea, Israel
Deadliest feature: 1,000 rounds of machine gun fire per minute
Machine guns capable of identifying and killing their own targets have existed for at least a decade.
Samsung's arms division developed a sentry gun, the SGR-A1, which uses image recognition to identify humans and shoot at them.
The SGR-A1 is now deployed along the Korean Demilitarised Zone, and the Israeli Defence Forces have had similar sentry guns on Israel's border with the Gaza Strip since 2008.
Although both weapons systems are capable of operating on their own, the governments of Israel and South Korea say the guns are controlled by humans.
Suicide drones
Used by: At least 14 countries including the US, China, and Germany
Deadliest feature: Built-in explosive warheads
Suicide drones are like extremely sophisticated – and extremely scary – missiles.
Instead of flying straight to a target after launch, these so-called "loitering munitions" stalk the skies over a specific area.
While loitering, which could go on for hours, they scan the ground in search of a target.
Once a target is found, they attack by diving at it and exploding, which is why they're also sometimes called "kamikaze" drones.
Even the likes of ISIS use rudimentary suicide drones made by strapping explosives to remote-controlled quadcopters, but the most advanced loitering munitions are now capable of operating without a human controller.
While some can be set to request authorisation before attacking a target they've identified, many loitering munitions are capable of choosing to kill on their own.
It's not certain if robots were making decisions themselves when drones decimated the Armenian army during last year's conflict with Azerbaijan.
Over 40 per cent of Armenia's tanks and armoured vehicles were obliterated by suicide drones, which included the Israeli-made Harop.
The Harop, with a top speed of 259mph, can either be guided to targets by a human controller or home in on enemy radar signals on its own.
Azerbaijani officials praised the kamikaze robots as being “very effective” in the conflict, which ended in a matter of weeks as Azerbaijan gained large tracts of territory.
Drone 'swarms'
Used by: UK, Israel, US
Deadliest feature: Thousands of killing machines working in unison
When thousands of drones use AI to work together, they can pull off spectacularly complex formations without the need for thousands of pilots.
A swarm of 1,800 drones flying in unison was used to make a mind-blowing illuminated globe at the Olympic opening ceremony in Tokyo in July.
But in the same month, drone swarms were also thought to have been used in battle for the first time.
Israel is believed to have used the technology to hunt down Hamas fighters launching rockets in Gaza.
The idea of drone swarms is to have uncrewed weapons working together to make their attacks more efficient.
Armed forces in Russia, Britain and the US are also developing their own versions of the lethal tech.
Last month, Royal Marines undertook battle drills alongside drone swarms for the first time, with autonomous machines in the air, on the sea and underwater helping soldiers during simulated raids.
The robots were given tasks like resupplying ammunition to troops and bringing blood to medics – as well as finding and identifying targets.
Eventually, fighter jets and battleships might also form part of AI-connected swarms alongside other killer robots.
Russia's newly unveiled jet, dubbed Checkmate, already has “elements of artificial intelligence”, according to its designer.
He revealed Checkmate is “capable of operating in a network-centric combat system, that is, working as part of a group of manned and unmanned aircraft”.
'Awful vision of the future'
While some see developing AI-based weapons as a necessary step for a modern military, critics are deeply concerned by the technology.
AI is already used in lots of beneficial ways, from Tesla's development of self-driving cars to making breakthroughs in cancer treatment.
But it's also being used for sinister purposes, including twisted apps that digitally "strip" clothes off women.
Deploying AI in weapons systems so they can choose who lives and dies is arguably the most controversial use of all.
It’s a real challenge to human dignity and people's rights
Elizabeth Minor, Campaign to Stop Killer Robots
At the ongoing UN conference on killer robots, Russia's delegate argued autonomous weapons can apply an “appropriate level of selectivity and precision”, which allows them to remain compliant with existing international law.
But many other states including Brazil, Australia and Mexico have called for an outright ban on killer robots.
Even the Vatican waded into the debate at the UN conference, blasting LAWS for lacking “humanity and public conscience”.
“We're concerned that the systems of meaningful human control over who, where and when to kill are being eroded,” Elizabeth Minor of the Campaign to Stop Killer Robots told The Sun.
“It’s a real challenge to human dignity and people's rights – it’s an awful vision of the future where killing is done by machines without human control.”
Minor added: “We're not worried about the Terminator just yet – but we're creeping towards a dehumanised future where AI is allowed to make life or death decisions.”