Lethal armed robots which could target and kill humans autonomously should be banned before they are used in warfare, campaigners have said.
The potential future weapons would be able to select a victim on the battlefield without any human intervention, crossing moral and legal boundaries, Human Rights Watch warned.
The so-called 'killer robots' could also be adopted by autocrats to use in deadly attacks on their own people and would be incapable of exercising compassion, campaigners say.
Steve Goose, director of the arms division at Human Rights Watch, said: "Lethal armed robots that could target and kill without any human intervention should never be built.
"A human should always be 'in-the-loop' when decisions are made on the battlefield.
"Killer robots would cross moral and legal boundaries, and should be rejected as repugnant to the public conscience."
He spoke out as the organisation launched its global Stop Killer Robots campaign calling for a pre-emptive and comprehensive ban on fully autonomous weapons.
It suggests the prohibition could be achieved through an international treaty and national laws.
During the past decade, the use of unmanned armed vehicles or drones has dramatically changed warfare.
Now rapid advances in technology permit nations with high-tech military capabilities - including the UK, US, China, Israel and Russia - to move towards systems that would provide greater combat autonomy to machines, Human Rights Watch said.
But if one or more countries choose to deploy such weapons, others may feel compelled to abandon policies of restraint, leading to a robotic arms race, it argues.
Mr Goose said: "Many militaries are pursuing ever-greater autonomy for weaponry, but the line needs to be drawn now on fully autonomous weapons.
"These weapons would take technology a step too far, and a ban is needed urgently before investments, technological momentum, and new military doctrine make it impossible to stop."
Mr Goose added that the prospect of a killer robots arms race was "frighteningly likely".
Supporters of the campaign believe the machines would lead to more civilian "collateral damage", more conflicts being triggered or escalated, a loss of accountability in war, and an uncontrollable arms race.
The Terminator films envisage a horrifying future in which thinking military machines turn on their creators.
That much is sci-fi fantasy, but military technology is marching rapidly towards drone aircraft and other weapons that can operate with little or no human intervention.
The US X-47B, a fast jet drone designed to carry out missions from aircraft carriers, can fly by itself without the need for a remote human "pilot".
It has undergone sea trials and could replace US fighter jets in the Pacific by 2020, according to artificial intelligence expert Professor Noel Sharkey, from the University of Sheffield, who chairs the International Committee for Robot Arms Control (ICRAC).
Another high-tech drone due to undergo flight tests this year is the supersonic British Taranis from BAE Systems. Named after the Celtic god of thunder, it is designed to operate without human control for much of the time and to attack both other aircraft and ground targets.
Crucially, however, none of these weapons yet have the ability to engage an enemy on their own.
Speaking at the campaign's launch at the Frontline Club in London today, Prof Sharkey said he was "shocked" by the US military's lack of understanding about the limits of robotic technology.
"Using such weapons against an adaptive enemy in unanticipated circumstances and in an unstructured environment would be a grave military error," he said. "Computer-controlled devices can be hacked, jammed, spoofed or can be simply fooled and misdirected by humans.
"I'm sure robots in the next 20 years or so will be able to do some discrimination between maybe a tank and a school bus. But the problem is really the kind of discrimination that is needed on the battlefield to discriminate between a combatant soldier and a civilian, particularly in insurgent warfare.
"Human soldiers can use their own judgment. They can use their intuition, and robots can't do that at all. In a hundred years, who knows, but that's not the point. A lot of people will die in the meantime."
American Nobel laureate Jody Williams, awarded the peace prize in 1997 for helping to bring about an international ban on anti-personnel landmines, said war would just become too easy to wage with killer robots.
"It's already too easy to go to war," she said. "If war is reduced to weapons working with no human beings in control it is going to be civilians who are going to be, even more than now, bearing the brunt of warfare.
"It crosses a moral and ethical boundary that should never be crossed. We believe it would totally transform the face of warfare and create a new weapons race that we do not need."
The campaigners pointed out that there was already a precedent for action outlawing a weapon before it has even been built. A United Nations ban on the use of blinding laser weapons came into force in 1998.
A Ministry of Defence (MoD) spokesman said: "The MoD has no intention of developing any weapons systems that are used without human involvement.
"Although the Royal Navy does have defensive systems, such as Phalanx, which can be used in an automatic mode to protect personnel and ships from enemy threats like missiles, a human operator oversees the entire engagement.
"Furthermore, all of our Remotely Piloted Aircraft Systems used in Afghanistan to protect troops on the ground are controlled by highly-trained military pilots.
"There are no plans to replace skilled military personnel with fully autonomous systems."