Killer robots are ‘quickly moving toward reality’ and humanity only has a YEAR to ban them, expert warns


  • Robots doing the fighting would keep soldiers and officers out of harm’s way
  • But experts say the threats to humanity would outweigh any benefits
  • Risk of harm or erroneous targeting of civilians would increase
  • Formal process on lethal autonomous weapons systems should start in 2017

New technology could lead humans to relinquish control over decisions to use lethal force.

As artificial intelligence advances, the possibility that machines could independently select and fire on targets is fast approaching.

Fully autonomous weapons, also known as ‘killer robots,’ are quickly moving from the realm of science fiction toward reality.

As artificial intelligence advances, the possibility that machines could independently select and fire on targets is fast approaching. Fully autonomous weapons, also known as ‘killer robots,’ are quickly moving from the realm of science fiction (like the plot of Terminator) toward reality

KILLER BOTS ARE A ‘DANGER TO HUMANITY’

Researchers explain that machines would make life-and-death determinations outside of human control.

The risk of disproportionate harm or erroneous targeting of civilians would increase, and no person could be held responsible.

Robots also lack real emotions, particularly the compassion that informs many human decisions about the use of force.

Humans can apply their judgment, based on past experience and moral considerations, and make case-by-case determinations about proportionality.

It would be almost impossible, however, to replicate that judgment in fully autonomous weapons, and they could not be preprogrammed to handle all scenarios.

Allowing technology to outpace diplomacy would produce dire and unparalleled humanitarian consequences.

These weapons, which could operate on land, in the air or at sea, threaten to revolutionize armed conflict and law enforcement in alarming ways.

Proponents say these killer robots are necessary because modern combat moves so quickly, and because having robots do the fighting would keep soldiers and police officers out of harm’s way.

But the threats to humanity would outweigh any military or law enforcement benefits.

Removing humans from the targeting decision would create a dangerous world.

Machines would make life-and-death determinations outside of human control.

The risk of disproportionate harm or erroneous targeting of civilians would increase.

No person could be held responsible.

Given the moral, legal and accountability risks of fully autonomous weapons, preempting their development, production and use cannot wait.

The best way to handle this threat is an international, legally binding ban on weapons that lack meaningful human control.

At least 20 countries have expressed in U.N. meetings the belief that humans should dictate the selection and engagement of targets.

Many of them have echoed arguments laid out in a new report, of which I was the lead author.

The report was released in April by Human Rights Watch and the Harvard Law School International Human Rights Clinic, two organizations that have been campaigning for a ban on fully autonomous weapons.

Removing humans from the targeting decision would create a dangerous world. Machines would make life-and-death determinations outside of human control. But some nations are using unmanned vehicles, such as the Reaper drone (pictured), to carry out missions in combat zones

Retaining human control over weapons is a moral imperative.

Because they possess empathy, people can feel the emotional weight of harming another individual.

WILL ROBOTS GET AWAY WITH WAR CRIMES?

If a robot unlawfully kills someone in the heat of battle, who is liable for the death?

In a report earlier this year, Human Rights Watch highlighted the rather disturbing answer: no one.

The organisation says that something must be done about this lack of accountability – and it is calling for a ban on the development and use of ‘killer robots’.

Called ‘Mind the Gap: The Lack of Accountability for Killer Robots,’ the report details the obstacles to holding anyone accountable when robots kill without human control.

‘No accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party,’ said Bonnie Docherty, senior Arms Division researcher at the HRW and the report’s lead author.

Their respect for human dignity can – and should – serve as a check on killing.

Robots, by contrast, lack real emotions, including compassion.

In addition, inanimate machines could not truly understand the value of any human life they chose to take.

Allowing them to determine when to use force would undermine human dignity.

Human control also promotes compliance with international law, which is designed to protect civilians and soldiers alike.

For example, the laws of war prohibit disproportionate attacks in which expected civilian harm outweighs anticipated military advantage.

Humans can apply their judgment, based on past experience and moral considerations, and make case-by-case determinations about proportionality.

It would be almost impossible, however, to replicate that judgment in fully autonomous weapons, and they could not be preprogrammed to handle all scenarios.

As a result, these weapons would be unable to act as ‘reasonable commanders,’ the traditional legal standard for handling complex and unforeseeable situations.

In addition, the loss of human control would threaten a target’s right not to be arbitrarily deprived of life.

In addition, inanimate machines could not truly understand the value of any human life they chose to take. Allowing them to determine when to use force would undermine human dignity. These Foster-Miller TALON SWORDS units (pictured) are equipped with various weaponry, but such systems – if made autonomous – could revolt and engage incorrect targets, say experts

Upholding this fundamental human right is an obligation during law enforcement as well as military operations.

Judgment calls are required to assess the necessity of an attack, and humans are better positioned than machines to make them.

Keeping a human in the loop on decisions to use force further ensures that accountability for unlawful acts is possible.

REPORT CALLS FOR BAN ON KILLER ROBOTS

The report by Human Rights Watch and the Harvard Law School International Human Rights Clinic was released as the United Nations kicked off a week-long meeting on such weapons in Geneva. The report calls for humans to remain in control over all weapons systems at a time of rapid technological advances.

It says that requiring humans to remain in control of critical functions during combat, including the selection of targets, saves lives and ensures that fighters comply with international law.

‘Machines have long served as instruments of war, but historically humans have directed how they are used,’ said Bonnie Docherty, senior arms division researcher at Human Rights Watch, in a statement.

‘Now there is a real threat that humans would relinquish their control and delegate life-and-death decisions to machines.’

Some have argued in favor of robots on the battlefield, saying their use could save lives.

But last year, more than 1,000 technology and robotics experts — including scientist Stephen Hawking, Tesla Motors CEO Elon Musk and Apple co-founder Steve Wozniak — warned that such weapons could be developed within years, not decades.

In an open letter, they argued that if any major military power pushes ahead with development of autonomous weapons, ‘a global arms race is virtually inevitable, and the endpoint of this technological trajectory is obvious: autonomous weapons will become the Kalashnikovs of tomorrow.’

According to the London-based organization Campaign to Stop Killer Robots, the United States, China, Israel, South Korea, Russia, and Britain are moving toward systems that would give machines greater combat autonomy.

Under international criminal law, a human operator would in most cases escape liability for the harm caused by a weapon that acted independently.

Unless he or she intentionally used a fully autonomous weapon to commit a crime, it would be unfair and legally problematic to hold the operator responsible for the actions of a robot that the operator could neither prevent nor punish.

In addition, inanimate machines could not truly understand the value of any human life they chose to take. Allowing them to determine when to use force would undermine human dignity. Experts say an MQ-9 Reaper (pictured), armed with GBU-12 Paveway II laser-guided munitions, could revolt and shoot nonthreatening targets if made autonomous

There are additional obstacles to finding programmers and manufacturers of fully autonomous weapons liable under civil law, in which a victim files a lawsuit against an alleged wrongdoer.

The United States, for example, establishes immunity for most weapons manufacturers.

It also has high standards for proving a product was defective in a way that would make a manufacturer legally responsible.

In any case, victims from other countries would likely lack the access and money to sue a foreign entity. The gap in accountability would weaken deterrence of unlawful acts and leave victims unsatisfied that someone was punished for their suffering.

At a U.N. meeting in Geneva in April, 94 countries recommended beginning formal discussions about ‘lethal autonomous weapons systems.’

The talks would consider whether these systems should be restricted under the Convention on Conventional Weapons, a disarmament treaty that has regulated or banned several other types of weapons, including incendiary weapons and blinding lasers.

PENTAGON WON’T RULE OUT ROBOSOLDIERS THAT CAN KILL WITH NO HUMAN INPUT

In March, a top Pentagon official gave a tantalizing peek into several projects that not long ago were the stuff of science fiction, including missile-dodging satellites, self-flying F-16 fighters and robot naval fleets.

Though the Pentagon is not planning to build devices that can kill without human input, Deputy Secretary of Defense Robert Work hinted that could change if enemies with fewer qualms create such machines.

‘We might be going up against a competitor that is more willing to delegate authority to machines than we are, and as that competition unfolds we will have to make decisions on how we best can compete,’ he said.

Work, who helps lead Pentagon efforts to ensure the US military keeps its technological edge, described several initiatives, including one dubbed ‘Loyal Wingman’ that would see the Air Force convert an F-16 warplane into a semi-autonomous and unmanned fighter that flies alongside a manned F-35 jet.

‘It is going to happen,’ Work said of this and other unmanned systems.

Pentagon researchers also are developing small bombs that use cameras and sensors to improve their targeting capabilities.

Other projects include robot boats and a hyper-velocity gun — known as the electromagnetic rail-gun — that can blast a projectile out at an astonishing 4,500 miles (7,250 kilometers) per hour.

The nations that have joined the treaty will meet in December for a review conference to set their agenda for future work.

It is crucial that the members agree to start a formal process on lethal autonomous weapons systems in 2017.

Disarmament law provides precedent for requiring human control over weapons.

For example, the international community adopted the widely accepted treaties banning biological weapons, chemical weapons and landmines in large part because of humans’ inability to exercise adequate control over their effects. Countries should now prohibit fully autonomous weapons, which would pose an equal or greater humanitarian risk.

At the December review conference, countries that have joined the Convention on Conventional Weapons should take concrete steps toward that goal.

The military robots in Marvel’s Iron Man 2 (pictured) might not be so far from reality. And experts say it is crucial that the members agree to start a formal process on lethal autonomous weapons systems in 2017. Disarmament law provides precedent for requiring human control over weapons

They should initiate negotiations of a new international agreement to address fully autonomous weapons, moving beyond general expressions of concern to specific action.

They should set aside enough time in 2017 – at least several weeks – for substantive deliberations.

While the process of creating international law is notoriously slow, countries can move quickly to address the threats of fully autonomous weapons.

They should seize the opportunity presented by the review conference because the alternative is unacceptable: Allowing technology to outpace diplomacy would produce dire and unparalleled humanitarian consequences.
