Robots on the rampage: How autonomous weapons will reconfigure the ethics of warfare

By Vanessa Cartwright on March 30, 2015 | Permalink

[Image: robot stare]

The robot armies, artificial intelligence networks and cyborg assassins of science fiction continue to haunt the human imagination. But could they be the future of warfare? To some extent, the onslaught of robots appears inevitable within major militaries. The U.S. Army could replace a quarter of its combat troops with robots by 2030. Semi-autonomous and fully autonomous “warbots” capable of intelligent battlefield behavior could save soldiers’ lives and speed up decision-making in war. But they could also bring a host of dangerous problems.

The U.S. Department of Defense defines an autonomous weapon as one that, “once activated, can select and engage targets without further intervention by a human operator”. The guidelines built around this definition in U.S. Directive 3000.09 have been criticized as imprecise, leaving loopholes through which robots could select and strike targets on their own. The issue is proving divisive in the international community.
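The definition turns on where the human sits in the targeting loop. As a rough illustration, the Directive’s distinction between semi-autonomous, human-supervised autonomous, and fully autonomous systems reduces to a simple test. The sketch below is a toy rendering of that test, not anything from the Directive itself; the function and parameter names are hypothetical:

```python
from enum import Enum

class ControlMode(Enum):
    """Levels of human control distinguished in U.S. Directive 3000.09."""
    SEMI_AUTONOMOUS = "human operator selects every target (in the loop)"
    SUPERVISED_AUTONOMOUS = "human can monitor and abort (on the loop)"
    AUTONOMOUS = "selects and engages targets without further human input"

def is_autonomous_weapon(selects_own_targets: bool,
                         operator_must_approve_each_engagement: bool) -> bool:
    """Apply the DoD test: once activated, can the system select and
    engage targets without further intervention by a human operator?"""
    return selects_own_targets and not operator_must_approve_each_engagement
```

The criticized imprecision maps onto the second parameter: a system that a human nominally supervises but cannot realistically override in time would pass such a test on paper while behaving autonomously in practice.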

Will warbots endanger or save civilian lives?

Pilotless military drones have already been blamed for excessive civilian casualties. Since 2004, more than 400 strikes in Pakistan have killed up to 960 civilians and injured up to 1,720 people, the Bureau of Investigative Journalism reports as of March 30, 2015. Clearly, new ethical principles and laws are needed to govern the spread of poorly regulated lethal autonomous weapons systems, which go by the ironic acronym ‘LAWS’.

A major advocate for regulating these systems is the International Committee for Robot Arms Control (ICRAC). Together with other non-governmental organizations, including Human Rights Watch, ICRAC has formed the Campaign to Stop Killer Robots, which aims to secure an international ban on LAWS.

However, a ban on LAWS might not necessarily reduce future civilian casualties, suggest American law professors Kenneth Anderson and Matthew Waxman. They believe that automation could “make the use of force more precise and less harmful for civilians caught near it”. For example, robots lack the instinct for self-preservation, so they can take greater risks than soldiers in approaching a site, confirming the absence of civilians, and verifying a target. On this view, it is only a matter of time before robotic systems can distinguish combatants from civilians. And if autonomous vehicles can reduce road accidents, roughly 90% of which are attributed to human error, well-designed autonomous warbots could deliver similar benefits.

[Image: robot in desert]

Do warbots compromise moral accountability?

Proponents of a ban on LAWS believe it is morally wrong to delegate life-and-death decisions to machines. Soldiers can consult their consciences and moral precepts, whereas robots lack moral instincts and any nuanced understanding of context. This raises a pressing moral question: who is to blame for a robot’s actions? The robot, the operator, the programmer, the manufacturer, or even the sponsor? As researchers Merel Noorman and Deborah Johnson conclude in a journal article for Ethics and Information Technology, “Robots for which no human actor can be held responsible are poorly designed sociotechnical systems.”

Ultimately, however, robots don’t really kill people; people kill each other. Designing lethal robots is simply one means, however far removed, of doing so. And, to an extent, each side in a conflict can be held collectively responsible for the casualties it inflicts on the other. It remains to be seen how the law will judge robot-related accountability.

Can warbots outdo humans in decision-making?

According to the U.S. Air Force report Technology Horizons, “by 2030 machine capabilities will have increased to the point that humans will have become the weakest component in a wide array of systems and processes.” In many situations, robots will react faster and more accurately than humans. Moreover, LAWS are unaffected by the tumultuous emotions of battle, such as fear, panic, hatred, and the thirst for vengeance, which all too often lead to atrocities. Warbots instead offer the potential to impartially monitor and report on the ethical behavior of all parties in a war zone.

But will warbots ever achieve ethical decision-making? Dr. Ron Arkin of the Georgia Institute of Technology believes that a set of rules approximating an artificial conscience could function as an “ethical governor” for robots. Such rules could eventually be programmed into machines to help them comply with international humanitarian law.
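Arkin’s published design places the governor as a veto layer between a robot’s targeting system and its weapons. The sketch below is a toy illustration of that idea, not Arkin’s implementation; the data fields, thresholds, and function names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class ProposedStrike:
    """Hypothetical summary of an engagement a warbot proposes to carry out."""
    target_confirmed_combatant: bool  # principle of distinction
    expected_civilian_harm: float     # principle of proportionality
    expected_military_advantage: float

def ethical_governor(strike: ProposedStrike) -> bool:
    """Permit an engagement only if it passes the encoded constraints.

    The governor is purely restrictive: it can veto a proposed lethal
    action, but it can never originate one.
    """
    # Distinction: never engage a target not confirmed as a combatant.
    if not strike.target_confirmed_combatant:
        return False
    # Proportionality: expected civilian harm must not outweigh the
    # expected military advantage (this scalar comparison is a toy
    # stand-in for a genuinely hard legal and moral judgment).
    if strike.expected_civilian_harm > strike.expected_military_advantage:
        return False
    return True
```

The central design choice, in Arkin’s own framing, is that the governor only suppresses actions; whether distinction and proportionality can ever be reduced to machine-checkable rules like these is precisely what the ban’s proponents dispute.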

[Image: robot clash]

Controlling the robotic arms race

States must create binding international agreements with a threefold mission: to clearly define the proper uses of autonomous weapons, to verify that other states are using these weapons appropriately, and to stipulate consequences for violators. Otherwise, as the Campaign to Stop Killer Robots warns, the increasing adoption of LAWS by nations including Israel, Russia, China, the U.K. and the U.S. may compel other nations “to abandon policies of restraint, leading to a robotic arms race”.

Because they feel neither fatigue nor remorse, robots that replace human soldiers on a wide scale could make “permanent” armed conflict more likely, a United Nations report cautions. Human opponents could “feel like they are always being watched, that they face a nonhuman foe that is relentless,” states American defense intellectual Eliot Cohen.

However, the benefits of robots in everyday household and manufacturing applications show that robots are not innately detrimental to society. It is war itself that is innately destructive, and its root causes run deeper than the possession of advanced weapons. As the French philosopher of science Michel Serres puts it, any object can become the best or the worst of things; it depends on how we use it.

Image sources: Contando Estrelas, U.S. Army Corps of Engineers, Charles Haynes.
