Autonomous Weapons: A Dangerous Trend
By Charles Kindleberger
PEP Board Member
This article was originally published in the 2016 edition of the Peace Economy News.
Secretary of Defense Ashton Carter, who holds a PhD in physics from Oxford, is a strong advocate for weapons that incorporate artificial intelligence. He champions a “third offset” strategy: as in the 1950s, and again in the 1970s and ’80s, the United States would offset adversaries with larger armies by building smarter, high-tech weapons.
Carter has begun to establish Defense Innovation Unit Experimental (DIUx) facilities, first in Silicon Valley and then in the Boston area, organizations that report directly to him. He boasts that the Department of Defense R&D budget is twice the size of the R&D budgets of Apple, Google, and Intel combined.
The LRASM
The problem with designing ever more sophisticated weapons is that some engineers are motivated to make them “autonomous.” Consider the Long Range Anti-Ship Missile (LRASM), initially supported by the Defense Advanced Research Projects Agency (DARPA) and now slated for manufacture by Lockheed Martin. The missile is designed to be fired at a target, presumably by a human, but then to fly to that target and attack it without further human intervention. But what if there was a mistake? What if the target turns out to be one of our own ships, or an adversary’s ship attempting to surrender?
Semi-autonomous weapons may incorporate a great deal of automation, but a human operator is always “left in the loop.” Not so with autonomous weapons. In April, a large number of countries at the United Nations agreed to examine Lethal Autonomous Weapon Systems (LAWS) and the extent to which they should be banned under the Convention on Certain Conventional Weapons or some other international regulation or treaty.
For more information, see “Autonomous Weapons and Operational Risk” by Paul Scharre, Center for a New American Security, February 2016, and “Killer Robots and the Concept of Meaningful Human Control” by Human Rights Watch and the Harvard Law School International Human Rights Clinic, April 2016. Finally, learn about the International Committee for Robot Arms Control.