In the fall of 1989, the peoples of Eastern Europe rose up against their Communist oppressors. The tyrants ruling these nations had no moral compunction about shooting their subjects down, but fortunately, they couldn’t count on their armed forces to do it. So the Iron Curtain fell, and two years later, even the mighty Soviet Union was brought down when the Red Army, sent into Moscow, refused the orders of those attempting to brutally reinstate Stalinist rule.
But imagine what might have occurred had those soldiers been not human beings but robots, lacking in any sympathy or humanity, ready, willing, and able to reliably massacre anyone the authorities chose to be their targets.
This is the threat posed by the emerging technology known as “autonomous weapons.”
For decades, science fiction has speculated on the theme of robot servants rising up to overwhelm their human masters. Such scenarios remain fantasy, because they require self-reproducing machines with a will to power and the ability and desire to cooperate with each other to carry off a grand collective design — which at this point, anyway, is still quite far-fetched. Instead what we have seen are drone weapons, most typically aircraft, under human command, executing reconnaissance and strike operations by remote control. The military advantages offered by such systems are obvious. Drone fighters, for example, cost much less than piloted fighters, can pull 20 g’s without blacking out, are utterly fearless, and can be sent on one-way missions, if necessary, without human loss. So we are sure to see more of them, and analogous systems developed for land and sea fighting.
The problem, however, occurs with proposals to eliminate human operators and allow such systems to control themselves using “artificial intelligence.” Some have pointed out that this could allow units to malfunction or be hacked by the enemy and subject our own people to their weaponry. That is certainly conceivable. But I believe the real problem is that it would allow whole armies, obedient without the limiting constraint of human thought, to be commanded directly by tyrannical elites.
This danger is illustrated by a recent paper written by a committee of artificial-intelligence experts, which included both strong advocates for autonomous weaponry and some with more cautious attitudes. Reaching a compromise, the group proposed that:
States should consider adopting a 5-year, renewable moratorium on the development, deployment, transfer, and use of anti-personnel lethal autonomous weapon systems . . .
The moratorium would not apply to:
Anti-vehicle or anti-materiel weapons
Non-lethal anti-personnel weapons
Research on ways of improving autonomous weapon technology to reduce non-combatant harm in future anti-personnel lethal autonomous weapon systems
Weapons that find, track, and engage specific individuals whom a human has decided should be engaged within a limited predetermined period of time and geographic region.
One cannot help but note that most of the applications excluded from the moratorium are those directed against civilians rather than against opposing armed forces.
In her seminal book The Origins of Totalitarianism, Hannah Arendt identified a number of developments that contributed to the growth of totalitarianism in the 20th century. These included militarism, anti-Semitism, and imperialism, whose antecedent relationships to Nazism and Stalinism are fairly obvious. One of her proto-totalitarian forces, however, took many readers by surprise: she indicted bureaucracy.
Arendt was right. Bureaucracy is necessary for tyranny because it suppresses conscience. The bureaucrat is required not to think or feel. He or she is to be part of a machine.
The purpose of bureaucracy is to turn people into automatons. But even under the deepest indoctrination, humans make imperfect robots. As Czech dissident leader Václav Havel observed in his extraordinary book Living in Truth, conscience may be suppressed, but it cannot be eliminated. The inner voice remains.
Autonomous weapons have no such weaknesses.