One of the first decisions Hillary Clinton will face as
president is whether to continue funding the development of automated weapons:
weapons that think on their own, selecting targets and firing their
payloads without human intervention once a mission has been programmed into
them. Kind of like the Terminator of movie fame, although defense
officials go out of their way to deny that analogy.
These weapons are as horrifying in their own right as germ,
chemical and nuclear weapons, and more prone to misuse or unintentional use. We
can anticipate that decision-making weapons will be as susceptible to bugs,
hacking and programming errors as other sophisticated computer-based
systems: bank databases, credit card networks, government
servers, cloud services and the Internet. A robot could turn on us, kill the wrong
target or mindlessly start slaughtering innocents.
There is also the moral
issue of agency. The very thing that makes automated weapons so attractive—we
can send them into battle instead of live soldiers—also underlies the essential
immorality of using robots to kill other humans. It’s so easy to kill an animated figure on a
screen in a video game. And then another, and then another, each of them so
realistic in their detail that they could almost be human. Pretty soon you’ve
knocked off hundreds of imaginary people. Not so easy, though, for most of us
to pull a trigger, knowing that a bullet will rip through the heart of someone
standing ten feet away and end their existence. Perhaps we instinctively
empathize with the victim and fear for our own lives. Or maybe most of us kill
with difficulty because the taboo against killing is so strongly instilled in
us: that moral sense that taking the life of another human being is wrong,
even sinful.
The problem with all advanced military technologies is that they turn
war into a video game, and by doing so distance the possessors of the
technology from their adversaries. Whether the attack is by conventional
bomber, missile, drone or the decision-making robot weapons now under development,
the technology turns the enemy into video images. Remote warfare dehumanizes
the enemy and makes it easier to kill lots of them without giving it a thought.
The bombardier doesn’t see the victims below, or if he can, they look like
specks. The operator of the drone is even farther away from his intended
victims. The operator of autonomous robots is more distant still.
Once we have robot weapons that are allowed to think and act
independently, the next logical step will be to provide them with nuclear
capabilities. I can only imagine the horrors that we will be able to inflict on
others combining these two apocalyptic weapons, but I’m guessing that a future
civilization from another planet will label the development of automated
weapons with nuclear capabilities as the beginning of Earth’s final extinction
event.
Moral and safety
considerations aside, there is also the issue of cost. Lots of pundits like to
deny it, but one of the primary reasons the United States economy thrived
during the 1990s was the peace dividend we received at the end of the Cold War.
Just like the money that the Obama Administration proposes to spend modernizing
our nuclear weapons, the funds to develop automated weapons could better be
used to fund public education, mass transit, alternative fuels, medical research
and other pressing needs.
As soon as one
proposes not developing a new weapon or military technology, apologists for the
military-industrial complex (a Republican president’s phraseology) always
invoke the fear that other countries will develop it first, and automated
weapons are no exception. The argument that we have to do it before others is
fallacious because there is another way: to negotiate a treaty banning all
development of these monstrous weapons of mass destruction. The central factor
in what I believe will be an easy international agreement to reach is the
asymmetry in resources that favors the United States. Only China could keep up
with us in spending if we decided to make a major “moonshot” push to develop
Terminator-like weaponry. But China faces tremendous environmental and
developmental problems. The Chinese also seem to prefer competing
economically and culturally, and would likely welcome a treaty banning automated
weapons.
In the course of a
little over one hundred years, humans have developed four apocalyptic weapons
of mass destruction: germ warfare, chemical warfare, nuclear warfare and now
automated weapons. Thankfully, we have had the will to outlaw two of these
terrible scourges. Let’s hope that Hillary makes it three out of four by vetoing
the further development of robotized weapons, and then starts working on ending
nuclear weapons.