They're known as "lethal autonomous weapons systems," or LAWS, although some people prefer the catchier term "killer robots." Either way, representatives from around the world recently gathered in Geneva to debate an important question: Should they be banned from the battlefield?
What are LAWS? No one seems to agree on a definition, but basically they are weapons that, once activated by a human operator, can acquire, select and engage targets on their own. The lack of a consensus definition has not stopped human rights nongovernmental organizations such as Human Rights Watch from demanding that they be banned.
In their view, LAWS are incapable of complying with international humanitarian law, the body of law that regulates the conduct of war. Specifically, the NGOs contend that LAWS cannot distinguish combatants from civilians. In their dystopian future, Cyberdyne Systems Skynet killer robots will roam the battlefield, indiscriminately gunning down innocent civilians. The truth is far less dramatic.
In the real world, the United States and its allies already field such weapons in a defensive capacity. For example, the Phalanx Close-In Weapon System is a rapid-fire, computer-controlled, radar-guided gun system deployed on U.S. warships to destroy incoming anti-ship missiles.
Israel has one, too. Its Harpy unmanned combat air vehicle is a "fire and forget" autonomous weapon designed to destroy enemy radar stations. The Harpy autonomously loiters over the battlefield, automatically searches for and detects mobile or static anti-aircraft missile radar systems, and attacks by colliding with them and detonating.
According to human rights NGOs, only human beings should make "life and death decisions" on the battlefield. But no robot ever built makes a "decision" about anything. Not really. A robot only executes the program installed in it, basing its actions on the algorithms designed by its software engineers. While LAWS may have the ability to find and destroy a target, it is the soldier who deploys the weapon, not the robot, who makes the life-or-death decision.
Let's hope the U.S. delegation makes these distinctions and doesn't buy into the NGO paradigm. So far, the U.S. has expressed no support for a ban on LAWS. Rather, the Department of Defense has issued a directive outlining U.S. policy on the development and possible use of LAWS, requiring that they "be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force."
The U.S. is a leader in the development of LAWS, and it should continue to be. That's the only way U.S. armed forces can retain a tactical and strategic advantage over their enemies in future conflicts. The U.S. delegation should block any effort to ban LAWS or regulate them out of existence.
Of course, if these NGOs don't get their way, they may well spin off and hold their own private ban conference in some foreign capital, as they have done before. These NGOs, along with such nations as Canada and Sweden, have in the past hosted meetings to draft and approve treaties outside the United Nations process. The results have included treaties banning anti-personnel landmines and cluster munitions. While the U.S. has signed neither treaty, the Obama administration recently acquiesced to the landmine ban.
Hopefully the administration will hold the line in Geneva when it comes to LAWS and not cave to the NGOs again. We should continue to develop LAWS in a responsible manner, and help ensure our security by keeping our armed forces at the leading edge of military technology.
- Steven Groves is the Bernard and Barbara Lomas Fellow in The Heritage Foundation's Thatcher Center for Freedom.
Originally distributed by the Tribune Content Agency