November 25, 2008

NYT: Can battlefield robots behave more ethically than human soldiers?

According to computer scientist Ronald C. Arkin, the answer to this question is yes. Arkin is currently designing software for battlefield robots under contract with the U.S. Army.

“My research hypothesis is that intelligent robots can behave more ethically in the battlefield than humans currently can,” he says.

Excerpt from the New York Times article:

In a report to the Army last year, Dr. Arkin described some of the potential benefits of autonomous fighting robots. For one thing, they can be designed without an instinct for self-preservation and, as a result, no tendency to lash out in fear. They can be built without anger or recklessness, Dr. Arkin wrote, and they can be made invulnerable to what he called “the psychological problem of ‘scenario fulfillment,’ ” which causes people to absorb new information more easily if it agrees with their pre-existing ideas.

His report drew on a 2006 survey by the surgeon general of the Army, which found that fewer than half of soldiers and marines serving in Iraq said that noncombatants should be treated with dignity and respect, and 17 percent said all civilians should be treated as insurgents. More than one-third said torture was acceptable under some conditions, and fewer than half said they would report a colleague for unethical battlefield behavior.

...

“It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield,” Dr. Arkin wrote in his report (PDF), “but I am convinced that they can perform more ethically than human soldiers are capable of.”

Dr. Arkin said he could imagine a number of ways in which autonomous robot agents might be deployed as “battlefield assistants” — in countersniper operations, clearing buildings of suspected terrorists or other dangerous assignments where there may not be time for a robotic device to relay sights or sounds to a human operator and wait for instructions.

Read the entire article, "A Soldier, Taking Orders From Its Ethical Judgment Center."

7 comments:

  1. Anonymous, 2:08 AM

    And they won't leak evidence of your immoral or unethical covert ops! Brilliant! Dead bots tell no tales.

  2. if we keep progressing along these lines, it won't be long before technology, in the name of war, gets ahead of us; if that happens, then nobody knows what will happen, except that the war machine will just get bigger and bigger, and increasingly dangerous

    we have to move beyond war and everything associated with it; all of the wonderful technology and possibilities that we think we are going to develop are only increasingly subject to existential risk as long as the war machine exists, and the mentality that supports it; we take war and its applications very seriously; why don't we take peace, the end of war, and an all-inclusive concern for the welfare of all people just as seriously?

    I am convinced that ethics and the battlefield is bullshit

  3. @dharmicmel

    I completely agree that 'ethics and the battlefield' is nonsense. The moment one side feels they've lost a tactical edge they'll reprogram their bots to start taking out civilians. War is impossible to regulate, and flimsy pieces of paper like the Geneva Convention don't hold much currency during times of extreme desperation.

  4. Even little battles are times of extreme desperation, if you're a participant.

  5. Incidentally, I also think the equivocation coming from top civilian leadership over tort... I mean, "enhanced interrogation" and other similar laws of war is the true reason for confusion among soldiers about what's ethically acceptable. If commanders and pundits are framing massacres as being bad because the namby-pamby media might tattle on you, it creates the impression that such strictures owe to mere squeamishness, not any real moral thought. Ordinary people are accustomed to receiving moral and ethical guidance, so if you have the Vice President joking about suspects taking a bath or whatever, then a lot of very young men out there are going to channel their frustrations at the trumped-up rules keeping them from accomplishing their mission. There's always a bias toward action in the military, of course, but our President and associated enablers made it much, much worse.

    And those would be the people setting ethical policy on robot warriors. And they can't object.

  6. nato: in a few short paragraphs, you have provided a poignant reality check

    also, having experienced Nam, I found that each and every day became a "blog post" for the overwhelming magnitude of the war machine's presence; even more, most of these things remain invisible and even forgotten

    battlefield robots will never know that, which means, among other things, that the very essence of war could be transformed into something almost unimaginable; this is a side of future possibilities that we really do not want

  7. Anonymous, 4:29 PM

    When robots "accidentally" kill their first civilians... who will stand trial for it?

