Robots Must Study To Be Warriors, Claims US Navy

Have things really gotten so bad that I can't tell if a new US Navy report warning of the need for robots to have a "warrior code" is a Terminator-related ARG or not? Sadly, yes.

The report, written by the Navy's Office of Naval Research, is, according to the London Times, "the first serious work of its kind on military robot ethics" - and if that sentence alone doesn't ring alarm bells, I don't know what will - and places an emphasis on teaching military robots an ethical code to make sure they don't rise up against us:

"There is a common misconception that robots will do only what we have programmed them to do," Patrick Lin, the chief compiler of the report, said. "Unfortunately, such a belief is sorely outdated, harking back to a time when . . . programs could be written and understood by a single person." The reality, Dr Lin said, was that modern programs included millions of lines of code and were written by teams of programmers, none of whom knew the entire program: accordingly, no individual could accurately predict how the various portions of large programs would interact without extensive testing in the field – an option that may either be unavailable or deliberately sidestepped by the designers of fighting robots.

The solution, he suggests, is to mix rules-based programming with a period of "learning" the rights and wrongs of warfare.
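For what it's worth, that "rules plus learning" idea reads roughly like a learned system whose choices always get filtered through a set of hard-coded constraints. Here's a minimal, entirely hypothetical sketch of that shape - the rule names, thresholds, and scoring function are invented for illustration and aren't taken from the report:

```python
# Purely illustrative: a learned component ranks options, while hard-coded
# rules can only veto. Nothing here comes from the Navy report.
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    target_is_combatant: bool
    expected_civilian_risk: float  # 0.0 (none) to 1.0 (certain harm)

# The "rules-based" half: absolute constraints that override anything learned.
def violates_rules(action: Action) -> bool:
    if not action.target_is_combatant:
        return True
    if action.expected_civilian_risk > 0.1:
        return True
    return False

# Stand-in for the "learning" half: in practice this would be a trained model;
# here it's just a toy score that prefers lower-risk options.
def learned_preference(action: Action) -> float:
    return 1.0 - action.expected_civilian_risk

def choose_action(candidates: list[Action]) -> Action | None:
    # The learned component ranks options; the rules component can only veto.
    permitted = [a for a in candidates if not violates_rules(a)]
    if not permitted:
        return None  # no permissible option: stand down
    return max(permitted, key=learned_preference)

if __name__ == "__main__":
    options = [
        Action("engage_vehicle", target_is_combatant=True, expected_civilian_risk=0.05),
        Action("engage_building", target_is_combatant=True, expected_civilian_risk=0.4),
        Action("engage_crowd", target_is_combatant=False, expected_civilian_risk=0.9),
    ]
    print(choose_action(options))
```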

I don't know whether I'm more disturbed by life imitating this particular art, or by discovering that a military report openly admits no one is actually in control of its technology.

Military's killer robots must learn warrior code [Times Online]