In 1942, Isaac Asimov came up with the Three Laws of Robotics:
- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In 2016, the Dallas police decided they didn’t really like the rules, so they sent a robot to execute a cop killer. Most people are okay with the decision, since there isn’t a big fan club for Micah Johnson (though the fact that there is any fan club at all is rather remarkable). But even if the use of an execution robot seems okay this time, what about next time?
Robot-maker Sean Bielat says he’s fine with the Dallas Police Department’s apparently unprecedented use of a police bomb-disposal robot to kill a gunman on Thursday. “A robot was used to keep people out of harm’s way in an extreme situation,” said Bielat, the CEO of Endeavor Robotics, a spinoff of iRobot’s military division. “That’s how robots are intended to be used.”
Putting aside the fact that Bielat stands to make a killing off robot sales, he’s right. Using a piece of hardware, a bunch of metal, wires and circuit boards, to save a human life is what robots are good for. That they also took a human life in the process, well, it couldn’t be helped. Sometimes, a guy needs killing, as the Texans like to say.
On Sunday, speaking to Face the Nation, Dallas Mayor Mike Rawlings blessed the operation. “The chief had two options and he went with this one. I supported him completely because it was the safest way to approach it,” he said.
The other option was to wait Micah Johnson out. He’ll get hungry. He’ll get tired. It would mean a ton of overtime. But for all the post hoc hand-wringing and pearl clutching, they had a situation to deal with, and the mayor made the call. His call? Kill him. Send in the robot and kill him.
But some ethicists are worried.
“My initial reaction was that we have just got onto the slippery slope,” said Heather Roff, a senior research fellow at Oxford and a research scientist at Arizona State University’s Global Security Initiative. “This is going to be very hard to put back and that the militarization of police capabilities means that they may now feel that it is reasonable to use robotics in this way to ensure compliance…If one doesn’t have to talk to a subject and can demand compliance, then this may mean more forceful or coercive demands are made.”
Hard to put the genie back in the bottle? More like impossible. It didn’t just work well. It worked great. It did exactly what it was supposed to do, executed Johnson while putting no cop at risk. The dawn of killer robots just happened, and we applauded.
Bielat believes that incidents like the one in Dallas, in which police used a Northrop Grumman Remotec Andros F5 to carry explosives close enough to a gunman to kill him, won’t become “a common occurrence,” in part because the Andros F5, like his company’s own celebrated PackBot, costs upwards of $100,000 apiece. But he also believes that military-grade robots are on the cusp of getting a lot cheaper and more capable, due to decreases in the cost of processing power, advances in 3D printing, and other factors.
At $100,000 apiece for military grade robots, they’re kinda pricey. But with the consumer market door suddenly flung wide open, that’s about to change.
“A lot of the components have come down in price because of consumer applications,” he said. “You now have more capability in the camera on a cellphone than you did on the…camera on the PackBot when it was first built.”
How long before every police department can afford not just a robot, but a killer robot, one designed not for bomb disposal but for human disposal? And even if they’re a tiny, itty-bitty department in the middle of nowhere, there’s a plan to cover them.
Advances like these — along with the 1033 program, which allows local and state law-enforcement agencies to request military technology — will make police robots useful, cheap, and ubiquitous, Bielat says. The program wouldn’t provide armed military robots to police, but police may decide to arm them, says Bielat.
Useful, cheap and ubiquitous? You bet. But the best part is that whenever there is a situation that presents the potential for a violation of the First Rule of Policing, there will no longer be a need to put heroes, fathers, mothers, the Finest, at risk. Just send in the robots.
Except the robots don’t talk to people. They don’t warn. They don’t de-escalate. They don’t adjust to changing circumstances. They don’t take prisoners. They just do what they’re programmed to do. Rather than risk 100 cops in front of 1,000 protesters, why not send your robot to herd them where you want them? If they don’t comply, it’s not like they can sass a robot. At least, not effectively.
Asimov crafted three very simple rules to prevent what he foresaw as the obvious problem with unfeeling, unthinking machines that could kill. Out of the box, Dallas decided the rules didn’t apply. While the nattering nabobs of negativity try to sort out their feelings on death by robot, Sean Bielat and others are busy retooling their military versions for a police department near you. And they’ll get that price way down so no police officer will ever have to stand in harm’s way again.