Guest comment: Weaponizing robots is very dangerous

Here are the guiding principles that communities and policymakers should follow when discussing police robots.

by Jay Stanley

San Francisco was embroiled in controversy earlier this month over a proposal to allow police to deploy robots armed with deadly weapons. After initially giving the technology the go-ahead, the Board of Supervisors reversed course because of widespread public outrage. For the moment, killer robots are banned in San Francisco, but the controversy there has brought the issue into the national spotlight. People are increasingly aware that this technology exists and that some police departments would like to deploy it.

Our overarching position is that police should be prevented from using robots to commit acts of violence. Robots should not be used to kill, subdue, push, restrain, control, or harm people. Judging from the near-unanimous protests in San Francisco and other cities, the public agrees.

Arming robots is very dangerous. While anyone can come up with hypothetical scenarios in which police use of killer robots might seem logical, such "movie-plot scenarios" are just the small tip of a large pyramid of cases in which police use force, many of them completely gratuitous, and all of them taking place in a legal context that shields almost all officers from accountability. Since 2013, more than 98 percent of police shootings have not resulted in criminal charges against the officer. And in recent years, police have continued to kill roughly a thousand people a year, with Black people three times more likely to be killed by police than white people.

We have also long argued, with regard to both domestic drones and other robots, that tools allowing the remote, risk-free use of force make it inevitable that force will be overused, and increase the chances of it being used sloppily, with unintended targets being hit. Signals between the remote operator and the robot may also degrade because of communication and control problems, or even be hacked. Under any appropriate cost/benefit calculation, the negative effects of introducing armed robots into police departments will far outweigh any benefits.

The public is not alone in its opposition to robot violence. There is a general consensus in law enforcement that flying robots, also known as drones, must not be armed in domestic contexts. The New York Police Department's 2021 experiment with robot dogs, which did not involve any discussion of arming them, was nonetheless cut short by public opposition. And the nation's foremost maker of legged robots, Boston Dynamics, which manufactures the robotic dog "Spot," has commendably forbidden the arming of its robots and says it will revoke the license of any customer who arms them.

Since the NYPD controversy, however, some police departments have continued to use robotic dogs, and Customs and Border Protection (CBP) announced that it was experimenting with the technology on the border, albeit without weapons. Another robotic dog maker, Ghost Robotics, has already built a gun-wielding robot that it is marketing to the military (company slogan: "A fighter's best friend"). In 2016, Dallas police killed an armed man during a mass shooting using a wheeled robot rigged with explosives, in the nation's first and only use of lethal force by a robot. And now we have seen police in San Francisco push back against a ban on armed robots.

No matter how many protestations some police departments and manufacturers may make about their lack of interest in weaponizing robots, the possibility will continue to hang there like forbidden fruit unless it is made illegal.

Other rules for robots

However, it seems likely that local law enforcement agencies will find uses for robots even if weaponization and the use of force are prohibited. As the technology continues to improve, it will be deemed too useful not to spread widely through society, and law enforcement will not be an island of abstinence in a society where the technology is routinely deployed for various uses. In fact, the use of ground robots by police bomb squads is already widespread, as is the use of drones.

Police use of technology to interact with civilians, however, is unique, and a different beast from the use of robots in industrial, workplace, or domestic contexts, or even by police bomb squads. Police engage in power-charged interactions with civilians, and robots, no matter how they are used, are an extension of officers' power to act in the physical world.

Police robots of the future may present dangers we can't imagine right now, but some threats to civil liberties that go beyond weapons are already becoming apparent. Robots may enter private property and collect video and other data, creating privacy risks; this is already a big problem with flying robots, of course. Mission creep is also a concern. For example, the primary envisioned use case for robotic police dogs is to assist in hostage and barricade situations by scouting, delivering food, and the like. (Drones are also being marketed for this purpose.) But if the technology is sold to the public for this purpose, it could easily expand to all kinds of other uses. In Honolulu, police used their robot dog to scout homeless encampments.

Fear and dehumanization are also risks. Years of science fiction have primed people to find robot dogs scary, but these fears should not be dismissed as irrational or "just vibes." Some police departments have used military equipment to scare and intimidate, a tactic called a "show of force." Such tactics chill dissent and, when used routinely, undermine people's quality of life. A robot dog may, strictly speaking, be just a tool, but the fact that people find it scary is significant.

Given these potential problems, what should policymakers do when communities decide to approve ground police robots for various nonviolent uses? Our recommendation is that communities put the following principles into law:

No arming or use of force.

Robots (including drones) must not be equipped with weapons of any kind designed or intended for use against persons, or be permitted to use force of any kind against persons, including pushing or shoving them. Bomb-disposal robots of the type already widely deployed, which are not designed to interact with humans, would be allowed.

No entering private property without a warrant.

No ground robot should enter private property unless a police officer would have the legal right to be there, pursuant to a warrant or exigent circumstances. It is very likely that the courts will enforce this rule under the Fourth Amendment, but in the meantime it should be written into law. Communities should also require a warrant for robotic surveillance in any situation where a police officer would be required to obtain a warrant to conduct electronic surveillance.

No use without community permission.

We often see police deploy controversial high-tech devices without telling, let alone asking, the communities they serve, using funds from sources such as corrupting asset forfeiture programs or federal grants. Robots should not be deployed unless approved by a community's city council or other elected oversight body, as we recommend in our "Community Control Over Police Surveillance" work. Elected officials should also consider limiting permission to deploy robots to an enumerated list of uses. If that later puts some unanticipated use off-limits that people think makes sense, police can easily go back to the city council and ask for permission, and an open debate can ensue.


Communities cannot debate high-tech police gadgets they do not know about. San Francisco residents learned about the SFPD's robots only because of a recent state law requiring police departments to disclose the military technology and weapons in their arsenals. Communities that decide to allow police ground robots should follow the recommendations for public notice, auditing, and effectiveness tracking that we have already called for where police want to deploy drones.

Jay Stanley, senior policy analyst, American Civil Liberties Union’s Speech, Privacy, and Technology Project
