Stunning: Taser wants to turn the public and their smartphones into its big data posse of police informants

Posted on Sep 27, 2017 by Glyn Moody

Last month we wrote about the increasing application of big data analysis by police forces, often using software from the shadowy outfit Palantir. But it’s by no means the only company that sees this as a huge growth market: another is Axon. If that name is unfamiliar, that’s because it’s the new, rather anonymous branding for the very well-known Taser:

“We are Axon, a team committed to pushing the boundaries of technology to help you feel more confident in the field, at the station, and in court. From Smart Weapons, like our TASER devices, to police body cameras and digital evidence management systems, every product works together as a single network. Seamlessly integrated. Completely connected. And designed to help police, sheriffs, and law enforcement agencies everywhere make the world a safer place.”

Taser is just one division of Axon, and arguably not the most important one. That honor probably belongs to its body-worn camera products. Even there, it’s not the camera itself that matters. As Axon announces on its home page, it is offering a free one-year trial of its cameras, software and training to every police officer in the US. It’s a classic razor-and-blades model: give away the razor, and make money on the blades – in this case, a subscription to Axon’s core product, the site Evidence.com. Axon’s aim is to provide law enforcement with a one-stop cloud-based system for storing and managing all kinds of police data – body-worn video, in-car video, interview room video, CCTV, photographs, audio, and documents – along with features such as workflow and audit trails.

Those are all fairly conventional offerings. But an article in The Intercept earlier this year revealed how Axon is starting to apply AI technology to the terabytes of video evidence it now holds. Initially, Axon hopes AI will allow it to redact faces automatically in order to protect privacy, as well as to extract key information and to detect objects and even emotions. The idea is to free up police officers from paperwork so that they can spend their time combating crime more fruitfully. AI-based analytics will provide police forces with detailed information about crime trends, similar to Palantir’s products. However, Axon’s plans go much further:

“We’ve got all of this law enforcement information with these videos, which is one of the richest treasure troves you could imagine for machine learning,” Taser CEO Rick Smith told PoliceOne in an interview about the company’s AI acquisitions. “Imagine having one person in your agency who would watch every single one of your videos – and remember everything they saw – and then be able to process that and give you the insight into what crimes you could solve, what problems you could deal with. Now, that’s obviously a little further out, but based on what we’re seeing in the artificial intelligence space, that could be within five to seven years.”

As well as mapping past patterns, the aim is explicitly to predict future trends. In its 2017 Law Enforcement Technology Report, Axon writes that thanks to AI and Machine Learning, “agencies could also analyze their data to anticipate criminal activity and better allocate their resources.” As Privacy News Online has just discussed, the big problem with the application of AI techniques to big data in this way is that it is opaque: there is no way for the public or the authorities to scrutinize the methods used to obtain results. That’s bad enough in any domain, but when it comes to policing – and possibly the deployment of lethal force – that’s an extremely serious issue. Axon has other bold ideas. Another story in The Intercept, published last week, reveals the following:

“At Axon, we are developing a product that allows the public to submit photos and video of a crime, suspicious activity, or event. The collection of evidence will be done via a citizen’s smartphone or computer and submitted via a text or uploaded via a webpage to the appropriate law enforcement agency. This video will then be automatically added to our evidence management platform, Evidence.com, for use by the agency in solving a crime or gathering a fuller point of view from the public.”

The submission of tips from the public isn’t new, or in itself necessarily problematic. What is more troubling here is the fact that Axon wants to become the gatekeeper for such information, encouraging the public to send photos and videos to the company, which will then forward them to the police. That raises the question of whether all such public tips will be passed on automatically. For example, suppose a video showed police brutality – perhaps officers using one of Taser’s weapons to shock or even kill people, a common enough occurrence, as a major Reuters report revealed. Would Axon pass on that possibly incriminating evidence as a matter of course, or would it reserve the right to use its “discretion” in such cases?

That issue is relatively easy to solve – Axon could promise that all evidence will be passed on without exception. The second issue is much more problematic. It stems from the fact that the evidence will be available not only to the police, but also to Axon, which says it will add it to its Evidence.com database automatically. That means members of the public have no idea how their tip about an incident might be used. Videos contain all kinds of ancillary, unnoticed information that, on its own, is innocuous enough. But when combined with terabytes of other videos in Evidence.com’s holdings, and then analyzed using AI technology, that information could well end up implicating many people in completely unrelated cases, and might even harm the family and friends of those passing on the tips.

Since there is no way of knowing what Axon’s AI might discern, there is a double danger here. First, people may find themselves unexpectedly under police scrutiny; second, they won’t be able to challenge the reason why, since it is hidden away in the black box of the AI technology. It’s even conceivable that this fact might be used to implicate people falsely: multiple coordinated tips from a group of people, all containing artfully placed clues, could achieve that. Although those false clues might be invisible to human operators, they could be picked up by AI systems designed specifically to find such subtle connections. This might sound like science fiction, but technological progress in this field is so rapid that it perhaps needs to be viewed more as imminent science fact.

Featured image by Junglecat.