When the company synonymous with sticks that electrocute people announced it would shift some of its emphasis away from non-lethal weapons toward police body cameras, it seemed, for a fleeting moment, to be showing an interest in police accountability. Analysis from The Intercept and a 2017 "Law Enforcement Technology Report" from Taser suggest that the reality might be more complicated, and considerably creepier.
The company now known as Axon created its body camera division a few years ago but ramped up its efforts in 2017. In February, it acquired two AI teams, Dextro and the computer vision division of Fossil Group, and it now appears intent on aiming its new machine learning brainpower at policing.
While the company has explicitly denied any interest in building a predictive policing engine, claiming that it "will not make predictions on behalf of our customers," the industry report makes plain reference to its desire to "automate the collection and analysis of virtually all information in public safety while extracting key insights never before possible." On a page devoted to AI and machine learning, the report lauds the superior insights that companies in other industries cull from massive data sets to predict customer behavior. It continues:
“We may not be quite at the Tom Cruise ‘Minority Report’ level of cognitive prediction, but patterns of individual behavior will become increasingly informative in revealing the probability that an individual will act in a particular fashion. And as our data sets become ever bigger, the analytical algorithms will become ever more sophisticated in revealing robust patterns. It is inevitable that predictive policing will expand. I don’t view this to be a bad thing and is consistent with TASER’s two principles: protect life; protect truth. Any technology platform that can advance these two laudable goals, while protecting the privacy and rights of innocent citizens should, and indeed must, be adopted.”
Considering Taser’s significant investments in machine intelligence, providing data to help police forces make life-or-death decisions certainly sounds within the company’s wheelhouse. Exactly how that will play out, or whether its newly founded ethics board will rein in that mission, remains to be seen.