Controversial crime-reporting app Vigilante banned from App Store

A controversial crime-reporting app called Vigilante has been kicked out of the App Store for encouraging, well, vigilantism, and for opening the door to violent confrontations and racial profiling. The app had only been live for a week in New York before getting the boot, after promising a tool that opened up the 911 system, bringing near real-time reports of criminal activity to users' screens.

The app didn’t just encourage you to avoid the problematic area – a feature that alone would have been questionable, given its similarity to the unfortunate series of apps launched over the years promising to help you avoid “sketchy” or “poor” neighborhoods. Apps like these can reinforce assumptions that race and income level contribute to crime, and they have been called out repeatedly for their racist and classist undertones.

Vigilante went further, actively encouraging users to get involved when they received a crime report. A quote in its launch blog post even suggested that “average and ordinary” citizens should approach the crime problem “as a group” and see what they can do.

As a result, the app could easily have become a tool for intimidating and harassing innocent people, simply because they happened to be in an area where a crime was reported and matched some profile in the user’s mind of what a criminal should look like.

We’ve already seen rampant racial profiling in the neighborhood app Nextdoor, which recently had to roll out a new system to clamp down on the problem by requiring users to enter physical descriptions – clothing, hair color, shoes and the like – if they choose to include a person’s race in their report. Meanwhile, the same sort of profiling takes place largely unchecked in closed Facebook neighborhood groups.

In a report from The Guardian, Sam Gregory, program director of Witness, an organization that trains and supports activists documenting human rights violations, cautioned about the app’s framing. “Vigilantism is a very different idea to being an ethical witness to what’s happening,” he said.

Apple doesn’t typically comment on app removals, but it’s fair to say the app was likely pulled under the clause in Apple’s App Store Review Guidelines that bans apps which encourage or risk physical harm. (Update: we’ve now confirmed this is the case.)

The developer of the app, a company called Sp0n, said it’s working with Apple to resolve the issue and still plans to ship a version of the app on Android.