New York City moves to establish algorithm-monitoring task force

New York City may soon gain a task force dedicated to monitoring the fairness of algorithms used by municipal agencies. Formed from experts in automated systems and representatives of groups affected by those systems, it would be responsible for closely examining algorithms in use by the city and making recommendations on how to improve accountability and avoid bias.

The bill, which doesn’t have a fancy name, has been approved by the city council and is on the Mayor’s desk for signing. The New York Civil Liberties Union, the ACLU’s New York affiliate, has argued in favor of it.

Say, for instance, that an “automated decision system” (as the law calls them) helps determine who’s eligible for bail. Biases inherent in the training data that produced the system could result in one group being unjustly favored for bail over another.
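To make that concrete, here is a minimal, hypothetical sketch in Python (using NumPy and scikit-learn) of how a skew in historical decisions resurfaces in a model trained on them. Everything here is invented for illustration: the data is synthetic and the thresholds are arbitrary.

```python
# Hypothetical illustration: a model trained on biased historical
# decisions reproduces that bias. All data and rates are synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# One legitimate risk score per person, plus a group label (0 or 1)
# that should be irrelevant to the outcome.
risk = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# Simulate biased history: bail was granted to lower-risk people,
# but group 1 was held to a stricter threshold.
granted = (risk < -0.5 * group).astype(int)

# Train on the biased history. The group label is included explicitly
# for clarity; in practice a proxy feature (zip code, arrest location)
# can leak the same signal even if the label itself is dropped.
X = np.column_stack([risk, group])
model = LogisticRegression().fit(X, granted)

# The trained model faithfully reproduces the historical skew.
pred = model.predict(X)
for g in (0, 1):
    rate = pred[group == g].mean()
    print(f"group {g}: predicted bail-grant rate = {rate:.2f}")
```

Note that simply deleting the group column wouldn’t fix the problem; proxy features like zip code or arrest location can carry the same signal, which is part of why auditing deployed systems is harder than it sounds.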

The task force would be required to produce a report laying out procedures for dealing with situations like the one above. Specifically, the report would make recommendations on the following:

  • How can people know whether they or their circumstances are being assessed algorithmically, and how should they be informed about that process?
  • Does a given system disproportionately impact certain groups, such as the elderly, immigrants, the disabled, minorities and so on? (One such check is sketched after this list.)
  • If so, what should be done on behalf of an affected group?
  • How does a given system function, both in terms of its technical details and in how the city applies it?
  • How should these systems and their training data be documented and archived?
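For the disproportionate-impact question above, here is a hedged sketch of what one such check might look like, modeled on the “four-fifths rule” heuristic from US employment-discrimination practice: compare each group’s favorable-outcome rate to the most favored group’s. The group names and records are invented, and the bill itself doesn’t prescribe any particular test.

```python
# Hypothetical disparate-impact screen using the "four-fifths rule".
from collections import Counter

# Invented decision log: (group, favorable_outcome) pairs.
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
    # ... a real audit would use thousands of logged decisions
]

totals = Counter(g for g, _ in decisions)
favorable = Counter(g for g, ok in decisions if ok)
rates = {g: favorable[g] / totals[g] for g in totals}
best = max(rates.values())

# Flag any group whose favorable-outcome rate falls below 80% of the
# most favored group's rate.
for g, r in sorted(rates.items()):
    ratio = r / best
    flag = "potential disparate impact" if ratio < 0.8 else "ok"
    print(f"{g}: rate={r:.2f}, ratio vs. most favored={ratio:.2f} -> {flag}")
```

A real audit would go further, controlling for legitimate factors before declaring a disparity unjust, but a rate comparison like this is a natural first screen.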

The task force would need to be formed within three months of the bill’s signing, and, importantly, it must include “persons with expertise in the areas of fairness, accountability and transparency relating to automated decision systems and persons affiliated with charitable corporations that represent persons in the city affected by agency automated decision systems.”

So this wouldn’t just be a bunch of machine learning experts and a couple of lawyers; social workers and human rights advocates would need to be at the table as well, something I’ve argued for in the past.

The report itself (which would be public) wouldn’t be due for 18 months, but this isn’t the kind of thing you want to rush. Assessing these systems is a data-intensive task, and creating parallel municipal systems to make sure people don’t fall through the cracks is civically important.