IBM trained its object recognition software to ID people by skin tone using NYPD data

Campaigners say that the software could easily be used for large-scale racial profiling

The September 11th attacks in 2001 prompted a huge increase in security across the world, nowhere more so than in New York City itself. The New York City Police Department (NYPD) massively expanded its surveillance capabilities in the 10 years following the attacks - and a new report alleges that IBM used footage from the department's cameras to train its video surveillance software to tag people by the colour of their skin.

According to The Intercept, law enforcement demand for IBM's video analytics software rose sharply after 9/11. Because the software could label people using object recognition, it could be integrated with public camera feeds to search for terrorism suspects. It could also flag unattended packages, trespassers and more.

Automating this process enables users to quickly search through hours of video for images of individuals matching a particular description.

The Intercept has seen confidential documents revealing that IBM began developing the software using access to the NYPD's camera network.

The Police Department said in an email, ‘Video, from time to time, was provided to IBM to ensure that the product they were developing would work in the crowded urban NYC environment and help us protect the City. There is nothing in the NYPD's agreement with IBM that prohibits sharing data with IBM for system development purposes.

‘Further, all vendors who enter into contractual agreements with the NYPD have the absolute requirement to keep all data furnished by the NYPD confidential during the term of the agreement, after the completion of the agreement, and in the event that the agreement is terminated.'

The NYPD also confirmed that some counterterrorism officials were able to access a pre-release version of the software, which included skin tone identification, in the summer of 2012 - although spokesman Peter Donald said that officers were instructed not to use that feature in their evaluations.

The Department never followed through on deploying the analytics software, and phased out its partnership with IBM in 2016. Donald said that he was unaware of any cases where the IBM technology was used in an arrest or prosecution.

IBM developed version 2.0 of its tool in late 2016 or early 2017 and offered it to the NYPD; this version could track people by ethnicity and apply tags such as ‘Asian', ‘White' or ‘Black'. According to Donald, the Department "explicitly rejected that product" because of this feature.

Other organisations have not felt the same compunction. The campus police at California State University, Northridge, said that they have been using the IBM tool - including its profiling features - to track criminals and even student protesters.

Unlike facial recognition technology, object recognition has been largely ignored by privacy campaigners. However, civil liberties advocates now say that they are alarmed by the potential for large-scale racial profiling.

Rachel Levinson-Waldman, senior counsel in the Brennan Center's Liberty and National Security Program, argued that the technology could easily be misused:

"Whether or not the perpetrator is Muslim, the presumption is often that he or she is. It's easy to imagine law enforcement jumping to a conclusion about the ethnic and religious identity of a suspect, hastily going to the database of stored videos and combing through it for anyone who meets that physical description, and then calling people in for questioning on that basis."

Jerome Greco, a digital forensics staff attorney at the Legal Aid Society, said that object identification systems could unfairly point to someone as a suspect, just because they match general physical characteristics:

"I imagine a scenario where a vague description, like young black male in a hoodie, is fed into the system, and the software's undisclosed algorithm identifies a person in a video walking a few blocks away from the scene of an incident. The police find an excuse to stop him, and, after the stop, an officer says the individual matches a description from the earlier incident."

Rick Kjeldsen, a former IBM researcher who worked on the programme from 2009 to 2013, said that the way the NYPD helped IBM sets a dangerous precedent:

"Are there certain activities that are nobody's business no matter what? Are there certain places on the boundaries of public spaces that have an expectation of privacy? And then, how do we build tools to enforce that? That's where we need the conversation. That's exactly why knowledge of this should become more widely available — so that we can figure that out."