The FBI is amassing a photo gallery of US citizens to feed a facial recognition database, and plans to have over 52 million images in it by next year – including those of people who have never committed a crime.
That's according to documents published by the Electronic Frontier Foundation (EFF), the digital civil liberties organisation, which reveal extensive details of the FBI's Next Generation Identification (NGI) programme. The database is expected to go live this summer, and the EFF warns that "the facial recognition component of this database poses real threats to privacy".
The warning comes in the wake of the NSA surveillance revelations, with documents leaked by former US government IT contractor Edward Snowden causing real concerns about privacy.
The NGI has been designed to include various types of biometric data including palm prints, iris scans and facial recognition data. That information will then be linked to details about the individual including their age, race and address in order to enable identification of people across 18,000 tribal, state and local law enforcement agencies across the US.
The EFF has raised concerns about how NGI will store photos of non-criminals, with over four million images of innocent people expected to be in the database by next year. The images will be obtained through job applications that require a background check and then stored in the database along with fingerprints, which are already collected.
As a result, the EFF warns that individuals could be implicated as criminal suspects simply because their images are in the database.
"This means that even if you have never been arrested for a crime, if your employer requires you to submit a photo as part of your background check, your face image could be searched – and you could be implicated as a criminal suspect – just by virtue of having that image in the non-criminal file," the organisation warned.
The EFF has also expressed concern that the FBI and Congress have failed to "enact meaningful restrictions" on what types of data can be submitted into the system, who can access it and how the data can be used, with no explicit policy preventing the system from taking images from social media.
The organisation warns that the system represents a massive breach of civil liberties and should not go forward.
"We know from researchers that the risk of false positives increases as the size of the dataset increases – and, at 52 million images, the FBI's face recognition is a very large dataset. This means that many people will be presented as suspects for crimes they didn't commit," said the EFF.
"This is not how our system of justice was designed and should not be a system that Americans tacitly consent to move towards," it concluded.