Police facial recognition systems are 98 per cent inaccurate, says research

The systems are a waste of money, argues Big Brother Watch

Most of the results generated by automated facial recognition systems used by the police are inaccurate and a waste of taxpayers' money, according to a new study.

After making 50 freedom of information requests, privacy rights campaigner Big Brother Watch found that 95 per cent of facial recognition matches identified innocent people.

"Police forces have stored photos of all people incorrectly matched by automated facial recognition systems, leading to the storage of biometric photos of thousands of innocent people," said the report.

The Metropolitan Police has the worst record for using this technology: less than two per cent of its results were accurate, meaning more than 98 per cent of matches were wrong.

And when the Met's facial recognition system did manage to identify two people, neither of them was a wanted criminal.

"One of those people matched was incorrectly on the watch list; the other was on a mental health-related watch list," it said.

Worryingly, 102 innocent members of the public were wrongly identified by the technology, although the force has yet to make an arrest using it.

South Wales Police is another law enforcement body that has been using facial recognition in day-to-day cases, but it only recorded 9 per cent accuracy - better than the Met, but not by much.

But the biggest differences with this police force are that it has made 15 arrests using facial recognition results, and that twice as many innocent people were significantly affected by its use.

Big Brother Watch's report claims that South Wales Police "staged interventions with 31 innocent members of the public incorrectly identified by the system who were then asked to prove their identity and thus their innocence".

Overall, South Wales Police has stored the images of 2,451 innocent people. Big Brother Watch believes that this could be unlawful.

"Despite this, South Wales Police has used automated facial recognition at 18 public places in the past 11 months - including at a peaceful demonstration outside an arms fair," explains the report.

Silkie Carlo, director of Big Brother Watch, said: "Real-time facial recognition is a dangerously authoritarian surveillance tool that could fundamentally change policing in the UK. Members of the public could be tracked, located and identified - or misidentified - everywhere they go.

"We're seeing ordinary people being asked to produce ID to prove their innocence as police are wrongly identifying thousands of innocent citizens as criminals.

"It is deeply disturbing and undemocratic that police are using a technology that is almost entirely inaccurate, that they have no legal power for, and that poses a major risk to our freedoms."