Met Police criticised over facial recognition technology that doesn't work

Four out of five suspects identified by Met Police facial recognition were innocent

The Metropolitan Police is facing criticism over its use of facial recognition technology that has proved to be wildly inaccurate.

According to a report this week, four out of five people identified as possible suspects by the facial recognition system were found to be innocent.

The claims lend weight to criticisms of the Met Police's live facial recognition (LFR), which has long been slammed as "dangerously authoritarian". However, the latest figure is an improvement on the 98 per cent failure rate reported from such trials last year.

The latest report, put together by academics at the University of Essex, found that the technology made only eight correct matches out of 42 across the six trials evaluated, an error rate of 81 per cent.
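As a rough illustration of how that figure follows from the numbers above (assuming the error rate is simply the share of the 42 matches that were incorrect), the arithmetic is:

```python
# Illustrative arithmetic only, using the figures reported in this article:
# 8 correct matches out of 42 across six trials.
total_matches = 42
correct_matches = 8
incorrect_matches = total_matches - correct_matches  # 34

error_rate = incorrect_matches / total_matches
print(f"{error_rate:.0%}")  # 81%, i.e. roughly four out of five matches wrong
```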


The co-authors, Professor Peter Fussey and Dr Daragh Murray, also warned of "significant shortcomings" in the Met Police's process of gaining meaningful consent, and added that watchlists used by police were sometimes out of date and included people considered "at risk or vulnerable" rather than just suspected criminals.

The report concludes that it is "highly possible" that the Met Police's use of the system would be found unlawful if challenged in court, and calls for all trials of live facial recognition to be suspended until these concerns are addressed.

This was echoed by David Davis MP, former shadow home secretary, who said that the research by the University of Essex's Human Rights Centre showed that live facial recognition "could lead to miscarriages of justice and wrongful arrests" and poses "massive issues for democracy".

Davis continued: "All experiments like this should now be suspended until we have a proper chance to debate this and establish some laws and regulations… Remember what these rights are: freedom of association and freedom to protest; rights that we have assumed for centuries, which shouldn't be intruded upon without a good reason."


Commenting on the findings, Duncan Ball, the Met Police's deputy assistant commissioner, said: "We are extremely disappointed with the negative and unbalanced tone of this report... We have a legal basis for this pilot period and have taken legal advice throughout.

"We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer."

Big Brother Watch, which has filed its own legal challenge against the Metropolitan Police's live facial recognition technology, claims that it breaches individuals' rights under the Human Rights Act. It described the report as "absolutely definitive".

The group's director Silkie Carlo added: "I think there is really no recovery from this point… The only question now is when is the Met finally going to commit to stop using facial recognition."

The release of this report comes just weeks after ex-Lib Dem councillor Ed Bridges, supported by campaign group Liberty, launched the first legal challenge against "intrusive" surveillance technology.

He believes his image was captured by South Wales Police while he was shopping in Cardiff, and later at a peaceful protest against the arms trade, and will argue that the use of the technology on him was an unlawful violation of privacy.

Last week, the Biometrics Commissioner criticised the police's "chaotic use" of facial recognition technology in his annual report.