Critics ramp up opposition as force announces controversial kit will go into live operational use
London’s Metropolitan Police Service is to “begin operational use” of live facial recognition (LFR) technology.
The force has previously conducted trials of the kit at various locations around the capital. It has now decided to use it more widely and systematically, and will deploy cameras at undisclosed "specific locations in London".
These deployments will take place “where intelligence suggests we are most likely to locate serious offenders”, it said, and will be aimed at combatting serious crimes such as violent and armed offences and child sexual exploitation.
A tailored "watch list" of wanted people will be supplied for each installation of the technology, with cameras watching over a "small, targeted area". Uses of LFR cameras will be "clearly signposted", the force said, and the technology will not be integrated with CCTV or number-plate recognition devices.
The Met said: “This is not a case of technology taking over from traditional policing; this is a system which simply gives police officers a ‘prompt’, suggesting ‘that person over there may be the person you’re looking for’ – it is always the decision of an officer whether or not to engage with someone.”
Trials of the technology over the last few years have met with consistent and strong opposition from campaign organisations and individuals.
Last year, human rights group Liberty supported Cardiff man Ed Bridges in challenging the legality of LFR. In September, the High Court ultimately ruled that the use of the technology was lawful – a decision which Bridges said he intended to appeal.
Liberty called today’s announcement “a dangerous and sinister step”, and urged people to sign a petition calling on home secretary Priti Patel to ban the use of facial recognition.
Big Brother Watch is another organisation to have consistently and vociferously campaigned against the technology. The group has previously published a report that found LFR systems to be 95% inaccurate in identifying matches.
Responding to today's news, director of Big Brother Watch Silkie Carlo said: "This decision represents an enormous expansion of the surveillance state and a serious threat to civil liberties in the UK."
She added: “This is a breathtaking assault on our rights and we will challenge it, including by urgently considering next steps in our ongoing legal claim against the Met and the home secretary. This move instantly stains the new government’s human-rights record, and we urge an immediate reconsideration.”
Regulators have also voiced concerns about the increased use of facial recognition.
The biometrics commissioner Paul Wiles has spoken of the need for “informed public debate” about how legislators should respond to the issue. Information commissioner Elizabeth Denham, meanwhile, has said that “police forces need to slow down” in their use of the technology, and allow for more consideration of legal and moral issues.
‘Tried and tested’
Even a report commissioned by the Met to examine its previous trials was largely damning.
The report, which was published by academics from the University of Essex, questioned the technology's lawfulness and criticised the force for failing to adequately consider issues of human rights.
It also found that, of the 42 matches made by the software that were considered in the report, only eight could be verified as accurate "with absolute confidence".
Although the Met chose not to exercise a pre-publication right of reply, once the report was released publicly, deputy assistant commissioner Duncan Ball said that the force was "extremely disappointed with the negative and unbalanced tone".
Today's news was described by assistant commissioner Nick Ephgrave as "an important development for the Met, and one which is vital in assisting us in bearing down on violence".
“We are using a tried-and-tested technology, and have taken a considered and transparent approach in order to arrive at this point,” he said. “Similar technology is already widely used across the UK, in the private sector. Ours has been trialled by our technology teams for use in an operational policing environment.”
Ephgrave added: “Every day, our police officers are briefed about suspects they should look out for; LFR improves the effectiveness of this tactic. Similarly, if it can help locate missing children or vulnerable adults swiftly, and keep them from harm and exploitation, then we have a duty to deploy the technology to do this.”