Damning report questions accuracy and lawfulness of Met Police facial recognition trials

Written by Sam Trendall on 5 July 2019 in News

Force expresses disappointment in ‘negative and unbalanced’ tone of report it commissioned


An independent report into the use of facial recognition technology by London’s Metropolitan Police Service (MPS) has claimed that the software is rarely accurate and could well be unlawful.

The Met commissioned academics from the University of Essex to observe and report upon six trial deployments of the technology that took place across the capital between June 2018 and February 2019. Having been granted “unprecedented access” to the people and processes involved, Professor Peter Fussey and Dr Daragh Murray have called for all such trials to be stopped until the issues they have identified are addressed.

Published this week, their report’s key findings include:

  • It is “highly possible” that, if the use of the technology were challenged in court, it would be ruled unlawful on the basis of a failure to comply with human-rights law, coupled with the lack of “explicit legal authorisation” in UK law.
  • The Met’s planning and conception of the tests was inadequate, and overly focused on technical issues.
  • The force did not produce “a sufficiently detailed impact assessment” that properly took into account the intrusiveness of live facial recognition (LFR) when compared with other surveillance technologies, such as CCTV.
  • Trialling the technology in the context of a live operational environment gave rise to “a number of issues regarding consent, public legitimacy and trust”.
  • There was a lack of clarity over how and why people were included on the ‘watchlist’ of citizens the cameras were equipped to look out for.
  • Data on the watchlist was frequently outdated, meaning people whose cases had already been dealt with were stopped unnecessarily.
  • Of the 42 matches made by the software, only eight could be verified as correct “with absolute confidence”.

Despite the many issues identified in his report, Fussey said he welcomed the Met’s “willingness to support” such research.

“It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public,” he added. “The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached, and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision-making processes. It also highlights a need for meaningful leadership on these issues at a national level.”

Co-author Murray said that police chiefs had made insufficient effort to “identify human rights harms or to establish the necessity of LFR”.

“Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process,” he added.

Prior to publication, a draft of the report was submitted to the MPS, and the force was invited to flag up any factual errors and exercise a “right of reply” to any of its findings. It chose not to do so, but deputy assistant commissioner Duncan Ball has now issued a statement saying that the Met is “extremely disappointed with the negative and unbalanced tone of this report”.

“The MPS maintains we have a legal basis for this pilot period and have taken legal advice throughout,” he added. “We will again review this once we have the outcome of the South Wales judicial review [of the use of LFR]. This is new technology, and we’re testing it within a policing context. The Met’s approach has developed throughout the pilot period, and the deployments have been successful in identifying wanted offenders. We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.”


About the author

Sam Trendall is editor of PublicTechnology

