Damning report questions accuracy and lawfulness of Met Police facial recognition trials
Force expresses disappointment in ‘negative and unbalanced’ tone of report it commissioned
An independent report into the use of facial recognition technology by London’s Metropolitan Police Service (MPS) has claimed that the software is rarely accurate and could well be unlawful.
The Met commissioned academics from the University of Essex to observe and report upon six trial deployments of the technology that took place across the capital between June 2018 and February 2019. Having been granted “unprecedented access” to the people and processes involved, Professor Peter Fussey and Dr Daragh Murray have called for all such trials to be stopped until the issues they have identified are addressed.
Published this week, their report’s key findings include:
- It is “highly possible” that, if the use of the technology were challenged in court, it would be ruled unlawful on the basis of a failure to comply with human-rights law, coupled with the lack of “explicit legal authorisation” in UK law.
- The Met’s planning and conception of the tests were inadequate and overly focused on technical issues.
- The force did not produce “a sufficiently detailed impact assessment” that properly took into account the intrusiveness of live facial recognition (LFR) when compared with other surveillance technologies, such as CCTV.
- Trialling the technology in the context of a live operational environment gave rise to “a number of issues regarding consent, public legitimacy and trust”.
- There was a lack of clarity over how and why people were included on the ‘watchlist’ of citizens the cameras were equipped to look out for.
- Data on the watchlist was frequently outdated, meaning people whose cases had already been dealt with were stopped unnecessarily.
- Of the 42 matches made by the software, only eight could be verified as correct “with absolute confidence”.
Despite the many issues identified in the report, Fussey said he welcomed the Met’s “willingness to support” such research.
“It is appropriate that issues such as those relating to the use of LFR are subject to scrutiny, and the results of that scrutiny made public,” he added. “The report demonstrates a need to reform how certain issues regarding the trialling or incorporation of new technology and policing practices are approached, and underlines the need to effectively incorporate human rights considerations into all stages of the Metropolitan Police’s decision-making processes. It also highlights a need for meaningful leadership on these issues at a national level.”
Co-author Murray said that police chiefs had made insufficient effort to “identify human rights harms or to establish the necessity of LFR”.
“Ultimately, the impression is that human rights compliance was not built into the Metropolitan Police’s systems from the outset, and was not an integral part of the process,” he added.
Prior to publication, a draft of the report was submitted to the MPS, and the force was invited to flag up any factual errors and exercise a “right of reply” to any of its findings. It chose not to do so, but deputy assistant commissioner Duncan Ball has now issued a statement saying that the Met is “extremely disappointed with the negative and unbalanced tone of this report”.
“The MPS maintains we have a legal basis for this pilot period and have taken legal advice throughout,” he added. “We will again review this once we have the outcome of the South Wales judicial review [of the use of LFR]. This is new technology, and we’re testing it within a policing context. The Met’s approach has developed throughout the pilot period, and the deployments have been successful in identifying wanted offenders. We believe the public would absolutely expect us to try innovative methods of crime fighting in order to make London safer.”