Report claims facial recognition is 95% inaccurate
Police forces defend technology’s deployment as ICO increases scrutiny and campaign group calls for immediate cessation of use
Law enforcement’s deployment of facial-recognition technology (FRT) is coming under increased scrutiny, with the Information Commissioner’s Office expressing strong concerns about its use and a privacy group publishing a report that claims the software makes incorrect matches 95% of the time.
In a blog post, information commissioner Elizabeth Denham has said she has “been deeply concerned about the absence of national level co-ordination in assessing the privacy risks and a comprehensive governance framework to oversee FRT deployment”. She added that conducting such an assessment has become increasingly urgent as facial recognition has been used in the policing of public events including London’s annual Notting Hill Carnival and the 2017 Champions League final in Cardiff.
“There may be significant public safety benefits from using FRT — to enable the police to apprehend offenders and prevent crimes from occurring,” Denham said.
She added: “But how facial-recognition technology is used in public spaces can be particularly intrusive. It’s a real step change in the way law-abiding people are monitored as they go about their daily lives. There is a lack of transparency about its use, and there is a real risk that the public-safety benefits derived from the use of FRT will not be gained if public trust is not addressed. A robust response to the many unanswered questions around FRT is vital to gain this trust.”
Denham claimed that, for FRT to be considered legal, the police need to demonstrate not only that it is effective in solving the intended problem, but also that no other, less intrusive, technology or method could achieve the same outcome.
- Police ethics body to look at use of facial-recognition technology
- Interview: Surveillance Camera Commissioner discusses his mission to protect privacy and human rights
- Home Office plots £5m project to equip police with facial-recognition software
Denham welcomed the recent establishment of a panel to oversee the use of facial recognition-enabled cameras, on which she will serve alongside the commissioners for biometrics and surveillance cameras. The National Police Chiefs’ Council’s (NPCC) recent appointment of a governance lead for the use of FRT in public is also good news, she said.
Nevertheless, the commissioner will continue to monitor the FRT space “as a priority area for my office”. Denham revealed that she recently wrote to the Home Office and the NPCC to lay out her concerns.
“Should my concerns not be addressed, I will consider what legal action is needed to ensure the right protections are in place for the public,” she said.
The commissioner’s comments come as privacy campaign group Big Brother Watch published a report calling for the immediate cessation of the use of FRT by all public authorities. The report also called for the deletion from the Police National Database of “thousands of images of unconvicted individuals”.
The campaign group also published the results of freedom of information requests which showed that the facial-recognition cameras used by the Metropolitan Police to date have made incorrect identifications in 98% of cases. The figure for South Wales Police is 91%.
Such incorrect identifications are known as ‘false positives’.
The report said that the Met has made no arrests resulting from matches made by facial-recognition software, while South Wales Police has made 15 – but has stopped 31 innocent members of the public and required them to prove their identity. The use of FRT in 18 public places in south Wales during the past year has also resulted in the force storing the images of 2,451 innocent people “in a policy that is likely to be unlawful”, according to Big Brother Watch.
The report said: “We are deeply concerned that the securitisation of public spaces using biometrically identifying facial recognition unacceptably subjects law-abiding citizens to hidden identity checks, eroding our fundamental rights to privacy and free expression.”
South Wales Police said that thumbnails of the 2,400-plus innocent people wrongly identified by FRT software are only retained by the force for 31 days – not 12 months, as claimed by Big Brother Watch.
In a statement initially made earlier this month and reissued in response to the criticisms contained in the report, the force also claimed that the number of arrests made is significantly higher than the figure cited by the privacy group.
“Since its introduction nine months ago, over 2,000 positive matches have been made using our ‘Identify’ facial recognition technology, with over 450 arrests. Successful convictions so far include six years in prison for robbery and 4.5 years imprisonment for burglary. The technology has also helped identify vulnerable people in times of crisis,” South Wales Police said.
The force added: “Of course, no facial recognition system is 100% accurate under all conditions, resulting in false positives. This is where the system incorrectly matches a person against a watch list. Technical issues are common to all face recognition systems, which means false positives will be an issue as the technology develops.”
“Facial-recognition technology is a real step change in the way law-abiding people are monitored as they go about their daily lives, and there is a lack of transparency about its use”
Elizabeth Denham, information commissioner
The Metropolitan Police – which has thus far deployed FRT at the past two Notting Hill Carnivals and the 2017 Remembrance Sunday service at the Cenotaph – disputed the characterisation of incorrect matches made by the software as ‘false positives’.
“We do not consider these as false-positive matches, because additional checks and balances are in place to confirm identification following system alerts,” it said. “All alerts against the watch list are deleted after 30 days. Faces in the video stream that do not generate an alert are deleted immediately.”
The force added that the current use of FRT is still only at the trial stage.
“All our deployments during the trial have been and will be overt, with information disseminated to the public, and will be subject to full evaluation at the conclusion of the trial, which is expected to be in around late 2018,” it said. “Whilst we are trialling this technology, we have engaged with the Mayor’s Office for Policing and Crime Ethics Panel, the Home Office Biometrics and Forensics Ethics Panel, the Surveillance Camera Commissioner, the Information Commissioner, the Biometrics Commissioner, and Big Brother Watch. Liberty were invited to observe its use at the carnival last year.”