London man hit with £90 fine after covering face in front of police facial-recognition cameras

Written by Sam Trendall on 17 May 2019 in News

Pedestrian stopped and penalised for disorderly behaviour

Credit: Oliver Peters/Pixabay

Police fined an east London man £90 after he was stopped by officers for covering his face in front of facial-recognition cameras.

The incident, which took place during a trial of the technology run in February by the Metropolitan Police Service, was captured by reporters from the BBC’s Click technology news programme. The makers of the programme recorded the subsequent disagreement between the man and a group of officers. 

Privacy rights group Big Brother Watch was protesting at the scene, and the organisation’s director Silkie Carlo is seen remonstrating with one officer, whom she asks “what’s your suspicion?”.

“The fact that he’s walked past the clearly marked facial-recognition thing and covered his face gives us grounds to stop him and verify,” the officer responds.

The man is ultimately fined £90 for disorderly behaviour.


Carlo told the BBC: “There is nothing in UK law that has the words ‘facial recognition’. There is no legal basis for the police to be using facial recognition. There are no legal limitations on how they can use it, no policy, no regulation – this is a free-for-all. We don’t know who is on the watchlists, we don’t know how long the images are going to be stored for, and the police are making up the rules as they go along. Our ultimate fear is that we would have live facial-recognition capabilities on our gargantuan CCTV network – which is about six million cameras in the UK.”

But Ivan Balhatchet, the Met’s covert and intelligence lead, told the programme that the force’s senior management all believed that “not trialling such technology would be neglectful” on their part. 

“We ought to explore all technology, to see how it can keep people safer, how it can make policing more effective,” he added. “However, we are completely aware of some of the concerns that have been raised and what we are doing with these trials is actually trying to understand those better so we can actually protect human rights but keep people safe at the same time.”

He added that the force is “reviewing all capabilities” regarding how facial-recognition kit could and should be deployed in the future.

“Absolutely the technology is there for body-worn or smaller devices to be fitted with facial-recognition technology – [the same is true of] CCTV. So, absolutely we will look at that,” he said. “But, again, the right safeguards, and the right reviews and learning has to be put around that.”

The incident with the London pedestrian occurred during the last of 10 trial programmes run by the Met, which concluded earlier this year. The BBC noted that, on the same day the man was fined, three wanted criminals were arrested after being identified by the cameras.

Earlier this week, lawmakers in San Francisco voted to, effectively, outlaw the use of facial-recognition technology by government entities across the city.

 

About the author

Sam Trendall is editor of PublicTechnology

