Data protection regulator the Information Commissioner’s Office has formally censured a school for failing to adequately assess the privacy impact, or to obtain explicit consent, before rolling out new biometric technology.
The Information Commissioner’s Office has issued a formal reprimand to a high school in Essex after it introduced facial-recognition technology without conducting a proper assessment of the data-protection risks to students or obtaining explicit consent.
Chelmer Valley High School in Chelmsford – which has about 1,200 pupils aged from 11 to 18 – first deployed the tech 16 months ago, according to the ICO. It has been used in the school’s canteen to verify and process payments.
Facial recognition systems process sensitive biometric data and, before using the technology, organisations are required to undertake a data-protection impact assessment – an obligation which the regulator said that the school failed to meet.
“This meant no prior assessment was made of the risks to the children’s information,” the ICO said. “The school had not properly obtained clear permission to process the students’ biometric information and the students were not given the opportunity to decide whether they did or didn’t want it used in this way.”
The watchdog added that the institution “also failed to seek opinions from its data protection officer or consult with parents and students before implementing the technology”.
At the time that facial recognition was first deployed, parents were given a slip and asked to return it to the school if they wished to revoke consent for their child to take part in using the new tech.
The ICO said that “the law does not deem ‘opt out’ a valid form of consent and requires explicit permission” for the processing of biometric data. Moreover, most of the students affected “were old enough to provide their own consent, therefore, parental opt-out deprived students of the ability to exercise their rights and freedoms”, according to the watchdog.
Lynne Currie, ICO head of privacy innovation, said: “Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws. We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children.”
She added: “We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights. A DPIA is required by law – it’s not a tick-box exercise. It’s a vital tool that protects the rights of users, provides accountability and encourages organisations to think about data protection at the start of a project.”
As facial recognition and other biometric technologies have been rolled out more widely in recent years – including use cases from police, government and other public authorities – the ICO has taken a keener interest. In 2021, the former information commissioner Elizabeth Denham published a 67-page opinion on the issue, alongside a condensed public statement in which she spoke out against existing uses of live facial recognition, claiming that all investigations of deployment of the technology conducted by her office to date had found illegality.
PublicTechnology contacted Chelmer Valley High School requesting comment but had not received a response at the time of going to press.