ICO issues warning over ‘emotion analysis technologies’

Written by Sam Trendall on 31 October 2022 in News

Regulator claims new systems come with inherent risk of ‘systemic bias, inaccuracy and even discrimination’


The Information Commissioner’s Office has warned of the potential risks posed to citizens by the increasing use of “emotion analysis technologies”, which it says have yet to be developed in a way that satisfies data-protection requirements.

The regulator has issued a warning to businesses that have implemented or are considering use of these tools – which include the likes of “gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture” – that this is an “immature” technology area.

Before deploying such systems, businesses are advised to assess the public risks of doing so. Because the technology relies on “algorithms which are not sufficiently developed to detect emotional cues”, the regulator said, “there’s a risk of systemic bias, inaccuracy and even discrimination”.

“Organisations that do not act responsibly, posing risks to vulnerable people, or fail to meet ICO expectations, will be investigated,” the regulator said.

Examples of such technology that are already in use include systems aimed at “monitoring the physical health of workers by offering wearable screening tools,” according to the ICO. Other use cases involve the use of “visual and behavioural methods including body position, speech, eyes and head movements to register students for exams”.

The information on “subconscious behavioural or emotional responses” collected by emotion-analysis technologies constitutes personal data, the watchdog said. Use of this data is liable to be “far more risky than traditional biometric technologies that are used to verify or identify a person”, it added.


The ICO has published two new reports this week and, in spring 2023, is planning to release guidance on the use of a comprehensive range of biometric technologies – including more established concepts such as facial and voice recognition.

Industries where such systems are already widely used include financial services and air travel, while the ICO expects other sectors – including fitness and health, employment, and entertainment – to adopt them in the coming years. The regulator said that “behavioural analysis in early education is becoming a significant, if distant, concern”.

Deputy commissioner Stephen Bonner said: “Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination. The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”

He added: “The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”

 

About the author

Sam Trendall is editor of PublicTechnology. He can be reached on sam.trendall@dodsgroup.com.

