ICO issues warning over ‘emotion analysis technologies’

Regulator claims new systems come with inherent risk of ‘systemic bias, inaccuracy and even discrimination’

Credit: Gino Crescoli/Pixabay

The Information Commissioner’s Office has warned of the potential risks posed to citizens by the increasing use of “emotion analysis technologies”, which it said have so far failed to meet data-protection requirements.

The regulator has issued a warning to businesses that have implemented, or are considering the use of, these tools – which include the likes of “gaze tracking, sentiment analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture” – that this is an “immature” technology area.

Before deploying such systems, businesses are advised to “assess the public risks” of doing so. The ICO cautioned that these tools rely on algorithms “which are not sufficiently developed to detect emotional cues”, meaning “there’s a risk of systemic bias, inaccuracy and even discrimination”.

“Organisations that do not act responsibly, posing risks to vulnerable people, or fail to meet ICO expectations, will be investigated,” the regulator said.

Examples of such technology that are already in use include systems aimed at “monitoring the physical health of workers by offering wearable screening tools,” according to the ICO. Other use cases involve the use of “visual and behavioural methods including body position, speech, eyes and head movements to register students for exams”.

The information on “subconscious behavioural or emotional responses” collected by emotion-analysis technologies constitutes personal data, the watchdog said. Use of this data is liable to be “far more risky than traditional biometric technologies that are used to verify or identify a person”, it added.


The ICO has published two new reports this week and, in spring 2023, plans to release guidance on the use of a comprehensive range of biometric technologies – including more established concepts such as facial and voice recognition.

Industries where such systems are already widely used include financial services and air travel, while the ICO expects other sectors – including fitness and health, employment, and entertainment – to follow in the coming years. The regulator said that “behavioural analysis in early education is becoming a significant, if distant, concern”.

Deputy commissioner Stephen Bonner said: “Developments in the biometrics and emotion AI market are immature. They may not work yet, or indeed ever. While there are opportunities present, the risks are currently greater. At the ICO, we are concerned that incorrect analysis of data could result in assumptions and judgements about a person that are inaccurate and lead to discrimination. The only sustainable biometric deployments will be those that are fully functional, accountable and backed by science. As it stands, we are yet to see any emotion AI technology develop in a way that satisfies data protection requirements, and have more general questions about proportionality, fairness and transparency in this area.”

He added: “The ICO will continue to scrutinise the market, identifying stakeholders who are seeking to create or deploy these technologies, and explaining the importance of enhanced data privacy and compliance, whilst encouraging trust and confidence in how these systems work.”

 

Sam Trendall
