Government review calls for mandatory transparency in public sector use of algorithms

Written by Sam Trendall on 1 December 2020 in News

Laws must also be updated, according to Centre for Data Ethics and Innovation 


A government-led review has recommended the implementation of a “mandatory transparency obligation” for all public-sector entities using algorithms to make decisions that impact citizens.

The Centre for Data Ethics and Innovation (CDEI), which was set up by the government in 2018 to advise on ethical issues related to data use and artificial intelligence, this week published the findings of an 18-month review into bias in algorithmic decision-making.

The centre picked out three key recommendations, the first of which is that any use of algorithms by the public sector should be subject to openness requirements.

“Government should place a mandatory transparency obligation on all public sector organisations using algorithms that have an impact on significant decisions affecting individuals,” CDEI said.

The government should also update anti-discrimination legislation to account for how it might apply to the use of algorithms.

“Government should issue guidance that clarifies the application of the Equality Act to algorithmic decision-making,” the review said. “This should include guidance on the collection of data to measure bias, as well as the lawfulness of bias-mitigation techniques – some of which risk introducing positive discrimination, which is illegal under the Equality Act.”

The third recommendation made by the centre applies to entities across all industries.

“Organisations should be actively using data to identify and mitigate bias,” the CDEI said. “They should make sure that they understand the capabilities and limitations of algorithmic tools, and carefully consider how they will ensure fair treatment of individuals.”

The CDEI review focused on the use of algorithms in four sectors: financial services; local government; policing; and recruitment.

Research conducted in the course of the review found that six in 10 citizens are aware that algorithms are used by organisations in decision-making – but only three in 10 said they were aware of their use in local government.

There is widespread support for data – including information on ethnicity and sex – being used to tackle issues of bias, the research found.

According to the CDEI, “the review points to the need for an ecosystem of industry standards and professional services to help organisations address algorithmic bias in the UK and beyond.” 

“To catalyse this, the CDEI has initiated a programme of work on AI assurance, in which it will identify what is needed to develop a strong AI accountability ecosystem in the UK,” the centre added. “Other related CDEI work includes: working with the Government Digital Service to pilot an approach to algorithmic transparency; supporting a police force and a local authority to apply lessons learnt and develop practical governance structures; and active public engagement to build understanding of the values that citizens want reflected in new models of data governance.”

For its part, the government needs to play a role of “leadership and coordination”, and the report “urges the government to be clear on where responsibilities sit for tracking progress”.

Adrian Weller, board member for the Centre for Data Ethics and Innovation, said: “It is vital that we work hard now to get this right as adoption of algorithmic decision-making increases. Government, regulators and industry need to work together with interdisciplinary experts, stakeholders and the public to ensure that algorithms are used to promote fairness, not undermine it. The Centre for Data Ethics and Innovation has today set out a range of measures to help the UK to achieve this, with a focus on enhancing transparency and accountability in decision-making processes that have a significant impact on individuals. Not only does the report propose a roadmap to tackle the risks, but it highlights the opportunity that good use of data presents to address historical unfairness and avoid new biases in key areas of life.”


About the author

Sam Trendall is editor of PublicTechnology

