New chair of corruption watchdog looks to focus on AI


Former senior Army officer Doug Chalmers has been appointed to lead the Committee on Standards in Public Life and intends to continue the body’s work exploring the impact of automation

Members of parliament’s Public Administration and Constitutional Affairs Committee have approved the government’s selection of former British Army Lieutenant General Doug Chalmers as the next chair of ethics watchdog the Committee on Standards in Public Life.

Chalmers, who is currently master of Emmanuel College, Cambridge, succeeds former MI5 director general Lord Jonathan Evans – who completed his five-year term at the helm of the independent body at the end of October.

CSPL was created in 1994 as part of the John Major government’s response to persistent sleaze allegations in Westminster, including the “cash for questions” scandal.

It conducts broad inquiries, collecting evidence to assess institutions, policies and practices and makes recommendations to the prime minister.

The committee promotes the “seven principles of public life”, developed by its founding chair Lord Michael Nolan and also known as the Nolan principles. The seven tenets are: selflessness, integrity, objectivity, accountability, openness, honesty, and leadership.

The watchdog’s work has, in recent years, increasingly considered how these principles can be maintained as government and public services come to rely on artificial intelligence and automation. The committee published a major report three years ago setting out a range of measures through which standards can be protected against risks created by AI.

At last week’s pre-appointment hearing, Chalmers said he was keen to build on this work.

“The committee did a really good report on artificial intelligence and public standards back in 2020,” he said. “It was quite prescient, but that whole world is moving pretty fast, so at some time in my tenure, a return to it would be wise. There is the issue of how some of these machine learning tools are steered to learn, and the foundation models that govern them. Can they be coded in such a way that things like the Nolan principles can be put in there, so that when they bounce across, they know they have to revert back? It can be done elsewhere.”

Chalmers said the coming five years would see “more and more aspects of public life” using machine-learning tools to help with decision-making. He said a further area of interest would be the “trustworthiness” of data fed into the machines for that purpose and what AI did when there were gaps in the data.

Chalmers said the Nolan principles were needed to govern decision making, and did not see any reason why machine-learning tools should not be governed “in a similar manner”.

Chalmers served in the Army from 1984 to 2021, latterly as deputy chief of defence staff, responsible for military strategy and operations.

He told the pre-appointment session that he had first encountered the Nolan principles in around 2007-2008 when he was studying for a master of philosophy degree and researching a thesis on the problems of coordinating government departments.

While PACAC members endorsed Chalmers’s proposed appointment at the hearing, they also criticised ministers for “once again” failing to appoint a candidate to an important public role before the end of their predecessor’s term.

“Public appointments are normally for fixed five-year terms, meaning the date when a new candidate will be needed is known in advance and plenty of time is available in which to ensure that the recruitment process for a successor can be completed,” they said.

“Government should make a greater effort to complete the entire appointments process within the five-year window afforded it by current fixed-term appointments.”

Sam Trendall

