Government warned of risks of letting AI firms influence policy

In light of a new report from the Ada Lovelace Institute, experts and shadow ministers have expressed concern that big tech companies could be allowed to push for favourable regulation

Government needs to do more to ensure artificial intelligence technology is used safely, a report from specialist research body the Ada Lovelace Institute has warned.

Released this week, the Regulating AI in the UK study argues that existing frameworks will not be enough to protect people from harm, and that large technology companies have too much power over the development of government policy on the subject.

The government is keen to portray itself as a world leader on AI: the foreign secretary chaired the first UN meeting on AI on Tuesday, a cabinet minister is giving a speech on adapting the civil service to modern technologies on Wednesday, and in September, the UK will host the first global summit on AI to discuss how it can be developed and adopted safely.

Labour’s shadow digital minister Alex Davies-Jones told PublicTechnology sister publication PoliticsHome that she was concerned that “big players” in the tech sector would be “dictating” discussions at the summit and having too much influence over government policy.


“Those involved in this summit need to be people from civil society; it has to be researchers, academics, those who speak for the country and for people,” she said. “If we haven’t got those voices represented, then of course it’s going to be skewed, of course it’s just going to be the big players dictating what the regulations should be within their favour; not what’s best for the public, not what’s best for society at large.”

Speaking at a webinar to launch the report, UK public policy lead at the Ada Lovelace Institute Matt Davies agreed that industry players were exerting heavy influence on government policy.

“The government is largely reliant on external expertise from industry in order to understand these systems – both trends and the industry as a whole, and also specific systems and specific models,” he said. “Obviously, dialogue with industry is always going to be an important component of effective governance, but we think there are some risks in over-optimising regulation to the needs and perspectives of incumbent industry players.”

The report set out a number of recommendations for the government’s regulation of AI, including three main tests relating to coverage, capability, and urgency: how well protections extend across different areas where AI is deployed; how well resourced regulators are to carry out their duties in relation to AI; and whether urgent action is required from government to respond to accelerating AI adoption.

Other proposals include exploring the value of establishing an ‘AI ombudsman’ to support people affected by AI, introducing a statutory duty for legislators to comply with the government’s principles on AI, increasing funding to regulators for responding to AI-related harms, and creating formal channels to allow civil society organisations to meaningfully feed into future regulatory processes – to ensure it is not only tech corporations that are able to do so.

A government spokesperson said: “As set out in our AI White Paper, our approach to regulation is proportionate and adaptable, allowing us to manage the risks posed by AI whilst harnessing the enormous benefits the technology brings. We have been clear that we will adapt the regulatory framework and take targeted further intervention where we see risks that need to be mitigated.

“We have also announced £100m in initial funding to establish the Foundation Model Taskforce to strengthen UK capability and ensure safety and reliability in the specific area of foundation models and will host the first major global summit later this year which will allow us to agree targeted, rapid, internationally co-ordinated action to allow us to safely realise AI’s enormous opportunities.”

Click here to read the full story on PoliticsHome

Zoe Crowther

