Government should establish new digital authority to end ‘fragmented oversight’ of big tech, says Lords committee
Self-regulation by online platforms ‘clearly failing’ and regulatory framework ‘out-of-date’
A new ‘digital authority’ should be established by the government to co-ordinate existing technology regulators with additional powers to fill gaps, and report to both government and parliament, according to the Lords Communications Committee.
The committee's Regulating in a digital world report stated that over a dozen UK regulators have a remit covering the digital world but there is no body which has complete oversight. It believes that this has left regulation of the digital environment fragmented, with gaps and overlaps, and this has been compounded by big tech companies failing to adequately tackle online harms.
The committee recommended that a new digital authority, guided by 10 principles, should inform regulation of the digital world. These principles include accountability, transparency, and respect for privacy and freedom of expression. The committee believes the principles will help the industry, regulators, the government and users all work towards making the internet a better, more respectful environment. If rights were infringed, the party responsible would be held accountable in a fair and transparent way, it said.
Among its core recommendations, the Lords committee called for Ofcom’s remit to be expanded to include responsibility for enforcing a duty of care on online services that host and curate content that can be uploaded and accessed by the public. In addition, it wants online platforms to make community standards clearer through a new classification framework, and to invest in more effective moderation systems to uphold those standards. For years, Facebook and Twitter have been called on to improve the way they moderate content, including hateful speech.
The Lords committee also said users should have greater control over the collection of personal data, and that data controllers and processors should be required to publish an annual data transparency statement detailing which forms of behavioural data they generate or purchase from third parties. It called on the government to empower the Information Commissioner’s Office to conduct impact-based audits where the risks associated with using algorithms are greatest.
"The government should not just be responding to news headlines but looking ahead so that the services that constitute the digital world can be held accountable to an agreed set of principles,” said the chairman of the committee, Lord Gilbert of Panteg.
Lord Gilbert said that self-regulation by online platforms was “clearly failing” and that the current regulatory framework was out of date.
“The evidence we heard made a compelling and urgent case for a new approach to regulation. Without intervention, the largest tech companies are likely to gain ever more control of technologies which extract personal data and make decisions affecting people's lives,” he said.
“Our proposals will ensure that rights are protected online as they are offline while keeping the internet open to innovation and creativity, with a new culture of ethical behaviour embedded in the design of services.”