Online Safety Bill’s lack of action on disinformation ‘leaves a problem for the future’, critics warn

As it enters the final stages before passing into law, many parties continue to criticise what they see as significant gaps in the measures contained in the government’s internet safety legislation

As it reaches the final stages of the parliamentary process, the Online Safety Bill retains many fierce critics, including those who warn that the law’s lack of measures to tackle misinformation and disinformation has simply “left a problem for the future”.

The bill seeks to legislate for social media companies to take responsibility for harmful and illegal content online and make it safer for users – particularly children – to experience the internet. The legislation, which began life as a white paper in 2019, returned to the House of Commons last week for final debates on the Lords’ amendments. It is expected to become law by the end of this year.

In its final stages, there are few opportunities remaining for MPs and peers to add further amendments, but leading figures involved in the bill are concerned that “huge gaps” remain, and say that campaigning for stronger protections will continue even after it is passed.

Full Fact, a charity that campaigns to tackle misinformation, is also concerned that the bill has not taken the chance to effectively tackle online disinformation – which is typically defined as the deliberate dissemination of false information by hostile actors – and misinformation, which is the inadvertent spreading of falsehoods by individuals.


Glen Tarman, head of advocacy and policy at Full Fact, told PublicTechnology sister publication PoliticsHome that the bill had been a “huge missed opportunity”.

“Full Fact is deeply disappointed in the failure of the bill to introduce adequate regulation to address harmful misinformation and disinformation, and also to protect freedom of expression. So the Online Safety Bill has been a huge missed opportunity,” he said. “We were pressing that the social media companies should have some obligation to undertake media literacy, and we were disappointed that it will only be voluntary: they will be able to walk away and that’s another missed opportunity.”

Tarman also complained that the government had U-turned on including protections for health misinformation.

“That would have required the platforms to have clear policies on harmful health information,” he said.

Instead, the bill will introduce a duty to take certain content down, but Full Fact argues that signposting and direction to correct information are needed to ensure the safety of internet users seeking health advice.

“We’ve ended up with something that’s not fit for purpose,” Tarman said. “It’s left a problem for the future. The power is still within the social media companies, without proper oversight or regulation, to make choices about what we read and see and can say online. Not dealing with harmful misinformation and disinformation adequately simply kicks a problem further up the road that we had every opportunity to address.”

Alex Davies-Jones, the shadow minister for technology and the digital economy, addressed parliament as the bill returned to the Commons last week, and claimed that it makes “no effort” to future-proof or anticipate emerging harms as new technologies develop.

As the bill still focuses on content rather than social media platforms’ business models, she said it “may not go far enough”, and called for the government to commit to a review of the legislation within the next five years to make the UK “the safest place in the world to be online”.

“This is not the end,” she insisted, adding that Labour will continue to push for the bill to be passed in good time and reviewed thereafter.


The full version of this article originally appeared on PublicTechnology sister publication PoliticsHome, and can be read here

Zoe Crowther
