Law proposes ‘explicit duty’ for online firms to design services to prevent illegal content

Government continues to tweak and reinforce the provisions of the Online Safety Bill

Credit: Robinraj Premchand/Pixabay

Proposed new laws will introduce an “explicit duty” for firms to design websites and services in a way that mitigates the risk that their platforms will host illegal activity or content.

Government has announced some small – but significant – tweaks to its Online Safety Bill. If it passes into law, the bill will give the UK one of the world’s most far-reaching legal frameworks for the online world, including some of the biggest potential punishments for breaches.

The latest amendments relate to the duties placed on companies in respect of illegal content and activity on their platforms.

A primary intention of the changes is to make it clear that, as far as possible, preventing illegality and harm should be built into websites and digital services as part of their design process. This includes consideration of online grooming and other instances where the platform in question is not the location of the crime but may play a key role in enabling its perpetration.

“The amendments introduce an explicit duty on companies relating to the design of their services,” said a government fact sheet. “Companies must take a safety-by-design approach, managing the risk of illegal content and activity appearing on their services, rather than focusing primarily on content moderation.”


It added: “In addition, these amendments make clear that platforms have duties to mitigate the risk of their service ‘facilitating’ an offence, including where that offence may actually occur on another site – such as can occur in cross-platform child sexual exploitation and abuse (CSEA) offending – or even offline. This addresses concerns raised by stakeholders that the bill will not adequately tackle activities such as breadcrumbing, where CSEA offenders are posting links or having conversations on a particular site, which are preparatory to a CSEA offence, which then might occur on a different platform, or even offline.”

Although the changes are intended to ensure that companies try to prevent crime from taking place on their platforms – rather than just clamping down after it has been detected – further tweaks have also been made to clarify obligations regarding reactive moderation, and “how providers should determine whether content amounts to illegal content”.

An additional clause has been added to the bill with the aim of ensuring that “providers’ systems and processes should consider all reasonably-available contextual information when making judgements” about possible illegal content – including referring to other duties placed on them by other parts of the Online Safety Bill.

The clause also includes specific provisions aimed at ensuring websites implement appropriate processes to prevent the publication of fraudulent adverts.

These processes should support companies to “ascertain whether, on the basis of all the reasonably available information, there are reasonable grounds to infer that: a) all the relevant elements of [an] offence, including the mental elements, are present; and/or b) no defence is available”, the fact sheet said.

Another newly introduced clause places a requirement on Ofcom, which will serve as the UK’s online harms regulator, to publish guidance on how online firms should make judgements about possible illegal content.

“We expect this will include examples of the kind of contextual and other information that is likely to be relevant when drawing inferences about mental elements and defences, and how far providers should go in looking for that information,” the government said.

Social networks – such as Facebook, Twitter, and TikTok – would be among those most heavily impacted by the new laws, if and when they come into effect.

Failure to meet their legal obligations could see companies punished by fines of up to 10% of their global turnover. This would equate to more than £5bn in Facebook’s case, or about £250m for Twitter. Senior managers at firms in breach of the laws could also be sentenced to two years in prison.

Amendments to the Online Safety Bill are currently being considered by MPs, who will then debate and vote on the bill at its third reading. If it passes this stage, it will move on to the House of Lords.

The latest tweaks were made following “engagement with a broad range of stakeholders”, according to the government.


Sam Trendall
