Digital secretary Jeremy Wright tells web giants that ‘the era of self-regulation is over’
Social media companies will face massive fines or be blocked in the UK altogether if they fail to remove harmful or illegal content from their platforms.
Prime minister Theresa May said the new measures, outlined in the government’s Online Harms White Paper, would ensure firms act to stop child abuse and terrorist content from being spread in a bid to make the UK “the safest place in the world to be online”.
Content that encourages suicide or that is deemed disinformation or cyber-bullying will also be targeted, while new measures to stop children from accessing inappropriate material will be brought in.
Among the rules proposed in the initial 12-week consultation is a mandatory ‘duty of care’, which will force companies to take “reasonable steps” to keep users safe and tackle illegal and harmful activity on their platforms.
A new regulator with enforcement tools is set to be handed the power to impose “substantial fines”, or in the worst cases to block access to sites and potentially take action against company bosses.
Sites could also be obliged to hand over “annual transparency reports” detailing the amount of harmful content on their platforms and what they are doing to address it.
Codes of practice issued by the regulator will meanwhile outline a responsibility for companies to “minimise the spread of misleading and harmful disinformation with dedicated fact checkers”, particularly during election periods.
Furthermore, firms will be obliged to respond to users’ complaints quickly.
Announcing the plans, May said that companies had “for too long” neglected to protect users, especially children and young people.
“That is not good enough, and it is time to do things differently. We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe. Online companies must start taking responsibility for their platforms, and help restore public trust in this technology.”
‘Voluntary actions have not been applied consistently’
File-hosting sites, public discussion forums, messaging services, and search engines will also fall under the remit of the new rules.
Secretary of state for digital, culture, media and sport Jeremy Wright said of the plans: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough. Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users. However, those that fail to do this will face tough action.”
But shadow culture secretary Tom Watson said the plans set out in the white paper would not address “market failure”.
The Labour deputy leader warned: “Labour have been calling for a new regulator with tough powers to bring social media companies into line for the last year. The public and politicians of all parties agree these platforms must be made to take responsibility for the harms, hate speech and fake news they host. The concern with these plans is that they could take years to implement. We need action immediately to protect children and others vulnerable to harm.”
He added: “These plans also seem to stop short of tackling the overriding data monopolies causing this market failure and do nothing to protect our democracy from dark digital advertising campaigns and fake news. This is a start but it’s a long way from truly reclaiming the web and rooting out online harms.”