Ofcom will have power to penalise sites that fail to remove harmful content
Social media firms could be fined billions of pounds for not protecting users, under proposed new UK legislation.
The government’s Online Harms Bill, which will be brought forward next year, would give Ofcom the power to fine companies up to £18m or 10% of global turnover, whichever is higher, for failing in their duty of care to users.
In the case of Facebook, this figure would equate to more than £5bn. Twitter could theoretically face a penalty of around £250m.
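As a rough illustration of how that cap is calculated, the maximum penalty is the greater of the flat £18m figure and 10% of global annual turnover. The sketch below assumes approximate turnover figures converted to pounds for illustration only; they are not official numbers.

```python
# Proposed fine cap: the greater of a flat GBP 18m
# or 10% of global annual turnover.
def max_fine(global_turnover_gbp: float) -> float:
    """Return the maximum fine under the proposed cap."""
    return max(18_000_000, 0.10 * global_turnover_gbp)

# Assumed, approximate annual turnover in pounds (illustrative only).
print(f"Facebook: £{max_fine(52_000_000_000):,.0f}")  # roughly £5.2bn
print(f"Twitter:  £{max_fine(2_500_000_000):,.0f}")   # roughly £250m
```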
Such fines could be levied for failing to remove harmful or illegal content, including abuse, incitements to terror, or material encouraging self-harm or suicide.
In addition to financial penalties, the new law would give the communications watchdog the power to block online platforms that do not protect users, particularly children and other vulnerable people.
Social media sites, websites, apps and other services which host user-generated content or allow people to talk to others online will need to remove and limit the spread of illegal content such as child sexual abuse, terrorist material and suicide content.
As well as levying fines, the regulator would be able to require tech giants such as Facebook to publish audits of their efforts to tackle harmful posts.
The bill will also allow Ofcom to demand that tech firms take action against child abuse imagery shared via encrypted messages, even if the apps are designed to ensure privacy.
The government said tech platforms will need to do far more to protect children from being exposed to harmful content or activity such as grooming, bullying and pornography.
The bill does not include criminal liability for senior executives at firms that break the rules, but it does allow that to be brought in later through secondary legislation if the initial measures do not work.
Digital secretary Oliver Dowden told parliament the legislation represented “decisive action” to protect both children and adults online.
He said: “A 13-year-old should no longer be able to access pornographic images on Twitter, YouTube will not be allowed to recommend videos promoting terrorist ideologies and anti-Semitic hate crimes will need to be removed without delay.”
In a statement after the announcement, Dowden said: “I’m unashamedly pro tech but that can’t mean a tech free for all. Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation. We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.”
He added: “This proportionate new framework will ensure we don’t put unnecessary burdens on small businesses but give large digital businesses robust rules of the road to follow so we can seize the brilliance of modern technology to improve our lives.”
Ofcom’s chief executive, Dame Melanie Dawes, said: “We’re really pleased to take on this new role, which will build on our experience as a media regulator. Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression. We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with parliament as it finalises the plans.”