The UK’s telecoms and online harms watchdog has set out the steps companies must take in the coming months to comply with new legislation
Online sites will have to introduce “transformational” new child-safety measures or risk being shut down in the UK, Ofcom has announced.
Social media apps and other platforms have been given a checklist of “40 practical measures” to take in order to comply with the Online Safety Act. These include ensuring that algorithms filter out harmful content and introducing robust age checks to identify users under the age of 18.
Technology firms will have until 24 July to comply with the watchdog’s new code of practice.
Under the new protections, tech companies will also have to offer “supportive information” to those who have encountered or searched for harmful content, and streamline the process for users to report posts. By July, all firms must also appoint a person responsible for children’s safety and create a senior body to review the management of risk to children each year.
Platforms that fail to comply with the new duties risk fines of up to 10% of their global turnover and, in extreme cases, could be shut down in the UK.
The introduction of the rules follows a consultation on online safety proposals that received tens of thousands of responses from children and parents, as well as input from experts and other stakeholders. Participants called for better control over users’ online feeds, options to decline invitations to unwanted group chats, and stronger content-management systems.
Melanie Dawes, Ofcom chief executive, said: “These changes are a reset for children online. They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. Ofcom has been tasked with bringing about a safer generation of children online, and if companies fail to act they will face enforcement.”
Earlier this month, Labour MP Gregor Poynton, who chairs the All-Party Parliamentary Group on Children’s Online Safety, urged Ofcom to partner with big tech firms to ensure apps have as much data as possible when verifying a potential user’s age.
Speaking to PublicTechnology sister publication Holyrood, he said: “Apple and Google should look at whether, when someone downloads an app that is for 18 and over…could they flag that that person might be under 18? Therefore, they could then get sent on a different journey for age verification, which is perhaps longer and more in-depth.”

A version of this story originally appeared on PublicTechnology sister publication Holyrood