With just a few weeks until the UK’s new internet safety laws come into effect, the head of the regulator charged with enforcing them is planning for a ‘pivotal year’
Ofcom has announced a timeline for the implementation of the Online Safety Act, with 2025 singled out as a “pivotal year” in the UK’s ambitions to create a safer world for users.
With two months left until the act comes into force, the watchdog has warned social media firms it will “come down hard” on those who fail to comply with the act, imposing “significant” fines, limiting their access to payment providers or advertisers, and even banning their services in the UK.
Melanie Dawes, Ofcom’s chief executive, said: “The time for talk is over. From December, tech firms will be legally required to start taking action, meaning 2025 will be a pivotal year in creating a safer life online. We’ve already engaged constructively with some platforms and seen positive changes ahead of time, but our expectations are going to be high, and we’ll be coming down hard on those who fall short.”
The act, which received Royal Assent a year ago, requires social media firms to protect children from harmful content such as self-harm and violent material. It is understood firms which fail to meet their responsibilities could face fines of up to 10% of their global turnover.
Related content
- Ofcom requires £113m extra investment and 100 new staff to deliver online safety regulation
- Online Safety Act: Research claims social sites are failing to remove harmful content
- Senior tech execs could face prison for breaching online safety laws
The timeline for implementing the law in the coming months outlines how the regulator will ensure companies comply with the act, beginning with the December publication of the “first edition” of its illegal harms codes and guidance. From that point, firms will have up to three months to complete a risk assessment.
In January, Ofcom will finalise its children’s access assessment guidance and its guidance for pornography providers on age assurance, and platforms will have until April to assess whether their service is likely to be accessed by children. A month later, Ofcom will consult on its best-practice guidance on protecting girls online, and in April 2025 it will finalise its children’s safety codes and guidance, with companies having until July to complete the children’s risk assessment.
Later in 2025, the regulator is expected to consult on additional measures for the second edition codes and guidance.
To date, the watchdog says it has already secured improved protective measures from UK-based video-sharing platforms, including OnlyFans, which has introduced age verification, and Twitch, which has introduced measures to stop children from seeing harmful videos.
Instagram, Facebook and Snapchat have also implemented measures to help prevent children being contacted by strangers, according to Ofcom.
The regulator said these were “positive steps” but warned “many platforms will have to do far more” when the act comes into force in December.

A version of this story originally appeared on PublicTechnology sister publication Holyrood