Ofcom proposes new online harms requirements for tech firms


The watchdog for online safety has opened a consultation in which the public and industry stakeholders are invited to give their views on measures intended to prevent harmful content spreading

Tech companies could face new demands to block illegal content and introduce more restrictive livestreaming services as Ofcom looks to stay in line with “evolving” online dangers.

The communications watchdog has launched a consultation seeking views on a raft of new measures that aim to keep UK users, especially children, safe from online harms. The new proposals focus on stopping illegal content from going viral, ensuring platforms are safer by design and strengthening the protection of children during livestreams.

The proposed measures would require platforms to have protocols in place to respond to spikes in illegal content during a crisis and to prevent their recommender systems from spreading material that might be illegal. Sites would also have to introduce hash matching – a technique that uses digital fingerprints to match posts against known illicit material – to identify terrorism content and intimate images shared without consent, such as explicit deepfakes.

For instance, when tackling child sexual abuse material (CSAM), hash matching can identify exact duplicates of already reported content before they are re-uploaded to a platform.
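The exact-duplicate matching described above can be sketched in a few lines, assuming a simple block list of cryptographic hashes (the function names and sample data here are illustrative, not a real deployment; production systems typically pair exact hashes with perceptual hashes such as PhotoDNA so near-duplicates are also caught):

```python
import hashlib

# Block list of fingerprints for previously reported content.
# In practice this would be a shared industry database, not an in-memory set.
known_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Return a SHA-256 digest of the file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def register_reported(data: bytes) -> None:
    """Add a reported file's fingerprint to the block list."""
    known_hashes.add(fingerprint(data))

def is_known_duplicate(data: bytes) -> bool:
    """Check an upload against the block list before it is published."""
    return fingerprint(data) in known_hashes

# Illustrative usage with placeholder bytes:
register_reported(b"previously reported image bytes")
print(is_known_duplicate(b"previously reported image bytes"))  # True
print(is_known_duplicate(b"new, unseen image bytes"))          # False
```

Because a cryptographic hash changes completely if even one byte differs, this approach only catches exact re-uploads, which is why the article notes it identifies "exact duplicates of already reported content".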

The new measures would also require platforms to assess the role automated tools can play in detecting harmful posts, including CSAM, content promoting suicide and self-harm and fraudulent material.

Livestreams would have to be under continuous review from human moderators, and users would be banned from commenting, reacting, or sending gifts to children’s livestreams as well as from recording them.

Oliver Griffiths, Ofcom’s online safety group director, said: “Important online safety rules are already in force and change is happening. We’re holding platforms to account and launching swift enforcement action where we have concerns. But technology and harms are constantly evolving, and we’re always looking at how we can make life safer online. So today we’re putting forward proposals for more protections that we want to see tech firms roll out.”

The consultation is set to close on 20 October, with a final decision due to be published by summer 2026. The measures have already come under fire from campaigners for not going far enough.

The Molly Rose Foundation, an organisation set up in memory of 14-year-old Molly Russell, who took her own life after viewing self-harm content online, said in a post on X: “Ofcom’s new measures will not address the current levels of harm or major new suicide and self-harm threats. It’s time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on.”

A version of this story originally appeared on PublicTechnology sister publication Holyrood

Sofia Villegas