As the Online Safety Act comes fully into effect, Ofcom has published guidelines for the digital industry, warning those that ‘come up short’ to expect consequences
The Online Safety Act has this week come into force, with tech firms given three months to comply with the new rules or face the prospect of watchdog Ofcom using “the full extent of our enforcement powers against them”.
As the new laws take effect, Ofcom has published its first illegal harms code of practice and guidance, kicking off “a much wider package of protections”, which will make 2025 “a year of change”, the watchdog said.
The publication of the guidelines marks “a major milestone” in creating a safer online world, introducing 40 new safety measures “explicitly” designed to tackle online grooming and protect children from harm.
Platforms now have three months to assess the risk of their users encountering illegal content, and then implement any necessary safety and mitigation measures.
From March, children’s profiles and locations must not be publicly visible, should not appear on lists of people users might wish to add, and non-connected accounts must not be able to send them direct messages.
Under the new measures, tech firms will have to appoint a senior person accountable for their compliance with the rules on managing illegal content, and will have to make reporting functions easier to access. Sites are also required to improve testing of their algorithms to prevent the spread of harmful content, and to deploy automated tools such as hash-matching and URL detection, which are designed to speed up the identification of child sexual abuse material.
The new measures also aim to protect women and girls by ensuring apps take down non-consensual intimate images, also known as revenge porn, and remove posts by organised criminals who are coercing women into prostitution.
Sites and apps are also expected to establish a dedicated reporting channel for organisations with fraud expertise, allowing them to flag known scams to platforms in real time so that action can be taken.
Posts generated, shared, or uploaded via accounts operated on behalf of terrorist organisations proscribed by the UK government will also amount to an offence.
Ofcom chief executive Melanie Dawes warned that any tech firms that “come up short can expect Ofcom to use the full extent of our enforcement powers against them”.
“For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today,” she added. “The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”