The dawn of new tools has led to an enormous rise in harmful images being created and shared, with campaigners claiming that big tech firms ‘have favoured profit over safety’
Academic research has uncovered an annual rise of 1,325% in AI-generated images of child sexual abuse.
The report, published by the Childlight Global Child Safety Institute, hosted by the University of Edinburgh, highlights the increasing dangers of deepfakes: images or videos of underage individuals created without their consent.
Yearly reports of these images logged by the US-based National Center for Missing and Exploited Children rose from 4,700 in 2023 to more than 67,000 last year.
The report shows that over 55% of assessed child sexual abuse material is produced by relatives.
“People often say home is where the heart is – but sadly for too many children, home is where the hurt is,” said Childlight chief executive Paul Stanfield. “We see betrayal of trust by those known to children on a vast scale, compounded by insufficient protections by tech companies and regulators to avoid digital crime scenes in children’s bedrooms.”
Related content
- Government unveils laws for AI-created abuse material
- Home secretary unveils new tech tools to help police combat child abuse
- National Crime Agency plans digital ‘front door’ for tech firms to report child abuse
The study found that nearly one in five children in western Europe reported experiencing unwanted sexual interactions online before turning 18. These include solicitation attempts, grooming and pressured sexual acts. The data suggests that nearly 15 million children in the region are affected, with one in seven children reporting that they had experienced unwanted sexual interactions in the past year.
The report says the “deliberate commercially led choices” of major technology companies are making it harder to prevent these crimes.
When she was 13, Rhiannon-Faye McDonald was abused after being approached by a man posing as a fellow teenager online. Soon afterwards, he arrived at her address in Yorkshire and abused her in person. Today, McDonald campaigns through the Marie Collins Foundation for better online safety regulations.
“For too long technology companies have favoured profit over safety,” McDonald said. “A rising number of children being abused is a direct result. For most victims and survivors, even with the right support, the impacts are significant and long-lasting. We live with misplaced self-blame and the fear of being recognised by those who have seen the images or videos of our abuse. For anybody who believes that it’s ‘just a photo’, this couldn’t be further from the truth.”

A version of this story originally appeared on PublicTechnology sister publication Holyrood