A dedicated covert team of specialised officers focused on tackling sexual abuse against children perpetrated online tells MPs that offenders operate with ‘impunity’ on what has become a ‘favoured platform’
A covert law-enforcement unit specialised in tackling online child sexual abuse has warned that TikTok is “fostering an environment that promotes risk-taking behaviour in children” and has become a “favoured platform” for abusers, who currently operate with “impunity”.
The Online CSEA (Child Sexual Exploitation and Abuse) Covert Intelligence Team – known as OCCIT – recently sent a memorandum headed “TikTok abuse” to parliament’s Science, Innovation and Technology Committee.
The document advises MPs on the committee that “covert law enforcement operations” led by the unit have noted that, in light of the “risk-taking behaviour” enabled by the video-sharing platform, “online sex offenders have been actively misusing TikTok to locate and abuse victims, whilst simultaneously networking with other offenders”.
The ability to operate with “impunity” means that many child sexual abuse perpetrators now see TikTok as a preferred platform for finding victims, as well as for committing crimes of blackmail and extortion, officers have found. The platform, they have concluded, is not merely enabling this activity but “currently promotes it”.
“Videos of children are downloaded from TikTok and shared across both the clear and dark web,” the memo says. “This includes footage from livestreams, which offenders record whilst interacting with the victim.”
Videos are often shared across groups with as many as 10,000 members, the memo says. Livestreams are supposed to be restricted to users aged 18 or over, but “children as young as five years old have been observed using these facilities whilst engaging with adults who are not known to them”, according to OCCIT.
“Abuse and the sexualisation of children is frequently noted taking place on the platform itself,” according to the memo. “Within just a few days of reviewing offender behaviours on the platform, OCCIT noted hundreds of accounts dedicated to the sexualisation of children – many of which specifically focussed on those from the UK.”
These accounts “are not hidden and are easily located when searched for [and] they market themselves ostentatiously with some seeking to monetise child abuse content”.
Through its investigation, the specialised UK law-enforcement unit has concluded that “concerningly, the TikTok algorithm, recommendations and search tools facilitate the abuse and sexualisation of children”.
“A short time spent interacting with the profiles noted above resulted in TikTok recommending similar profiles or those belonging to real children,” OCCIT says. “It also recommended sexual search terms used by other users. When scrolling through suggested videos, TikTok presented content of real children, many of which with identifiable features such as school uniforms and home addresses.”
The memo adds: “In addition to networking with one another, offenders operate with impunity when it comes to contacting children on TikTok… Offender discussions have shown that TikTok is a favoured platform for locating victims of online sexual exploitation. This has shown to include blackmail and extortion.”
Self-generated abuse
The anti-abuse policing unit also found that “a further alarming practice being increasingly enabled by TikTok is the growing trend of ‘self-generated CSAM’ (child sexual abuse material) being monetised by children”.
“Children are being observed creating sexual content of themselves and selling it to online sex offenders,” the memo says. “These practices are taking place on the TikTok platform, where children anticipate and prepare for account bans/closures. Children are increasingly observed marketing TikTok profiles that subsequently direct offenders to platforms such as Telegram, where they make explicit sexual content to order. This also includes content of self-harm.”
The document adds: “This has been noted to escalate all the way up to the most explicit sexual acts being undertaken by children on TikTok live streams, which are recorded and subsequently shared by online sex offenders. Many children engaging in these increasingly concerning practices are eventually subject to blackmail and extortion by adult sex offenders. As a result, they are subject to the most extreme and humiliating abuse imaginable.”
Following its investigations, OCCIT has reached a stark conclusion.
The memo to MPs ends: “From an algorithm that promotes sexualised content of children, to safety features that are failing to detect both offenders and victims. TikTok doesn’t just enable online sexual abuse. It currently promotes it.”
In response to enquiries from PublicTechnology, a spokesperson for TikTok said: “CSAM is abhorrent and categorically prohibited on our platform. We invest significantly in combatting exploitation and staying ahead of bad actors through proactive detection technology and specialist teams, and we take deliberate design decisions that make our platform hostile to predators.”
The company further indicated that it has met with OCCIT and asked the unit to provide more detail about its findings. The firm also claimed to have investigated the content identified in the unit’s operations and to have removed anything found to be in contravention of the platform’s policies.
TikTok also regularly engages with other agencies focused on online abuse, including the National Crime Agency’s Child Exploitation and Online Protection Centre, the company claimed.
The firm added that it works with other tech companies and has deployed tools including Google’s Content Safety API and CSAI Match, as well as Microsoft’s PhotoDNA technology.