Online child sexual abuse in England and Wales has risen by 26 percent in a single year, according to new police figures, prompting renewed calls for major technology platforms to strengthen their safety measures.
Police leaders are urging social media companies to deploy AI tools that automatically block indecent images before they are uploaded or shared. Senior officers also argue that technology used by children should include built-in protections, such as phones that allow access only to safe platforms and websites.
New national data shows that 122,768 child sexual exploitation offences were recorded in 2024, marking a 6 percent rise from the previous year. Online abuse accounted for 51,672 offences, representing 42 percent of the total. Half of all recorded cases involved child-on-child offending among 10- to 17-year-olds, with the most common behaviour being the sharing of indecent images.
Experts say online exploitation is the fastest-growing threat facing young people. However, it remains unclear whether the increase reflects greater reporting from platforms ahead of the Online Safety Act coming into force or a genuine rise in offending. Research from organisations such as the Youth Endowment Fund suggests the trend is worsening.
Another emerging risk for teenagers is sextortion, where offenders threaten to release sexual images to extort money or further content. The scale of this issue remains difficult to measure.
Children’s charities are urging the government to commission a national study to better understand how young people experience online harm beyond police-recorded crime.
Police figures identify Snapchat as the platform most frequently linked to reported child exploitation and abuse offences, accounting for 54 percent of reports. WhatsApp and Instagram each represented around 8 percent, with encrypted messaging contributing to rising numbers on WhatsApp. Facebook has declined in relevance as its user base ages.
Officers say Snapchat provides the highest level of reporting to law enforcement, while platforms such as TikTok and X show significantly lower reporting levels. They note disparities in how proactive different companies are in identifying and removing harmful content.
Charities describe the newly published reports on child sexual abuse and exploitation as the most comprehensive to date, though they believe only one in ten such crimes is actually reported.
A second report examined group-based child sexual abuse, including grooming gangs. In 2024, group-based offending accounted for 3.6 percent of all offences, with 4,450 cases recorded. Approximately 17 percent of these were linked to grooming gangs, 32 percent occurred within families, and 24 percent involved child-on-child abuse.
The data also showed that 78.03 percent of identified offenders were White British, compared with 74.4 percent of the UK population. Offenders of Pakistani origin accounted for 3.94 percent, compared with 2.7 percent of the national population.
Police leaders say the findings highlight a rapid shift of offending into digital spaces and stress the need for stronger action from government, law enforcement, industry, and civil society to stop harm before it reaches children.
