The children’s charity said data provided by 45 UK police forces showed 7,062 sexual communication with a child offenses were recorded in 2023-24, an 89% increase on 2017-18, when the offense first came into force.
Where the means of communication was recorded – 1,824 cases – social media platforms were often used, with Snapchat named in 48% of those cases.
Meta-owned platforms also proved popular among offenders, with WhatsApp mentioned in 12% of these cases, Facebook and Messenger in 12% and Instagram in 6%.
In response to the figures, the NSPCC called on online regulator Ofcom to strengthen the Online Safety Act.
It says there is currently too much focus on acting after harm has occurred, rather than being proactive in ensuring that the design of social media platforms does not contribute to abuse.
The charity also called on the government to do more to prevent the sexual abuse of children in private messages.
Sir Peter Wanless, chief executive of the NSPCC, said: “One year on from the Online Safety Act becoming law, we’re still waiting for tech companies to make their platforms child-safe.
“We need ambitious regulation from Ofcom, which must significantly strengthen its current approach to getting companies to deal with how their products are being exploited by offenders.
“It is clear that much of this abuse is taking place in private messages, which is why we also need the Government to strengthen the Online Safety Act to give Ofcom more legal certainty to tackle child sexual abuse on services such as Snapchat and WhatsApp.”
Minister for Safeguarding and Violence Against Women and Girls Jess Phillips said: “Child sexual abuse is a despicable crime that causes long-term trauma to victims, and the law is clear – creating, possessing and distributing images of child sexual abuse, and grooming a child, are illegal.
“I met with heads of law enforcement and the NCA (National Crime Agency) just last week to hear about the tremendous work they are doing to bring these offenders to justice.
“Social media companies have a responsibility to stop this heinous abuse from happening on their platforms.
“Under the Online Safety Act, they will have to stop this kind of illegal content being shared on their sites, including on private and encrypted messaging services, or face significant fines.
“The shocking case of Alexander McCartney, who groomed over 3,500 children single-handedly, demonstrates more clearly than ever that they need to act now, not wait for enforcement action from the regulator.”
A Snapchat spokesperson said: “Any sexual exploitation of young people is appalling and illegal and we have zero tolerance for it at Snapchat.
“If we identify such activity or it is reported to us, we remove the content, disable the account, take steps to prevent the offender from creating additional accounts, and report them to the authorities.
“We have additional protections, including in-app warnings, to make it harder for teens to connect with strangers, and our in-app Family Center lets parents see who their teens are talking to and who their friends are.”
An Ofcom spokesman said: “From December, tech firms will be legally required to start taking action under the Online Safety Act and will have to do much more to protect children.
“Our draft codes of practice include robust measures that will help prevent grooming by making it harder for perpetrators to come into contact with children.
“We stand ready to use the full extent of our enforcement powers against any companies that fall short when the time comes.”