The UK’s National Society for Prevention of Cruelty to Children (NSPCC) has recorded a staggering 82% increase in the number of online child grooming cases in the last five years.
Its research suggests that, out of almost 34,000 grooming incidents reported to UK police, around 74% involved Snapchat or Meta platforms.
As the UK government prepares to make its final decisions next month on the Online Safety Bill, which is aimed at protecting children online, the NSPCC is urging that the full scale of online child grooming be acknowledged.
Snapchat already faced scrutiny earlier this month when the UK's media regulator estimated the app had "thousands" of underage users despite removing only a few dozen.
According to the NSPCC, over 150 different mobile apps were used to groom victims, but 26% of online grooming incidents took place on Snapchat alone.
NSPCC Chief Executive Sir Peter Wanless stated that the research highlights “the sheer scale of child abuse happening on social media and the human cost of fundamentally unsafe products.”
Whilst Wanless said the NSPCC was "pleased" that the UK government was tightening social media laws, he stated that the onus is "now on the tech firms, including those highlighted by these stark figures today, to make sure their current sites and future services do not put children at unacceptable risk of abuse."
A spokesperson for Snapchat told Verdict that the company was working on its grooming detection technology and in-app reporting of offensive or inappropriate behaviour.
They stated that over the last year Snapchat reported more than 500,000 accounts potentially sharing sexually exploitative content directly to the US National Center for Missing & Exploited Children, which co-operates with both US and international police departments.
Childnet, a UK-based charity that aims to make the internet safer for children, states that any child can be at risk of grooming and that groomers often attempt to befriend their victims using fake profiles.
Social media companies may therefore have difficulty assessing which users are at risk, as some online groomers mask their age to contact their victims.
When asked how the NSPCC's report might affect the UK government's finalisation of the Online Safety Bill, GlobalData principal analyst Laura Petrone said it will place "renewed pressure on tech companies to tackle illegal content more thoroughly."
She also believes the research will strengthen support for the Online Safety Bill among campaigners and MPs as it passes through parliament.
"Once it becomes law," Petrone explains, "it would allow Ofcom to require messaging apps to use the technology necessary to detect and eventually block illegal images."
This, she points out, leads to a grey area for social media companies and their use of encryption.
“Messaging platforms have been left unclear about how this bill will affect encryption,” she says, “and the general public hasn’t been reassured enough about how the bill can be implemented without weakening the security of these messaging services.”
The UK government will need to strike a balance between security and transparency in its future communications with social media companies.