The world’s biggest social media companies were grilled by the US Senate Judiciary Committee on Wednesday (31 January) over concerns about child safety online, leading experts to question whether the UK’s new Online Safety Bill (OSB) goes far enough in protecting children.
The US hearing, titled ‘Big Tech and the Online Child Sexual Exploitation Crisis,’ saw members of Congress question the CEOs of TikTok, Discord, Meta, X and Snap on what they are doing to keep children protected on their platforms.
The hearing mainly focused on Big Tech’s attitude to the online safety legislation that is currently passing through Congress.
Multiple lawmakers have called for the law to be changed so that tech companies are no longer shielded from being sued. US Senator Lindsey Graham said the government would “die waiting” for companies to sort their issues out themselves.
US Senator Amy Klobuchar added that nothing will change “unless we open the courtroom doors”.
Most of the companies present expressed support for some form of federal regulation on safety, but they were divided on what that should look like.
X CEO Linda Yaccarino said the social media platform supported the Strengthening Transparency and Obligations to Protect Children Suffering from Abuse and Mistreatment Act (STOP CSAM Act), a bill which would allow victims of online sexual exploitation to sue social media platforms.
Snap showed its support for the Kids Online Safety Act, which requires all online platforms to enforce stricter measures on harmful and inappropriate content being shown to children.
The hearing gave the tech bosses a chance to share how they had improved safety on their platforms, and what they plan for the future. Yaccarino claimed X had suspended 12.4 million accounts in 2023.
TikTok CEO Shou Zi Chew told the committee that the video giant will spend $2bn on safety efforts for US users over the year.
Does the UK’s OSB go far enough?
The OSB, a new UK law aimed at ensuring social media companies are held responsible for user safety, was passed in September 2023 after almost six years in the making.
The new law means social media companies will need to work harder to protect children from inappropriate content, as well as remove all illegal content.
Robert Prigge, CEO of AI-powered identity verification platform Jumio, told Verdict that it is clear the OSB “does not go far enough”.
“While it’s encouraging to see that the legislation has teeth, including criminal sanctions and significant fines for those who fail to prevent underage children from creating social media accounts and accessing inappropriate content, there remains more to be done on a fundamental level to ensure the Act fulfills its purpose of making the internet safer,” Prigge told Verdict.
The OSB, which is almost 300 pages long, includes a long list of rules for social media and online platforms to adhere to. For example, pornographic websites will now need to check the age of users entering the site.
“This new law means that social media companies will have to disclose what technology they are using, if any, and how they are enforcing their age restrictions,” Prigge said.
“In the [UK OSB’s] current form, it is unclear as to what this entails, and some platforms may exploit this ambiguity to maintain their substandard processes,” he added.
In the US hearing on Wednesday, Meta CEO Mark Zuckerberg suggested that lawmakers require app stores, rather than social media platforms, to be the place where IDs are verified.
Research from Jumio found that 73% of global consumers believe robust identity verification would help prevent underage access to social media, according to Prigge.
“It’s clear that the Act does not go far enough,” Prigge said. “At the very least, it needs a more detailed definition of acceptable identity verification methods to enable platforms to adopt effective risk-based practices.”
Michael Queenan, CEO and co-founder of Nephos Technologies, agreed that all social media platforms in the UK should have formal age authentication, similar to financial services applications.

“All social media platforms should be putting proper Know Your Customer (KYC) checks in place,” Queenan told Verdict. “Email addresses should not be accepted as proof of age when they are so easy to manipulate.”
“To acknowledge KYC and act responsibly, however, would essentially crush the business model and eradicate many key users of this platform, so they will not do it unless it is mandated,” he added.