Online platforms must start evaluating whether their services expose users to illegal content by 16 March 2025, or risk financial penalties as the UK’s Online Safety Act (OSA) comes into effect.

Ofcom, the UK regulator enforcing the law, has published its final codes of practice for handling illegal online content.

Platforms have three months to carry out risk assessments identifying how illegal content could appear on their services.

Failure to do so could result in fines of up to 10% of their global turnover. Ofcom has outlined more than 40 safety measures for platforms to implement starting in March 2025.

Ofcom head Dame Melanie Dawes said: “For too long, sites and apps have been unregulated, unaccountable and unwilling to prioritise people’s safety over profits. That changes from today.”

“The safety spotlight is now firmly on tech firms and it’s time for them to act. We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year.”


“Those that come up short can expect Ofcom to use the full extent of our enforcement powers against them.”

However, critics argue that the OSA does not adequately address a wide range of harms affecting children.

Molly Rose Foundation head Andy Burrows expressed disappointment over the lack of specific measures for dealing with suicide and self-harm material, the BBC reported.

He said: “Robust regulation remains the best way to tackle illegal content, but it simply isn’t acceptable for the regulator to take a gradualist approach to immediate threats to life.”

Under Ofcom’s codes, platforms must identify how illegal content might appear on their services and implement ways to prevent it from reaching users.

This includes content related to child sexual abuse material (CSAM), controlling behaviour, extreme sexual violence, and promoting or facilitating suicide and self-harm.

Ofcom began consulting on its illegal content codes and guidance in November 2023 and has since strengthened its guidance for tech firms.

This includes clarifying requirements to remove intimate image abuse content and guiding firms on identifying and removing material related to coerced sex work.

Child safety measures required under Ofcom’s codes include preventing social media platforms from suggesting children’s accounts as friend connections and warning children about the risks of sharing personal information.

Certain platforms must also use hash-matching technology to detect CSAM, a requirement now extended to smaller file hosting and storage sites.

Hash matching involves assigning a unique digital signature to media, which can be checked against known content databases, such as those containing CSAM.
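The matching step described above can be sketched in a few lines. This is a minimal illustration only: the hash values and function names are hypothetical, and real deployments use proprietary perceptual hashes (such as Microsoft’s PhotoDNA) that still match after images are resized or re-encoded, whereas a plain cryptographic hash like SHA-256 matches only byte-identical files.

```python
import hashlib

# Hypothetical database of signatures of known prohibited media.
# In practice this would be supplied by an organisation such as the
# Internet Watch Foundation, not hard-coded.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def media_hash(data: bytes) -> str:
    """Compute a digital signature for an uploaded file (SHA-256 here)."""
    return hashlib.sha256(data).hexdigest()

def is_known_content(data: bytes) -> bool:
    """Check an upload's signature against the known-content database."""
    return media_hash(data) in KNOWN_HASHES

print(is_known_content(b"example-known-image-bytes"))  # True
print(is_known_content(b"harmless-holiday-photo"))     # False
```

Because only hashes are compared, the platform never needs to retain or redistribute the prohibited material itself to detect re-uploads of it.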

Many large tech companies have already implemented safety measures for teenage users and controls for parental oversight.

Concerns have been raised about the OSA’s broad application to various online services, with campaigners warning about privacy implications related to platform age verification requirements.

Parents of children exposed to illegal or harmful content have previously criticised Ofcom for its slow progress.

The regulator’s illegal content codes require parliamentary approval before fully coming into force on 17 March 2025.

However, platforms are advised to prepare now, assuming the codes will pass, and must have measures in place to prevent access to outlawed material by this date.