Ofcom to Push For Better Age Verification, Filters and 40 Other Checks in New Online Child Safety Code

May 8, 2024

By: Natasha Lomas (TechCrunch)

The UK’s communications regulator, Ofcom, is tightening the reins on Instagram, YouTube, and more than 150,000 other web services to improve child safety online. A newly proposed Children’s Safety Code would compel tech companies to implement stronger age verification, content filtering, and downranking, among roughly 40 other measures.

These measures target harmful content related to suicide, self-harm, and pornography, with the aim of restricting access for users under 18. The Code is currently a draft, open for feedback until July 17th; enforcement is slated to begin next year, once Ofcom publishes the final version in the spring. Companies will then have three months to complete their initial child safety risk assessments.

The Code’s significance lies in its potential to force a major shift in how internet companies approach online safety. The government has repeatedly said it wants the UK to be the safest place in the world to be online. Whether the legislation will actually keep harmful digital content away from children, much as sewage is kept out of waterways, remains to be seen. Critics argue it could saddle tech firms with substantial compliance costs and restrict citizens’ access to certain information.

At the same time, non-compliance with the Online Safety Act carries severe repercussions for web services of all sizes that fall within its scope. Violations can draw fines of up to 10% of global annual turnover, and senior managers can face criminal liability in certain circumstances.