Britain’s stringent online safety framework officially came into effect on Monday, placing a significant onus on social media giants such as Meta’s Facebook and ByteDance’s TikTok to combat criminal activities on their platforms and prioritize user safety by design. According to Reuters, this marks a pivotal step in the country’s efforts to create a safer digital environment.
The initiative is spearheaded by media regulator Ofcom, which has rolled out its first set of codes of practice aimed at addressing illegal harms, including child sexual abuse and content that promotes or assists suicide. Per Ofcom, social media companies have until March 16, 2025, to evaluate the risks posed by illegal content to both children and adults and to implement measures to mitigate those risks. These measures include improved content moderation, streamlined reporting mechanisms, and the incorporation of built-in safety checks.
Ofcom Chief Executive Melanie Dawes underscored the significance of this regulatory milestone, stating that the industry will be closely monitored to ensure compliance with the new safety standards. “We’ll be watching the industry closely to ensure firms match up to the strict safety standards set for them under our first codes and guidance, with further requirements to follow swiftly in the first half of next year,” Dawes noted.
The Online Safety Act, enacted last year, imposes tougher obligations on platforms such as Facebook, YouTube, and TikTok, with a particular focus on safeguarding children and removing illegal content. Under the new regulations, high-risk platforms must deploy automated tools, including hash-matching and URL detection, to identify and address child sexual abuse material effectively.
Failure to comply with these regulations could result in severe penalties. Ofcom has the authority to impose fines of up to £18 million ($22.3 million) or 10% of a company’s global annual revenue. The regulator can also seek court orders to block non-compliant platforms from operating within the UK.
UK Technology Secretary Peter Kyle emphasized the transformative nature of the new codes, describing them as a “material step change in online safety.” He expressed his support for Ofcom’s authority to enforce compliance, stating, “If platforms fail to step up, the regulator has my backing to use its full powers, including issuing fines and asking the courts to block access to sites.”
Source: Reuters