New UK Provision to Hold Big Tech Responsible for Paid Ads From Fraudsters

Social media platforms and other Big Tech companies are facing new regulatory headwinds in online advertising, this time from the United Kingdom.

The U.K. introduced new provisions to the Online Safety Bill on March 8 that will require the largest and most popular social media platforms, such as Meta, TikTok and Twitter, as well as search engines like Google, to prevent paid fraudulent adverts from appearing on their platforms.

“We want to protect people from online scams and have heard the calls to strengthen our new internet safety laws. These changes to the upcoming Online Safety bill will help stop fraudsters conning people out of their hard-earned cash using fake online adverts,” said U.K. Culture Secretary Nadine Dorries in a statement.

Under the current draft of the Online Safety Bill, Big Tech already has an obligation to protect users from fraud committed by other users. The new duty added to the bill will put the focus on fraudulent paid advertisements, whether they are controlled by the platform itself or by an advertising intermediary. This includes ads containing unlicensed financial promotions, ads in which fraudsters impersonate legitimate businesses and ads for fake companies.

The burden will be placed on the companies, which will need to put systems and processes in place to prevent the publication, or hosting, of this type of advertising and to remove it as soon as they are made aware of it. The details of how this obligation will be implemented haven’t been determined yet, but it may require significant effort from the companies to avoid running afoul of the law.

Some of the measures suggested by the government include scanning for scam advertising before it is uploaded, checking the identity of those who wish to publish ads and ensuring that financial promotions are made only by firms authorized by the Financial Conduct Authority (FCA).

It is also possible that, to facilitate a quick implementation of some of these new obligations, new requirements could be added to the code of conduct that the new Digital Markets Unit is drafting. The code, a document containing best practices, will be used to tackle some of the most urgent issues surrounding digital platforms.

The second announcement by the U.K. government to tighten its grip on online advertising is the launch of a public consultation on its Online Advertising Programme. The placement of online advertising in the U.K. is overseen by the Advertising Standards Authority (ASA), which lacks enforcement powers and operates under a system of self-regulation. Given the increasing number of scams and promotions of fraudulent products, the program is examining potential changes to the existing regulation and how to properly fund regulators to combat harmful advertising.

One notable point is that, irrespective of the regulatory approach ultimately adopted, responsibility would fall not only on platforms such as Meta, TikTok or Twitter, but also on intermediaries in the online ad chain, such as Google, TheTradeDesk and AppNexus.

The three options foreseen in the consultation are to continue with a self-regulatory approach with a few new requirements for intermediaries, publishers and platforms; to introduce a statutory regulator to backstop the self-regulatory approach, which would intervene only when the sanctions available to the ASA don’t go far enough to ensure compliance; and to create a new regulator with more powers.

The government didn’t explicitly say which option it would choose, especially because this is part of an open consultation. However, the report suggests that the government may prefer a new regulator with strong enforcement powers. Of the third option, the document reads, “this is likely to be the most effective approach for increasing accountability in addressing illegal harms like fraudulent advertising (and as will be aligned with the new measures in the forthcoming Online Safety Bill), as criminal enforcement powers would likely be necessary for some of the measures required to address such activity.”