Google, Meta Step Up Efforts To Combat Russia’s Misinformation Campaign 

The international community has unequivocally condemned Russia’s invasion of Ukraine. The U.S., the European Union and several other countries have imposed stiff economic sanctions on the aggressor, but governments are not acting alone. Many of the world’s largest companies are also implementing measures to penalize Russian businesses. And now, both the public and the private sector are looking for ways to contain Russia’s misinformation campaign.

Last Friday, Meta (FB) announced it would bar Russian state media from running ads and monetizing content on its platforms. In response, the Russian government moved to “partially restrict” access to Facebook in the country, claiming the company had engaged in unlawful censorship. CNN reports that Russia ordered Meta to “stop the independent fact-checking and labelling” of four Russian outlets, but Meta refused. Google and YouTube followed Meta’s lead. In a statement to CNN Business on Sunday, Google said, “in response to the war in Ukraine, we are pausing Google monetization of Russian state-funded media across our platforms … We’re actively monitoring new developments and will take further steps if necessary.” Separately, YouTube announced that it will temporarily suspend the ability of several Russian channels — including state-sponsored RT — to monetize their content on the platform, while “significantly limiting” recommendations to those channels.

The role of social media platforms in controlling the spread of misinformation may become even more important now that the European Commission has banned Russian state media from reaching EU audiences. Over the weekend, European Commission President Ursula von der Leyen announced on Twitter that the Kremlin-backed RT, previously known as Russia Today, and Sputnik would be banned in the EU. “We will ban the Kremlin’s media machine in the EU. The state-owned Russia Today and Sputnik, and their subsidiaries, will no longer be able to spread their lies to justify Putin’s war,” she said, calling the move an effort “to ban their toxic and harmful disinformation in Europe.” Russia will most likely retaliate. As Politico documents, “[w]hen Germany banned RT Deutsch in early February, Moscow forced German media outlet Deutsche Welle to shut down its Russian operations.”

Big Tech’s response did not come out of goodwill alone, however. The companies faced criticism for continuing to allow Russian propaganda to spread on their platforms after the invasion started. With no clear end to the crisis in sight, the response of Russian citizens is key to de-escalating the violence. While few expected the invasion to take place, Russia successfully misinformed its citizens for many days with the help of Big Tech platforms. Regarding the platforms’ role as key gatekeepers, recent events show that content moderation is possible. While there are many gray areas in which content moderation is not clearly justified, false and misleading information can be labeled as such and removed. Psychological research shows that people tend to forget which sources of information led them to form an opinion on a particular topic.

Misinformation is a problem that regulators have sought to address for a long time, and not only in the case of autocratic governments. Misrepresentation — and the manipulation users can experience because of it — has been a salient problem in recent scandals such as Cambridge Analytica. More recently, as we have noted, Europe approved the Digital Services Act (DSA), which will hold Big Tech companies accountable for illegal content posted on their platforms. Big Tech will be required to put in place mechanisms to ensure such content is removed in a timely fashion. Even content considered legal but harmful would have to be quickly removed. In the U.S., Senators Richard Blumenthal (D., Conn.) and Marsha Blackburn (R., Tenn.) introduced bipartisan legislation aimed at holding social media platforms responsible for harm they cause to children.

Read More: US Lawmakers Propose Bill To Impose Platform Content Moderation 

See Also: Social Media Platforms Face a Difficult Situation to Deal with Misinformation

 

Sign up here for daily updates on the legal, policy and regulatory issues shaping the future of the connected economy.