Facebook has announced that it is now “fact-checking” photos and videos in an effort to prevent hoaxes and false news stories from making their way to the site.
According to Reuters, the fact-checking began on Wednesday in France with the help of news organization AFP. The project will expand to more countries and partners, Tessa Lyons, a product manager at Facebook, said in a briefing.
The work is part of “efforts to fight false news around elections,” she said.
The decision is no doubt a response to the revelation that the personal data of 50 million Facebook users was harvested by the political consulting firm Cambridge Analytica and used to aid the Trump presidential campaign.
And last October, it was reported that as many as 126 million Americans — more than a third of the nation’s population — were exposed to content placed on Facebook by Russian sources during the 2016 elections.
In addition, manipulated photos and videos have been an ongoing problem — not just on Facebook, but on other social media sites as well.
Lyons did not say how Facebook or AFP was evaluating the photos and videos, or how much they would have to be altered to be deemed fake.
Shares of Facebook closed up 4.4 percent on Thursday, but remained down more than 13 percent from March 16, just before the data scandal came to light.
In the past, Facebook has used third-party fact-checkers to try to catch fake news, and has given stories flagged as false less prominence in the News Feed when people share them.
Now, though, the company is “proactively” searching for election-related disinformation so it can remove it more quickly, said Samidh Chakrabarti, another Facebook product manager.
And Alex Stamos, Facebook’s chief security officer, said the company was also focused on reducing “fake audiences” — the use of “tricks” to artificially inflate the perceived support for a particular message — as well as “false narratives,” including headlines and language that “exploit disagreements.”