Facebook Unveils Once-Secret Content Policing Guide

Facebook is giving more detail on how and why it chooses to delete content from its network.

The world’s largest social media company published its “community standards” guidebook on its website yesterday, detailing how the network’s 7,500 moderators decide which text, pictures and videos are removed. Facebook released the previously employee-only guide to be more transparent about how it decides what’s acceptable and to gather feedback that should help the company improve its content decisions, Monika Bickert, Facebook’s vice president of Global Policy Management, said in a company release.

“We will be sharing these updates publicly and will be releasing a searchable archive so that people can track changes over time,” Bickert said in the release.

A Facebook management group meets every two weeks to discuss content policy, according to a Reuters report. The news agency was allowed to observe a recent meeting led by Bickert on the condition that it not report on specific discussions. About two dozen employees, plus colleagues videoconferencing in from abroad, discussed changes and updates to the company’s policies for about an hour, referencing input they had received from civil rights groups and other outside parties. The company told Reuters it will hold public meetings on the content policy in May and June.

In addition to the expanded guidelines, users will now be able to appeal the removal of individual pieces of content. Until now, only the removal of accounts, Facebook Groups and the Pages that businesses use for marketing could be appealed.

Facebook previously published only a brief, generally worded standards guide for end users, which drew criticism that the company’s enforcement was inconsistent or politically motivated, according to the Reuters report.

The new procedures are separate from Facebook’s approach to content deemed objectionable by government entities. In those cases, government officials must submit a written removal request, which is then reviewed by Facebook’s attorneys. Content that meets Facebook’s standards but is forbidden under local law is blocked for users in that country while remaining available elsewhere; Thailand’s prohibition on disparaging its royal family is one example, Reuters said.