By Alisha Haridasani
In making public its internal content guidelines, Facebook has outlined the prevailing standards that will effectively govern speech on the world's largest social media network.
“This is our way of clearly explaining publicly how we enforce these rules,” said Monika Bickert, Facebook's vice president of global policy management. “For the last couple of years, I’ve heard people asking ‘what do you mean by hate speech? What do you mean by harassment?’”
The guidelines, published online Tuesday, define hate speech as “a direct attack on people based on what we call protected characteristics,” such as race, religion, sexual orientation and gender identity. The guidelines also cover harassment, child nudity, bullying, and graphic violence.
In an interview Wednesday with Cheddar's Alex Heath, Bickert said that the social media platform doesn’t take its content decisions lightly, and users who disagree can now appeal directly.
The guidelines were developed with input from experts, Bickert said, and shaped by hard lessons from earlier content-removal decisions that prompted public backlash.
“We talk to literally hundreds of experts and organizations around the world to understand safety issues and how they’re changing and the trends that are happening so that we can adjust our policies to best protect people,” she said.
In addition to enforcing its own community standards, Facebook is often called on by governments to enforce local laws and censor speech that runs afoul of foreign authorities. In the past, the platform has blocked content in Thailand, where citizens are forbidden from criticizing the royal family, and in Germany, where laws governing hate speech go further than they do in the United States.
“Any time that we do restrict content in one country because the government has asked us to do so, we publish that in our government request report,” said Bickert. A tally of such requests is published online every six months.
Facebook has said that it kept its internal guidelines under wraps until now because it didn't want users to game the system.
"If people knew what they were using to evaluate content then they had an easier time figuring out 'alright well this is what I can get away with,'" said Axios's media reporter Sara Fischer.
Facebook changed course in a bid to regain public trust after the Cambridge Analytica data privacy scandal, evidence of groups trying to manipulate online discourse during the 2016 election, and previous decisions to remove content without sufficient explanation.
Facebook removed posts by two African-American sisters who had expressed their support for Donald Trump's candidacy in 2016. The decision to censor videos by the sisters, known online as Diamond and Silk, angered many conservatives who had already accused Facebook of liberal bias.
Earlier this month, Diamond and Silk received a message from Facebook that said their content was “unsafe to the community.” Days after that message was sent, Facebook's CEO Mark Zuckerberg testified before Congress that it was a mistake to take the sisters' posts down.
Facebook has also been criticized for not pulling down some content soon enough, especially on Facebook Live, where users have broadcast videos of killings.