Leaked Docs Show How Facebook Decides Which Types of Content Are Allowed

* Documents include memos, flowcharts, blueprints and more
* Reveals how Facebook handles hate speech, self-harm and other issues
* Rapid growth has made it difficult to manage so many accounts 

Ever wondered why some controversial Facebook videos are taken down while others remain? Leaked memos obtained by the Guardian last week may give us a little more insight. The documents reveal startling ways in which Facebook handles sensitive issues like hate speech, self-harm and even revenge porn.

With more than two billion monthly users, the world’s biggest social media site is finding it more difficult to keep up with all the content that’s being churned out. The documents show how Facebook allegedly reviews more than six million reports a week relating to fake accounts alone. Apparently, moderators are feeling constantly overwhelmed and only have about 10 seconds to make a decision when it comes to censoring content.

Because of this split-second decision-making policy, some themes may take precedence over others. Violent videos, for example, are not always removed — especially when they have the potential to raise public awareness. Awareness of what, however, is left to the individual moderator's judgment. Videos that involve self-harm are also less censored, as Facebook claims it doesn’t want to punish people in distress.

There also seems to be a discrepancy in how sexism is handled. Hate speech and misogynistic posts may only be censored when there’s sufficient evidence of a direct threat. However, any comment that threatens a U.S. president will quickly be taken down.

Still confused? In Facebook’s own words:

 “We aim to allow as much speech as possible but draw the line at content that could credibly cause real world harm. People commonly express disdain or disagreement by threatening or calling for violence in generally facetious and unserious ways.”