© 2024 Blaze Media LLC. All rights reserved.
Ever Wonder Why Something Gets Banned on Facebook? Here's the Answer

"Crushed head, limbs, etc. are OK as long as no insides are showing."

What does Facebook consider appropriate and inappropriate for posting? According to the Daily Mail, a recently leaked rulebook reveals the very specific guidelines followed by a third-party content-moderation firm working for the company.

Here are a few of the rules on Facebook's "Abuse Standards Violations" that would merit content being removed:

  • Any obvious sexual activity, even if naked parts are hidden from view by hands, clothing or other objects, including cartoons and art. Foreplay (kissing, groping, etc.) is allowed, even for same-sex couples (man-man/woman-woman).
  • People "using the bathroom."
  • Illegal drugs not shown in the context of medical, academic or scientific study. Depictions of marijuana appear to be OK unless they involve buying, selling or growing.
  • "Versus photos" or "Vs photos": photos comparing two people side-by-side.
  • Images of drunk and unconscious people, or sleeping people with things drawn on their faces.
  • Urine, vomit, feces, semen, pus and ear wax. (Cartoon feces, urine and spit are OK; real and cartoon snot are OK)
  • Violent speech (For example: I love hearing skulls crack).
  • Crushed head, limbs, etc. are OK as long as no insides are showing.

The Daily Mail goes on to report that Amine Derkaoui, a 21-year-old from Morocco, previously worked for oDesk -- the third-party company used by Facebook -- and said that this document is only part of a 17-page instruction book. Derkaoui says he was paid $1 per hour to moderate content for a company whose recent IPO filing valued it at $100 billion.

Derkaoui leaked the full document and gave an interview to Gawker last week as payback for the meager pay and for being cut from Facebook's moderator training program after missing an important test due to Ramadan. Here's more from Gawker's interviews with other content moderators:

The former moderators I spoke to were from countries in Asia, Africa and Central America. They were young and well-educated, which is unsurprising considering their proficiency in English. One is applying to graduate schools to study political science in the U.S. next year; another is currently in an engineering program. For them, Facebook content moderation was a way to make money on the side with a few four-hour shifts a week. Many had done other jobs through oDesk, such as working as a virtual assistant.

Like Derkaoui, most agreed that the pay sucked, while also acknowledging that it was typical of the sort of work available on oDesk. Derkaoui was the only one who cited money as a reason for quitting. The others seemed more affected by the hours they'd spent wading in the dark side of Facebook.

"Think like that there is a sewer channel," one moderator explained during a recent Skype chat, "and all of the mess/dirt/ waste/shit of the world flow towards you and you have to clean it."

Each moderator seemed to find a different genre of offensive content especially jarring. One was shaken by videos of animal abuse. ("A couple a day," he said.) For another, it was the racism: "You had KKK cropping up everywhere." Another complained of violent videos of "bad fights, a man beating another."

The Daily Mail says that some have voiced concern about third parties seeing some of their details, to which Facebook issued the following response:

"In an effort to quickly and efficiently process the millions of reports we receive every day, we have found it helpful to contract third parties to provide precursory classification of a small proportion of reported content.

"These contractors are subject to rigorous quality controls and we have implemented several layers of safeguards to protect the data of those using our service."

The Daily Mail points out that Facebook's publicly posted community guidelines are rather vague, stating only that "inappropriately graphic content" is banned.

Gawker reported that Facebook has since updated its document to now approve images of bodily fluids.
