Facebook's rules on graphic content

Posted May 22, 2017

According to the report, Facebook moderators complain about the volume of work, which often leaves them only 10 seconds to make a decision about potentially offensive content.

"We aim to disrupt potential real world harm caused from people inciting or coordinating harm to other people or property by requiring certain details to be present in order to consider the threat credible," says the document.

One of the documents says that Facebook did this based on advice from the Samaritans and Lifeline, anti-suicide nonprofits that operate helplines in the UK and US.

Over the past few years, the social network has faced heavy criticism for its permissive policies and the violent content posted on it. Photos of non-sexual physical abuse and bullying of children are allowed so long as there is no sadistic or celebratory element.

Its large user base of almost 2 billion also means that it is hard to find consensus on content guidelines.

Another example would be how Facebook deals with videos that might depict a violent death.

Reacting to the leak, Monika Bickert, Facebook's Head of Global Policy Management, said, "Keeping people on Facebook safe is the most important thing we do."

Sadism and celebration restrictions apply to all imagery of animal abuse.


For instance, the manual explains that the phrase "Someone shoot Trump" should be deleted by the company because it specifically targets a head of state.

All "handmade" art showing nudity and sexual activity is allowed but digitally made art showing sexual activity is not, the newspaper claimed.

With that said, videos of violent deaths are sometimes left untouched because they might raise awareness of issues such as mental illness.

Images that are posted, or videos that are livestreamed, containing such content can be removed, say the documents, unless the incident has "news value".

Amongst the hundreds of files reportedly seen by The Guardian are guidelines for dealing with self-harm that show how the company will allow users to livestream self-harm attempts because it "doesn't want to censor or punish people in distress who are attempting suicide".


It's a bit depressing how hard it is to tease out the rationale behind these guidelines: "I'm going to kill you" is not a credible threat because it's abstract, yet the very specific "unless you stop bitching I'll have to cut your tongue out" is still allowed to stand.

Zuckerberg's free content ad network, which continues to have a very strict policy about nudity on the site, is also dodging the publisher tag for a very expensive reason: if it were to edit and curate the posts on its site, the company would suddenly be exposed to libel laws.

Extreme cases of abuse are also allowed but must be marked "disturbing".

The Guardian published documents that it received describing Facebook's content filtering policy.

The rules appear to reflect the scars of legal and public relations battles Facebook and other social media platforms have fought over the last decade.
