Jacob Johanssen, Senior Lecturer in the Communication and Media Research Institute, has written a blog post for the Huffington Post on Facebook’s content moderation guidelines, which were leaked to The Guardian.

Jacob Johanssen first commented on the policy guidelines for Facebook content moderators, which were recently leaked to The Guardian: “They often seem arbitrary and inconclusive, and it is perhaps not surprising that Facebook was keen to keep them under wraps. For example, Facebook’s rulebook suggests allowing a user to post a comment such as “fuck off and die”, whereas “someone shoot Trump” should be deleted. Some photos of non-sexual physical abuse and bullying of children should only be deleted if there is a sadistic or celebratory element within them.”

Continuing, the blog dealt with a more in-depth question raised by Facebook’s guidelines: how such decisions are made by individual moderators, and how they decide which posts need to be censored. Jacob Johanssen said: “Facebook is a big data platform of content accumulation. It has 2 billion users worldwide, and an estimated 1.3m posts are shared every minute. One would think that this makes content moderation difficult, if not impossible. To address the sometimes fine line between free speech and hate speech, or content that is deemed problematic, Facebook has teams of moderators across the world who, as The Guardian notes, are employed by subcontractors and have to scan through hundreds of disturbing images in each shift.”

He added: “This approach is wrong. Facebook would do well to hire more moderators who are properly trained and can spend sufficient time on each case, rather than having to make decisions in a split second.”

Although Facebook argues that the platform can act as a container where users can freely express their frustrations and get things off their chests, Jacob Johanssen sees this view as limited: “This may be true to some extent. But when it comes to violent language that expresses the desire to kill someone else, the consequences may be very real, and the question is whether such statements should be moderated and in what ways.”

To conclude the blog, Jacob Johanssen noted that openly discussing and assessing its policies could be the right way forward for Facebook: “Expressions such as “most often” or “reasonable ground” expose the vagueness and opaqueness of Facebook’s moderation policies. Facebook now needs to release its policies in full and enter into a constructive dialogue about how, and whether, they should be changed. The same goes for its workers who moderate content and their working conditions.”

Read the full blog on the Huffington Post official website.

Press and media enquiries

Contact us on:

[email protected]