The German newspaper Süddeutsche Zeitung has obtained what it claims are internal documents used to guide content moderation on Facebook, the world's most popular social media network. The documents reveal what Facebook deems hate speech and other offensive content — something the company has long been reluctant to disclose.
In Europe, authorities have long urged Facebook to be more proactive and quicker in removing xenophobic and racist remarks aimed at migrants. Germany in particular has argued that Facebook must curtail racist comments and hate speech at a time when anti-migrant sentiment is on the rise.
According to the newspaper, Facebook strictly prohibits content that targets a person based on characteristics such as race, national origin, religion, or sexual orientation — factors the company defines as a “protected category.” Certain groups receive additional safeguards, including minors and senior citizens.
The line for hate speech against migrants, however, remains blurry. According to Süddeutsche Zeitung:
For instance, saying “fucking Muslims” is not allowed, as religious affiliation is a protected category. However, the sentence “fucking migrants” is allowed, as migrants are only a “quasi protected category” – a special form that was introduced after complaints were made in Germany. This rule states that promoting hate against migrants is allowed under certain circumstances: statements such as “migrants are dirty” are allowed, while “migrants are dirt” isn’t.
Facebook has outsourced its content moderation in Germany to a company called Arvato. The quality of that moderation, however, appears to have faltered. Employees at the bottom of the chain are reportedly expected to review and judge 2,000 posts a day, and it's no better for those higher up, who have around eight seconds to decide whether a video should remain online or be removed.
One employee told SZ: “I’ve seen things that made me seriously question my faith in humanity.”