As Facebook faces intense scrutiny from high-ranking German officials over its handling of hate speech, a German newspaper has published what it reports are internal documents laying out a detailed yet confusing blueprint for how Facebook polices hate speech on its site.
The report was published last week, just as German politicians were threatening to create a new law that would punish platforms like Facebook for failing to properly delete hate speech and fake news. The new law would come on top of the country's existing speech laws, which are already far stricter than those in the United States.
The confusion with labeling hate speech
Most notable in the internal documents, especially in light of the current ideological tug-of-war between Germany and the social media platform, are the "protected category" and "non-protected category" designations, the means by which Facebook determines which specific groups should be protected from hate speech.
Those categories include: sex, religious affiliation, national origin, gender identity, race, ethnicity, sexual orientation, and disability or serious illness. There are additional sub-categories, too, like age, political affiliation, and appearance.
And this is where it gets confusing, especially when protected and non-protected categories are combined and one takes precedence over the other, producing some contradictory results.
In one example, the documents note that a post insulting "Irish women" is not permissible because two protected categories, nationality and gender, are at play. But something negative posted about "Irish teens" would be permitted, because "teens" is a non-protected category, seemingly canceling out the nationality protection.
In another, equally confusing example drawn from the document, posting "migrants are filthy cockroaches that will infect our country" or "migrants are scum" violates the site's policy, but posting "fucking migrants" or "migrants are so filthy" does not.
According to SZ (Süddeutsche Zeitung, the newspaper that obtained the documents), "migrants" are classified as a "quasi-protected category" in Germany, and yet the above examples show a clear inconsistency in how the rules are applied.
Keepers of the policy
SZ also spoke with employees at Arvato, an outside company that helps Facebook track such violations and moderate the site in Germany. In all, more than 600 people work there on behalf of Facebook, and they are finding it increasingly difficult to enforce the company's policies.
Workers complained of the "unclear" guidelines that change often and of the heavy workload — workers lowest on the chain of command are told to moderate 2,000 posts a day — at low pay, just above minimum wage (in Germany, between US$9 and $10 an hour).
There's also an emotional toll, as workers have to view videos that range from child pornography to terrorist beheadings and other violent images. One employee told SZ, "I've seen things that made me seriously question my faith in humanity. Things like torture and bestiality."
Mashable has reached out to Arvato for comment regarding the document and the employees' statements.
Where does Facebook go from here?
It's been a tough year for Facebook, particularly the last few months, and the ongoing saga in Germany only shines a brighter spotlight on its failings.
While those 600-plus workers toil away, moderating posts based on seemingly contradictory guidelines, Facebook fired the human editors who maintained its popular "trending topics" section, a move many say led to the proliferation of fake news on the platform that affected the outcome of the U.S. presidential election (Facebook CEO Mark Zuckerberg originally denied this accusation).
The platform has taken some steps to remedy the situation, such as searching for a "Head of News Partnerships" to bring more newsroom experience to the company. The company has also banned fake news sites from its ad network and Zuckerberg himself has backtracked on his initial denial, vowing to fight the spread of fake news on the site.
In a statement via email, a Facebook spokesperson told Mashable, "Facebook is no place for the dissemination of hate speech, racism or calls for violence. We evaluate reported content seriously. And as we learn from experts, we continue to refine the way we implement our policies to keep our community safe, especially for people that may be vulnerable or under attack."
But if SZ's reports are accurate, the company has taken one step forward and two steps back. Even where humans are reportedly involved in moderating the site, it's hard to see that plan succeeding given the working conditions and the shifting, conflicting guidelines.