Leaked documents show Facebook's guidelines on hate speech are a muddled mess

As Facebook faces intense scrutiny from high-ranking German officials over how it handles hate speech, a German newspaper has published what it reports are internal documents laying out a detailed yet confusing blueprint for how the platform polices such content on its site.

The documents were obtained and published by the German newspaper Süddeutsche Zeitung, which originated the reporting on the infamous "Panama Papers."

The report was published last week, just as German politicians were threatening to create a new law that would punish platforms like Facebook for failing to properly delete hate speech and fake news. The new law would come on top of the country's existing laws restricting online speech, which are already far stricter than those in the United States.

The confusion over labeling hate speech

Most notable in the internal documents, especially in light of the current ideological tug-of-war between Germany and the social media platform, are the "protected" and "non-protected" categories, the means by which Facebook determines which specific groups should be protected from hate speech.

Those categories include: sex, religious affiliation, national origin, gender identity, race, ethnicity, sexual orientation, and disability or serious illness. There are additional sub-categories, too, like age, political affiliation, and appearance.

And this is where it gets confusing, especially when "protected categories" and "non-protected categories" are combined and certain categories take precedence, producing some conflicting rulings.

In one example, it's pointed out that a post insulting "Irish women" is not permissible because both nationality and gender, two "protected categories," are at play. But something negative posted about "Irish teens" would be permitted, because "teens" falls under age, a non-protected category that seemingly cancels out the nationality protection.

In another, equally confusing example, according to an excerpt from the document, posting "migrants are filthy cockroaches that will infect our country" or "migrants are scum" violates the site's policy, but posting "fucking migrants" or "migrants are so filthy" does not.
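Taken together, the examples read like a blunt subset check: a targeted group is shielded only if every attribute describing it belongs to a protected category, and a single non-protected attribute strips the protection away. The sketch below is a hypothetical reconstruction of that logic based solely on the examples in SZ's reporting; the set names, function, and "cancellation" behavior are assumptions, not Facebook's actual moderation system.

```python
# Hypothetical sketch of the category-combination rule described in the leaked
# documents, reconstructed only from the "Irish women" / "Irish teens" examples.
# The names and logic are assumptions, not Facebook's real moderation code.

# Categories the documents reportedly treat as protected.
PROTECTED = {
    "sex", "religious affiliation", "national origin", "gender identity",
    "race", "ethnicity", "sexual orientation", "disability or serious illness",
}

# Sub-categories reportedly not protected on their own.
NON_PROTECTED = {"age", "political affiliation", "appearance"}


def group_is_protected(attributes):
    """Treat a group as protected only if every attribute describing it is a
    protected category; one non-protected attribute cancels the protection,
    per the examples in SZ's reporting."""
    return bool(attributes) and attributes <= PROTECTED


# "Irish women" combines national origin and sex -> attacks can be removed.
print(group_is_protected({"national origin", "sex"}))  # True
# "Irish teens" combines national origin and age -> attacks are permitted.
print(group_is_protected({"national origin", "age"}))  # False
```

How a "quasi-protected" class such as migrants is supposed to fit into that scheme is exactly where the examples above break down.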

According to SZ, "migrants" were classified as a "quasi-protected category" in Germany, and yet the above examples show a clear inconsistency in how the rules are applied.

Keepers of the policy

SZ also spoke with employees at Arvato, an outside company that helps Facebook track such violations and moderate the site in Germany. More than 600 people work there on Facebook's behalf, and they are finding it increasingly difficult to enforce the company's policies.

Workers complained of "unclear" guidelines that change often, of the heavy workload (those lowest in the chain of command are told to moderate 2,000 posts a day) and of low pay, just above minimum wage (in Germany, between US$9 and $10 an hour).

There's also an emotional toll, as workers have to view videos that range from child pornography to terrorist beheadings and other violent images. One employee told SZ, "I've seen things that made me seriously question my faith in humanity. Things like torture and bestiality."

A screenshot of the report by Süddeutsche Zeitung.

Mashable has reached out to Arvato for comment on the document and the employees' statements.

Where does Facebook go from here?

It's been a tough year for Facebook, particularly the last few months, and the ongoing saga in Germany only shines a brighter spotlight on its failings.

While those 600-plus workers toil away, moderating posts based on seemingly contradictory guidelines, Facebook fired the human editors who maintained its popular "trending topics" section, a move many say led to the proliferation of fake news on the platform that affected the outcome of the U.S. presidential election (an accusation Facebook CEO Mark Zuckerberg originally denied).

The platform has taken some steps to remedy the situation, such as searching for a "Head of News Partnerships" to bring more newsroom experience to the company. The company has also banned fake news sites from its ad network and Zuckerberg himself has backtracked on his initial denial, vowing to fight the spread of fake news on the site.

In a statement via email, a Facebook spokesperson told Mashable, "Facebook is no place for the dissemination of hate speech, racism or calls for violence. We evaluate reported content seriously. And as we learn from experts, we continue to refine the way we implement our policies to keep our community safe, especially for people that may be vulnerable or under attack."

But if SZ's reports are accurate, the company is taking one step forward and two steps back. Even where humans are reportedly involved in moderating the site, it's hard to see that approach working well given the conditions those moderators face and the shifting, conflicting guidelines they are asked to apply.

