
Facebook is Still Struggling When It Comes to Hate Speech

The German newspaper Süddeutsche Zeitung has published a report, citing internal documents, that reveals how Facebook handles hate speech and fake news.

The report was published last week, as German politicians threatened to pass laws against social networks that refuse to permanently delete hate speech or fake news from their platforms.

According to the internal documents, Facebook sorts content into two categories, a “protected category” and a “non-protected category,” which determine which specific groups should be protected from hate speech and which should not.

The protected categories include sex, religious affiliation, national origin, gender identity, race, ethnicity, sexual orientation, disability or serious illness, age, political affiliation and appearance.

Inconsistency in the Rules

Facebook has a problem when it comes to dealing with hate speech, and here’s why.

Given the categories above, Facebook faces a conflict when it encounters content on its network that falls into both protected and non-protected categories.

Let’s say a post contains hate speech against “Pakistani women.” That would fall into the protected category, since it involves both gender and race. But if the same hate speech targeted “Pakistani teens” instead, it would not be protected, since “teens” falls into a non-protected category.

Another example from the German paper makes the point even clearer.

Even though “migrants” falls into a protected category, posting “migrants are scum” would violate the site’s policy, while posting “migrants are filthy” would not.

These two examples illustrate how inconsistently the social media platform applies its rules.

Workers Under Great Pressure

Around 600 people work for Facebook handling this kind of content moderation.

Many of them have complained that the guidelines are almost always unclear and that their workload is heavy. They say they are paid just a little above minimum wage to moderate more than 2,000 posts each day.

They also say the work takes a toll on their emotional well-being, since they often have to view things like child pornography, terrorist beheadings and other violent images. One of the employees said:

“I’ve seen things that made me seriously question my faith in humanity. Things like torture and bestiality.”

Via Mashable
