This slide reveals Facebook’s cringeworthy hate speech policies
How does Facebook decide what is and is not hate speech?
According to a new report from ProPublica, part of the answer involves training slides that feature the Backstreet Boys.
The image below comes from the organization’s story that features a deep dive into Facebook’s moderation efforts, complete with never-before-seen documents that detail how moderators are trained.
One of the most cringeworthy parts shows a slide asking, "Which of the below subsets do we protect?" The options listed are female drivers, black children, and white men. The section for white men shows a picture of the Backstreet Boys.
The answer? White men.
"At the absolute minimum, this seems like an absolutely atrocious way to train your staff."
— Alex Hern (@alexhern) June 28, 2017
The policies, according to ProPublica, allowed moderators to delete hate speech against white men because white men fell under a so-called "protected category," while the other two examples in the above slide were "subset categories," and therefore attacks on them were allowed.
The revelation will do little to change the narrative around how Facebook handles hate speech—or, more specifically, how it has failed to effectively protect people on the platform who fall into a "subset."
Facebook’s moderation policies form a complicated system. The "protected categories" are based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation, and serious disability/disease, according to ProPublica. Black children don’t count as a protected category because Facebook does not protect age, and female drivers don’t count because Facebook does not protect occupation.
According to Facebook, its policies aren’t perfect.
“The policies do not always lead to perfect outcomes,” Monika Bickert, head of global policy management at Facebook, told ProPublica. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.”
That echoes the explanation Facebook put forth in a blog post Tuesday as part of its "Hard Questions" series.
The good news is that Facebook is trying to improve. That includes being more transparent about its practices—though only after reports like ProPublica’s and The Guardian’s recent "Facebook Files" series.
At least Facebook has come a long way. ProPublica revealed that back in 2008, when the social network was four years old, Facebook’s censorship rulebook was only a single page long, capped by one glaring catch-all rule:
"At the bottom of the page it said, ‘Take down anything else that makes you feel uncomfortable,’" Dave Willner, who joined Facebook’s content team in 2008, told ProPublica.
Willner then worked to create a 15,000-word rulebook, which is still partly in use at the company today. And yet there remain many problematic areas in how the network polices itself. The Guardian’s Facebook Files revealed numerous issues, such as how Facebook handles bullying on the site, along with other gray areas.
Facebook did not immediately respond to a request for comment.