Dottie Lux is an event producer and the creator of Red Hots Burlesque, a queer burlesque and cabaret; she is also a co-owner of San Francisco's Legacy Business The Stud Bar. Lil Miss Hot Mess is a PhD student in media studies at NYU by day and a drag queen by night. Both are organizers with the #MyNameIs campaign.

While these words are still too often shouted as slurs, they're also frequently "reclaimed" by queer and transgender people as a means of self-expression. However, Facebook's algorithmic and human reviewers seem unable to accurately parse the context and intent of their usage. Whether intentional or not, these moderation fails constitute a form of censorship. And just like Facebook's dangerous and discriminatory real names policy, these examples demonstrate how the company's own practices often amplify harassment and cause real harm to marginalized groups like LGBTQ people, communities of color, and domestic violence survivors, especially when used as a form of bullying to silence other users for their identities or political activities.

However, as with its real names policy, while Facebook's intentions may be noble, its algorithms and human-review teams still make too many mistakes. The company is also increasingly under pressure from users, groups, and now governments to improve its procedures: Germany just passed legislation requiring social media companies to remove hate speech. We've identified four interrelated problems.

First, Facebook's leadership doesn't seem to understand the nuances of diverse identities. As leaked documents recently published by ProPublica indicate, its policies aim to prevent harassment of users based on "protected categories" like race, gender, and sexual orientation; however, by making exceptions for subsets of protected groups, the company's protocols paradoxically "protect white men from hate speech but not black children," as ProPublica reported. Such a color-blind and non-intersectional approach fails to acknowledge the ways in which different groups face different forms of discrimination. (It is also not too surprising that Facebook ultimately protects white men, given its employee demographics.)

Second, Facebook's approach to most issues, including authentic names or hate speech, is to create one-size-fits-all policies that it claims will work for the majority of users. However, given that Facebook's user base just topped 2 billion, even something that affects 1 percent of users still affects 20 million people. Moreover, it appears that Facebook's own policies aren't applied consistently. Sometimes Facebook formally implements exceptions to rules, and then the risk of reviewers' own opinions or biases interfering is huge.

Third, Facebook does not share the details of its enforcement guidelines, release data on the prevalence of hate speech, or give users an opportunity to appeal decisions and receive individualized support. With this lack of transparency and accountability, the company plays judge, jury, and executioner with its patchwork of policies, leaving many users stuck in automated customer service loops.