New Facebook Rules Show How Hard It Is to Police 1.4B Users

Facebook has released an updated version of its community guidelines, which spell out in unprecedented detail what constitutes unacceptable behavior on the platform.

Once upon a time, governing the Facebook community was relatively simple, because its users, mostly American college students, shared at least some cultural context for what was and wasn’t acceptable. But now Facebook’s 1.39 billion users span a wide range of ages, ethnicities, religions, gender identities, and nationalities, and Facebook’s ability to create a space that meets everyone’s definition of “safe” has increasingly been called into question.

Which is why, today, Facebook updated its community guidelines, spelling out in unprecedented detail what constitutes unacceptable behavior. Yet the unwieldy specificity of the new guidelines only proves that Facebook’s policies and procedures surrounding user activity will never be a finished product. As the world’s largest social network, Facebook can certainly learn a lot from the past, but it can never fully anticipate the future.

"It’s a challenge to maintain one set of standards that meets the needs of a diverse global community," Facebook executives wrote in a news release announcing the update. “For one thing, people from different backgrounds may have different ideas about what’s appropriate to share---a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standards.”

The new guidelines address everything from hate speech to nudity, making it clear that revenge porn, graphic images that glorify violence, and posts that threaten self-harm or harm to others are explicitly prohibited. As always, anyone who violates these rules runs the risk of having their posts, or even their accounts, blocked. According to the post, the updates aren’t so much tweaks to the guidelines as fully formed explanations of how Facebook has historically assessed controversial content. For instance, though Facebook has always had a policy against nudity, the new guidelines get very specific about what type of nudity that rule refers to.

“We remove photographs of people displaying genitals or focusing in on fully exposed buttocks,” the new policy reads. “We also restrict some images of female breasts if they include the nipple, but we always allow photos of women actively engaged in breastfeeding or showing breasts with post-mastectomy scarring.”

This and other lines in the community guidelines reveal just how reactive many of these rules are. Facebook’s anti-nudity policies have, in the past, provoked the ire of breast cancer survivors whose post-surgery photos were repeatedly removed. The banning of breastfeeding photos has been similarly controversial.

These impassioned responses have essentially forced Facebook to add nuance to its blanket policies. “Our policies can sometimes be more blunt than we would like and restrict content shared for legitimate purposes,” Facebook’s updated nudity policy reads. “We are always working to get better at evaluating this content and enforcing our standards.”

Facebook’s new guidelines also address last year’s “real name” scandal, in which the company drew fire from the drag queen community for its policy requiring users to set up accounts with their real names. After two weeks of fighting, Facebook rethought its policy, and Chris Cox, the company’s chief product officer, wrote a lengthy apology on Facebook. “We’re going to fix the way this policy gets handled so everyone affected here can go back to using Facebook as you were,” he wrote.

Now, Facebook says users are free to sign up with their “authentic identities,” or the names they go by on a daily basis. And yet, just as these rules are adjustments to past policies, they, too, will likely be adjusted down the line. The fact is, keeping social media “safe” for all users, with their many definitions of what that word means, is an unachievable task, and Facebook is not alone in chasing it. Twitter has faced similar dissent from its community over the policing of harassment on its platform, leading CEO Dick Costolo to pledge recently to begin kicking trolls off Twitter “left and right, and making sure that when they issue their ridiculous attacks, nobody hears them.”

But for Facebook, Twitter, and indeed all social networks, the central challenge of this mission is that they cannot solve it the way they solve other problems, which is to say, with technology. Today, Facebook still relies on people to report a violation, and on other people within Facebook to determine whether that piece of content truly does violate its guidelines. That’s because no algorithm, however expertly crafted, can substitute for human sensitivity and judgment. But with billions of pieces of content created every day from all corners of the globe, no set of rules can adequately address every little judgment call these human moderators are forced to make. And so, the guidelines will always be a work in progress, forever begging to be rewritten.