Facebook: Why Do You Leave Up Some Posts But Take Down Others?

By Monika Bickert, Vice President of Global Policy Management, Facebook

We want Facebook to be a safe place where people can freely discuss different points of view. That’s why we need clear rules of the road, and consistent enforcement of those rules. Those rules are called our Community Standards, and today, for the first time, we published the detailed guidelines that our content reviewers use to apply them to content on our site.

One of the most important things that we have to take into account is context. For example, we would allow someone to use a racial slur self-referentially as a form of empowerment, but if that same slur were used by someone else as an attack, we’d remove it. And to allow for open debate on the issues of the day, we make exceptions to our policies against bullying when it comes to public figures. This lets you criticize elected officials, celebrities, business leaders and other newsmakers in a way that we wouldn’t allow for a private individual.

Sometimes examples are the easiest way to explain why we make the decisions we do, so I’ll walk through several based on real posts. To avoid publishing pictures or posts that break our rules, I’ll only show content that’s allowed, while describing content that isn’t.

Hate Speech

Our Community Standards make an important distinction between targeting people and targeting particular beliefs or institutions.

We believe that people should be able to share their views and discuss controversial ideas on Facebook. That’s why our policies allow people to criticize, and even condemn, religious institutions or political parties. But we draw the line when the focus shifts to individuals or groups of people. We don’t allow direct attacks based on what we call protected characteristics: race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity, and serious disability or disease.

In this example, the image and comment suggest that same-sex marriage is a sin. Because the post targets the institution of same-sex marriage, not a specific person or group of people, we allow it. If the same “sin is sin” text were paired with a photo of a same-sex couple, it would be removed.

There’s no doubt that sexual orientation, political beliefs or religion are often at the core of one’s identity, and an attack on the group at large can feel acutely personal. But while we don’t always agree with the opinions expressed, we believe that banning debate around issues of identity and ideology would stifle speech and the potentially productive exchange of ideas.

Nudity

Nudity can be a powerful form of protest, so it does not violate our policy to post bare breasts in an image that clearly depicts an act of protest, as in the example here.

We allow bare breasts in other contexts as well, including images of breastfeeding or childbirth, and for health reasons, like breast cancer awareness.

But we also know that nude images are sometimes shared without the consent of the person depicted, even if the person originally consented to the taking of the photograph. We take that very seriously. Because age and consent are very difficult to determine, we strictly limit the types of nude images we allow.

Regulated Goods

Drugs are treated differently across countries and regions, and setting different policies based on the laws of every jurisdiction where we operate is simply not feasible. Instead, we’ve tried to take a common-sense approach that we think is most in line with what people find acceptable.

We don’t allow people to share photos or videos depicting the use of non-medical drugs, or to discuss their own use or encourage others to use them. And we don’t allow the sale, trade, or solicitation of any drugs — whether non-medical drugs, pharmaceutical drugs, or marijuana.

That said, we know that sharing one’s personal battle with addiction can be a powerful way to support one’s own recovery and other people’s. So we do allow discussion and depiction of non-medical drugs in the context of recovery testimonies, like in this example.

Sexual Exploitation

We generally don’t allow descriptions or depictions of non-consensual sexual touching on Facebook.

The #MeToo movement — and the harrowing stories of assault and harassment that were bravely shared by people around the world — led us to evaluate the need for some exceptions. This example is based on a real post in which a survivor recounted being sexually touched as a minor. Her story was widely shared. Given the context — and the intent to raise awareness and shine a light on an important issue — we decided to make an exception and allow this post as newsworthy content. We also allowed others to share the post in support of the survivor or to condemn her abuser. If the post was shared without context or with a neutral caption, we took it down. Needless to say, if the same story were recounted by the perpetrator, it would be removed.

Dangerous Individuals & Organizations

We ban terrorist groups and hate organizations from Facebook and take down any content that praises these groups, their actions, or their members. Because we want to encourage counterspeech and the flow of information, we allow posts that explicitly condemn such organizations or report on their activity — as is the case in this example featuring the ISIS flag.

If a post lacks clear condemnation or news value, we take it down. That means that if a person or Page opposes the actions of a terrorist or hate group but shares a photo of that group without context, our reviewers will remove it. We do this because it is difficult for us to tell the person’s intent, and because such posts can easily be shared by others to praise that group.

In addition to our team of content reviewers, we use artificial intelligence and machine learning to detect and remove terrorist content. And we partner with other technology companies to ensure that content found and removed from Facebook does not crop up elsewhere on the internet.
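The cross-company matching described here typically works by sharing digital fingerprints (hashes) of known violating images and videos rather than the files themselves. The sketch below is a deliberately simplified illustration of that idea, not Facebook’s actual system: it assumes a shared set of fingerprints and uses an exact cryptographic hash, whereas production systems rely on perceptual hashes that tolerate re-encoding and cropping. The names `SHARED_HASH_DATABASE` and `should_block_upload` are hypothetical.

```python
import hashlib

# Hypothetical stand-in for a hash database shared across participating
# companies, containing fingerprints of media already identified as
# terrorist content. In practice this would be populated from an
# industry hash-sharing program, not hard-coded.
SHARED_HASH_DATABASE: set[str] = set()


def fingerprint(media_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded file.

    SHA-256 is used here for simplicity. It only matches exact
    byte-for-byte copies, which is why real systems use perceptual
    hashing instead: a slightly re-compressed copy of the same video
    would produce a completely different SHA-256 digest.
    """
    return hashlib.sha256(media_bytes).hexdigest()


def should_block_upload(media_bytes: bytes) -> bool:
    """Return True if the upload matches known violating content."""
    return fingerprint(media_bytes) in SHARED_HASH_DATABASE
```

The value of this design is that companies can flag matching re-uploads of known content without ever exchanging the violating material itself, since only the fingerprints are shared.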