Editor's note: Soraya Chemaly is a media critic and feminist activist. Her work focuses on the role of gender in culture, with an emphasis on sexualized violence. She blogs at Feminism is Fantastic.
(CNN) -- Last fall, after I wrote an article about misogyny found on Facebook, people began to send me links to content that they had tried and failed to have removed by the site. Among these was a seven-minute video depicting a gang-rape of a girl by the side of the road and illustrations of an assault posted on Facebook by a man whose victim feared for her life.
I began looking more deeply into the subject and more people contacted me. I came across "humor" pages with names like "Raping Your Girlfriend," and text and images of popular rape memes depicting about-to-be-raped, incapacitated girls. There were easily accessed pictures and videos of girls and women frightened, humiliated, bruised, beaten, raped, gang raped, bathed in blood, and, in a recently publicized case, beheaded.
In one instance, Facebook declined to remove an image of a woman, her mouth covered in tape, with a caption that read, "Don't tap her and rap her. Tape her and rape her." The photo went viral. Facebook's response to readers who reported it read, "We reviewed the photo you submitted, but found it did not violate our community standards."
This defied reason and Facebook's own terms, which prohibit posts that "attack others based on their race, ethnicity, national origin, religion, sex, gender, sexual orientation, disability or medical condition."
In a widely reported move this week, Facebook, to its credit, changed its response and is improving its efforts to crack down on such content. I'm recounting here what preceded this decision, because it is instructive both about our culture and about what it takes to change it, with this media giant and potentially other sites.
Over the past six months, I communicated with Facebook employees to understand their process for determining which content met their criteria and on what those judgments were based. At each stage, they were open and clear about their policies, their guidelines and their commitment to balancing user safety with freedom of expression.
Facebook in fact had in place the formal language of a reasonable content policy geared toward ensuring users' safety, but it was not implementing it effectively. This failure disproportionately affected girls and women.
The only conclusion could be that individual moderators were interpreting the policies in ways that were gender biased. How? When 16-year-old boys in Chicago feel comfortable sharing their rape of a 12-year-old girl for status points, when a group of men in Indonesia film their gang-rape of a young woman by the side of the road and post it for hundreds of people to "like," and when content reviewers daily consider these and related images depicting women scared and cowering, bruised and beaten, to be acceptable, it's not a "glitch" in the system. It is the system.
These images are not created by Facebook. But they regularly pass through Facebook's content review process, which is run by people -- people whose sexist norms reflect the larger culture and who govern what is "appropriate."
Last Tuesday, I joined Jaclyn Friedman, executive director of Women, Action & the Media, and Laura Bates, founder of The Everyday Sexism Project, to launch a global campaign to confront institutionalized sexism in media. We wrote an open letter to Facebook, co-signed by more than 100 organizations, asking the company to take concrete steps to better understand gender-based hate speech on its site and to train people to recognize violence against women as hateful.
We encouraged Facebook users to send messages to its advertisers, urging them to boycott the social media network until it addressed our concerns. Over seven days, men and women around the world sent more than 60,000 tweets using the hashtag #FBrape, and 5,000 e-mails to targeted advertisers, 16 of whom withdrew their advertising.
In a rare statement on its site on Wednesday, Facebook responded, noting that its "systems to identify and remove hate speech have failed to work as effectively" as it would like, and pledging to evaluate and update its policies, guidelines and practices relating to hate speech. The company acknowledged the importance of improving training for content moderators (the focus of our requests), so they are better able to understand gender-based hate.
The company said these steps will be taken in collaboration with representatives from Women, Action & the Media, the Everyday Sexism Project and members of our broader community.
Why did we focus on Facebook? Because it is a microcosm of a global culture. This week, people have repeatedly asked me, "Is Facebook sexist?" "Is it Facebook's geek boy tech culture?" No, I suspect that Facebook is no more or less sexist or misogynistic than any other aspect of our culture and media.
But unlike culture at large or "the Internet," Facebook created a useful mechanism for change. It wrote rules about speech. Once Facebook built a structure to review content, it established itself as an arbiter of norms. By doing this, it became a potential agent of positive change for the purposes of confronting historic media imbalances and commonly accepted gender prejudices. It was because Facebook had clearly thought about these topics and invested significant resources in creating a structured and detailed content review process and guidelines for user interactions that we focused on it.
No, this is not feminists censoring the Internet. We are only demanding that Facebook apply its existing content policy rules to women fairly. The fact is, the threat of violence silences women every day -- on and offline. For example, content considered worthy of removal at Facebook for being unsafe must threaten "imminent harm." "Imminent harm" is a luxury for women who experience rape and domestic violence as pervasive threats. The notion of imminent harm will never be equally applicable to men and women as a baseline criterion, as long as we live in a world where women feel far less safe than men.
In the United States, for example, men are 25% more likely to feel safe walking in public at night than women are. We don't live at the OK Corral. What woman can afford to wait until the harm is imminent, when we all adapt our lives to avoid rape and, for one in four of us in the United States, the daily terrorism of abuse? Some 70% of women on the planet will experience violence at the hands of men, most often men they know. People concerned about the potential loss of free speech on the Internet would do well to redirect their efforts toward this actual loss of free speech, experienced purely as a function of a human being's sex.
Our campaign resonated with so many because we are in the midst of a shifting cultural tide in which gender-based violence -- historically kept private -- is better understood as a pandemic problem, including in the public forum of the Internet. Facebook's response matters and may both reflect and reinforce that cultural shift. It is an open acknowledgment that violence against women is a serious issue -- something not stated in the past in the context of this content -- and that it deserves serious attention. Making that acknowledgment an institutional reality is what we will now strive to do.