Along with all those friend updates, vacation photos and links to funny cartoons, some pretty sickening stuff has made its way onto the pages of Facebook on occasion.
But after a coalition of women’s advocates called for an advertising boycott, Facebook officials acknowledged this week that they have fallen down on the job of policing the world’s largest social network for pages and photos that endorse violence against women.
“We need to do better – and we will,” Facebook executive Marne Levine wrote in a corporate blog post, adding that some content has not been removed quickly enough, and other material that should be pulled “has not been or has been evaluated using outdated criteria.”
According to an open letter from the Women, Action and the Media coalition, the offending content included pages with such titles as “Violently Raping Your Friend Just for Laughs” and “Raping Your Girlfriend,” along with photos of women who are bruised, bleeding or tied up. Some of the pages appear to be attempts at sick humor, but they’re clearly graphic and unsettling. (We won’t bother linking to the pages, which appear to have been taken down as of Wednesday.)
Ironically, as the coalition pointed out, Facebook has been criticized in the past for overreacting to the occasional photo or cartoon showing breast-feeding or other nudity. “It appears that Facebook considers violence against women to be less offensive than non-violent images of women’s bodies,” the letter charged.
In her blog post, Levine said the company tries to strike a balance between barring harmful speech and allowing free speech. “We prohibit content deemed to be directly harmful, but allow content that is offensive or controversial,” she wrote, defining harmful content as “anything organizing real world violence … or that directly inflicts emotional distress …”
But the company somehow didn’t spot or stop the kinds of pages highlighted by the WAM coalition’s letter, which prompted a campaign of more than 60,000 tweets and 5,000 emails from individuals upset about the violent pages. The coalition also alerted advertisers, and according to Reuters, several said they would pull their ads until Facebook could offer assurances that the ads wouldn’t appear on those pages.
Facebook’s Levine said in her blog post that the company plans to update its criteria, and the training of the staffers who review reports of hateful or harmful content. It will also require individuals to disclose their own identities when they post items that don’t qualify as “actionable hate speech” but still appear cruel or insensitive.