Should Facebook be more transparent about how it moderates hate speech?

With more than 2 billion users worldwide, Facebook has been making tough calls when it comes to policing hate speech, harassment, nudity and violence online.

But some digital rights groups are calling on the tech firm to share more details about how it decides what to pull down or keep up.

“I think that because you have greater power, you have greater responsibility. And with that responsibility, sometimes you’re going to have to bring out a little bit more transparency,” Eva Galperin, Electronic Frontier Foundation’s director of cybersecurity, told a Facebook executive at a panel discussion on Wednesday night in San Francisco.

Through its online censorship project, EFF has found that Facebook's rules are not being enforced evenly across the board, she said. When people get locked out of their accounts, it can disrupt their work or daily lives, especially because Facebook is linked to other apps.

Some activists have previously accused the company's content moderators of unfairly punishing minority users.

Facebook Chief Security Officer Alex Stamos pushed back, noting that mistakes are bound to happen but are rare. Some posts might be mistakenly flagged, for instance, when a user speaks out against hate speech.

“If you turn up that dial of trying to prevent hate speech you will also turn up the dial of false positives,” he said.

Stamos estimated that the number of accounts Facebook shuts down per day is “at least seven or eight figures,” based on the number of spam and fraudulent accounts created daily. CNBC, which spoke to Stamos after the event, reported that he confirmed the tech firm shuts down more than 1 million accounts per day.

In the past, public criticism has led Facebook to change how it enforces some of its online rules. The tech firm apologized last year after it pulled down an iconic Vietnam War photo that depicted a naked girl fleeing a napalm attack.

But Stamos questioned whether divulging more details about how Facebook enforces its rules will help.

“I’m not sure if we’re in a media environment honestly where a lot of transparency in this area is going to end up with people being better off,” he said.

Here’s the full panel discussion with Stamos and Galperin, part of a series of cybersecurity interviews hosted by USENIX, the Advanced Computing Systems Association:

Photo by Associated Press
