Facebook adding 3,000 reviewers in wake of violent videos

Amid the rash of violence that’s been broadcast on Facebook recently, Mark Zuckerberg said Wednesday that the company is adding 3,000 people to help review videos on the platform.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook — either live or in video posted later,” Facebook’s CEO wrote in a post published this morning. “It’s heartbreaking, and I’ve been reflecting on how we can do better for our community.”

The social network has been slammed for not responding quickly enough to take down disturbing videos, including one that showed a man being shot in the head in Cleveland last month. The video was uploaded by the suspected shooter, Steve Stephens, who later killed himself when the police caught up with him.

After being criticized for leaving the video up for roughly two hours, Facebook felt compelled to release a timeline of how the events unfolded on its platform. The company pointed out that it acted less than half an hour after the video was first reported, though it acknowledged it needed to do better.

Zuckerberg said the company will add 3,000 workers to its worldwide community operations team, which currently numbers 4,500. The “reviewers will also help us get better at removing things we don’t allow on Facebook like hate speech and child exploitation,” he said. Facebook, which has 1.9 billion users, receives millions of reports every week, he said, and adding reviewers should help the company act on them more quickly.

A couple of weeks after the Cleveland killing, a Thai man killed his 11-month-old daughter while streaming video on Facebook Live. The videos were reportedly viewable on Facebook for about a day before being taken down.

Other Facebook videos have also shown suicides, sexual assault, beatings and police brutality.

In his post, Zuckerberg also repeated the company’s standing response to violent videos: Facebook is supplementing the efforts of human reviewers with technology meant to improve the reporting process, which the company has said includes artificial intelligence.

This increased attention on Facebook’s review process is bound to raise censorship concerns, though. The company has already been accused of racially biased censorship, for example, and advocacy groups have called on the social network to be more transparent about how it decides to take down content or shut down accounts.

 

Photo by Associated Press