Facebook vows to do more to combat fake news after facing criticism over its role in Trump’s win

Don’t believe everything you read on social media.

During the 2016 presidential election, misinformation and fake news stories flowed quickly through Facebook, Twitter and other social media sites.

These false articles included one reporting that Megyn Kelly had been fired from right-wing Fox News for backing Democratic presidential nominee Hillary Clinton. Another story, by the “Denver Guardian,” a news outlet that doesn’t exist, reported that an FBI agent tied to Clinton’s latest e-mail leaks was found dead in an apparent murder-suicide. Fake Twitter accounts, including one pretending to be CNN, also popped up.

Now some people are blaming social media sites, including Facebook, for Republican Donald Trump’s victory in the presidential election.

New York Magazine published a piece titled “Donald Trump Won Because of Facebook,” pointing to the fake news circulating through the world’s most popular social media site.

“Facebook’s labyrinthine sharing and privacy settings mean that fact-checks get lost in the shuffle. Often, no one would even need to click on and read the story for the headline itself to become a widely distributed talking point, repeated elsewhere online, or, sometimes, in real life,” wrote Max Read for the magazine.

And even President Barack Obama expressed concern about the spread of misinformation this week at a rally in Ann Arbor, Michigan.

“The way campaigns have unfolded, we just start accepting crazy stuff as normal, and people, if they just repeat attacks enough and outright lies over and over again,” Obama said. “As long as it’s on Facebook, and people can see it, as long as it’s on social media, people start believing it, and it creates this dust cloud of nonsense.”

In a statement to TechCrunch, Adam Mosseri, VP of product management at Facebook, said the tech firm takes misinformation on the website “very seriously” and is working to do more.

“In Newsfeed we use various signals based on community feedback to determine which posts are likely to contain inaccurate information, and reduce their distribution. In Trending we look at a variety of signals to help make sure the topics being shown are reflective of real-world events, and take additional steps to prevent false or misleading content from appearing. Despite these efforts we understand there’s so much more we need to do, and that is why it’s important that we keep improving our ability to detect misinformation,” he said in the statement.
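Mosseri’s statement is high-level, but the underlying idea (combine feedback signals into a score, then scale back a post’s distribution accordingly) can be sketched. The Python below is a purely illustrative toy, not Facebook’s actual ranking code; the signal names (user_reports, fact_check_flags, base_reach) and the weights are assumptions made up for the example.

    # Purely illustrative sketch, NOT Facebook's actual system: combine
    # hypothetical community-feedback signals into a score and reduce
    # how widely a post is distributed in proportion to that score.
    from dataclasses import dataclass

    @dataclass
    class Post:
        text: str
        user_reports: int      # hypothetical: times users flagged the post as false
        fact_check_flags: int  # hypothetical: flags from third-party fact checks
        base_reach: int        # baseline number of feeds the post would appear in

    def misinformation_score(post: Post) -> float:
        """Combine the assumed signals into a score capped at 1.0."""
        return min(0.1 * post.user_reports + 0.4 * post.fact_check_flags, 1.0)

    def adjusted_reach(post: Post) -> int:
        """Reduce distribution in proportion to the misinformation score."""
        return int(post.base_reach * (1.0 - misinformation_score(post)))

    suspect = Post("FBI agent found dead ...", user_reports=5,
                   fact_check_flags=1, base_reach=10_000)
    print(adjusted_reach(suspect))  # 1,000 feeds instead of 10,000

In a real system the signals and weights would be far more numerous and learned rather than hand-set; the sketch only shows the shape of the pipeline Mosseri describes: signals in, distribution down.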

The tech firm, which has repeatedly said it is not a media company, didn’t say exactly what it plans to do to stop misinformation from spreading.

For what it’s worth, online users in the United States are also skeptical of the news they see on Facebook and other social media sites.

Only about 4 percent of U.S. adults who use the web say they have a lot of trust in the information they get from social media sites, according to a survey conducted this year by the Pew Research Center.

Photo Credit: Associated Press