(FILES) This February 25, 2013 file photo taken in Washington, DC, shows the splash page for the Internet social media giant Facebook. AFP PHOTO / Karen Bleier / Getty Images

It seems like only yesterday we were talking about the political echo chamber in social media. (Actually, it was last week.) Now Facebook is essentially saying it’s more users’ fault than it is the social network’s.

A new study by Facebook addresses concerns about it serving as a stage for political divisiveness. As I wrote last week, people such as former Treasury Secretary Robert Rubin refer to social media as “a conductor of ideology, an echo chamber” that presumably is bad for democracy.

In a study published Thursday in the journal Science, Facebook said: On average, about 23 percent of users’ friends have opposing political views. Also, about 29 percent of what users see on their news feeds “cuts across ideological lines.” Whatever political echo chamber there is, according to the study, stems from users’ actions on the site more than the social network’s filtering.

Not surprisingly, some find fault with that assertion.

“There is no scenario in which ‘user choices’ vs. ‘the algorithm’ can be traded off, because they happen together,” writes Christian Sandvig, an associate professor of Communication Studies and Information at the University of Michigan and an associate at the Berkman Center for Internet & Society at Harvard University, on the blog of the Social Media Collective at Microsoft Research New England.

But no, no, Facebook insists: “By showing that people are exposed to a substantial amount of content from friends with opposing viewpoints, our findings contrast concerns that people might ‘listen and speak only to the like-minded’ while online,” says a blog post written by the data scientists who conducted the study. “The composition of our social networks is the most important factor affecting the mix of content encountered on social media, with individual choice also playing a large role.”

The study examined 10.1 million Facebook users “who self-report their ideological affiliation” and 7 million URLs shared between July 7, 2014, and January 7, 2015.

Other critics of the study expressed concern with the sample of users. The New York Times noted that researchers and academics said that people who report their political leanings are “almost certainly going to behave quite differently, on average, than people who do not.”
