In this June 11, 2014 photo, a man poses for photographs in front of the Facebook sign on the Facebook campus in Menlo Park, Calif. In the coming weeks, Facebook will start offering advertisers another way to tailor ads in the U.S., based on information gathered from other websites you visit and the apps you use. (AP Photo/Jeff Chiu)

It could’ve happened to you: Facebook experimented with manipulating users’ emotions through their News Feeds. The research — which found that emotions can be contagious — involved almost 700,000 people, a small percentage of the social network’s more than 1 billion users. Among the biggest questions: Who cares?

In the outraged camp are privacy advocates who say, among other things, that Facebook shouldn’t have conducted the study without letting users know. But there’s also talk that it’s no big deal in this Big Data world: people agree when they join Facebook that their information will be used in all kinds of ways, so why all the hand-wringing? There was, the argument goes, informed consent.

Nevertheless, Susan Fiske, the Princeton psychology professor who edited the study, said she found it “creepy.” She told the Atlantic: “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions.” Oh, and in case there’s any question, the study’s authors told her the experiment was approved because Facebook manipulates its users’ News Feeds “all the time.”

Speaking of all the time, researchers often conduct studies related to emotions. So why is Facebook (whose partners in the study were UC-San Francisco and Cornell University) getting grief? In the Atlantic article, Adrienne LaFrance pointed out that researchers who get funding from the government are required to obtain consent from their subjects. But in this case, we had no idea Facebook had turned users (You? Me?) into marionettes/lab rats/guinea pigs.

The study, in a nutshell: For a week in 2012, Facebook managed the flow of users’ feeds so that only certain types of posts (negative or positive) showed up, then studied how the users posted/acted afterward.

“It’s gross,” writes Linda Holmes for NPR. “There are people who can’t afford to be made to feel very much worse than they already do; there are people at all times who are existing pretty close to the line between okay and not okay, and more who are existing pretty close to the line between somewhat not okay and really not okay.”

Even the Facebook data scientist who led the study is now questioning its value: “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Adam Kramer wrote on his Facebook page. (HT Wall Street Journal.) Kramer also wrote: “The experiment in question was run in early 2012, and we have come a long way since then.”

Opinions and questions abound, but here are a couple of last questions, at least for now: Will the outrage outweigh people’s need/desire to use Facebook? Or will this be what finally causes the social network’s users to take a long, hard look at how big a part Facebook (and maybe, other corporations) plays in many of their lives — and make some changes?