How do you Like it now? Facebook may have messed with your mind

It could’ve happened to you: Facebook experimented with manipulating users’ emotions through their News Feeds. The research — which found that emotions can be contagious — involved almost 700,000 people, a small percentage of the social network’s more than 1 billion users. Among the biggest questions: Who cares?

In the outraged camp are privacy advocates who say, among other things, that Facebook shouldn’t have conducted the study without letting users know. But there’s also talk that it’s no big deal in this Big Data world; people agree when they join Facebook that their information will be used in all kinds of ways, so why all the hand-wringing? There was, by that logic, informed consent.

Nevertheless, Susan Fiske, the psychology professor at Princeton who edited the study, said she found it “creepy.” She told the Atlantic: “It’s ethically okay from the regulations perspective, but ethics are kind of social decisions.” Oh, and in case there’s any question, the study’s authors told her the experiment was approved because Facebook manipulates its users’ News Feeds “all the time.”

Speaking of all the time, researchers often conduct studies related to emotions. So why is Facebook (whose partners in the study were UC-San Francisco and Cornell University) getting grief? Adrienne LaFrance in the Atlantic article pointed out that researchers who get funding from the government are required to obtain consent from their subjects. But in this case, we had no idea Facebook had turned users (You? Me?) into marionettes/lab rats/guinea pigs.

The study, in a nutshell: For a week in 2012, Facebook managed the flow of users’ feeds so that only certain types of posts (negative or positive) showed up, then studied how the users posted/acted afterward.

“It’s gross,” writes Linda Holmes for NPR. “There are people who can’t afford to be made to feel very much worse than they already do; there are people at all times who are existing pretty close to the line between okay and not okay, and more who are existing pretty close to the line between somewhat not okay and really not okay.”

Even the Facebook data scientist who led the study is now questioning its value: “In hindsight, the research benefits of the paper may not have justified all of this anxiety,” Adam Kramer wrote on his Facebook page. (HT Wall Street Journal.) Kramer also wrote: “The experiment in question was run in early 2012, and we have come a long way since then.”

Opinions and questions abound, but here are a couple of last questions, at least for now: Will the outrage outweigh people’s need/desire to use Facebook? Or will this be what finally causes the social network’s users to take a long, hard look at how big a part Facebook (and maybe, other corporations) plays in many of their lives — and make some changes?


Photo of Facebook’s iconic “Like” by Kirstina Sangsahachart/Daily News



  • Makikiguy

    Now, I am outraged.

  • This is bogus reporting. FB carried out an experiment? Guess what? Amazon, Google, Yahoo, Bing, Netflix, and many others are ALWAYS testing. It’s called A/B Split Testing or MVT (Multivariate Testing). It’s ALWAYS being used. Google often carries out dozens of tests simultaneously. That’s how websites, selections, and ads are optimized.

    • advocatus diaboli

That doesn’t make it bogus reporting. Should we stop reporting all crimes if they are not unique? Do two or more wrongs make a right? You do raise an issue, though: why aren’t more people questioning how all large internet sites use member data, and whether they have an SLA not to interfere with feeds?

  • ignorance iseverywhere

People still don’t believe Facebook is a tool used by our government to collect information and data on people, just like Twitter. DUH. What more proof do people need? Sickening.

  • Tim Timmerson

Facebook is narcissistic garbage.
    Your lives will be much better if you log off of their marketing scheme.