“In-person interaction and nonverbal cues are not strictly necessary for emotion contagion.”
The experiment, now known to have taken place in January 2012, involved the Facebook News Feeds of nearly 700,000 users. The users were chosen at random, and none of them were aware they were part of a study. The goal was to analyze how different types of News Feed content shaped users' responses: researchers altered what appeared in feeds to see what made users happy or sad.
Emotions, it turns out, appeared to be contagious. Who knew? Users whose Facebook feeds were full of happy news and updates posted more positively themselves, and the same held for sad updates. Online, then, much as in person-to-person interaction, witnessing another's joy or sorrow colors your own mood.
Since the research was published in the June 17 issue of Proceedings of the National Academy of Sciences, policy analysts, attorneys, and privacy researchers have been picking through the details. It is one thing to use data to see what users search for, click on, and look at on their devices. Tweaking an individual’s news feed to manipulate their emotional state, however, is a whole other ball game. In the throes of the Internet world, this could fairly be called a scandal. Yet Facebook may not have violated the law, or even the company’s own policies. Even if legal documents don’t necessarily equate with morality, many feel Facebook should have behaved more ethically.
“No posts were hidden, they just didn’t show up on some loads of Feed,” said Adam Kramer, the Facebook data scientist who led the study.
Responding to the backlash, Kramer added, “In hindsight, the research benefits of the paper may not have justified all of this anxiety.” While Facebook may consider this standard operating procedure, Princeton University’s Ed Felten sees an inexcusable gap “between the ethical standards of industry practice, versus the research community’s ethical standards for studies involving human subjects.” How the study got past an ethics review in the first place remains an open question. In the end, the study’s answer is yes: if we see a negative status update from a friend, there’s a chance we will start to feel sad ourselves. So what? Was confirming something we already knew worth a secret 700,000-person experiment? Just don’t post about how upset this made you on Facebook; you may spread the “upset” vibe.