Tuesday, August 5, 2014

What does Facebook's "experiment" mean for academic researchers?



The statement best capturing user outrage was published by the Huffington Post in "You May Have Been a Lab Rat in a Huge Facebook Experiment": "Facebook decided to try to manipulate some people's emotional states -- for science."

The research, actually quite unremarkable in that it merely confirmed well-documented evidence about "emotional contagion," involved the manipulation of Facebook's News Feed (the stream of status updates, photos and news articles that appears when a user first logs on).
For a week in January 2012, a group of Facebook employees, advised by academic researchers, modified the algorithmic outputs of 689,003 users' News Feeds. Using scientifically mediocre procedures, the team showed one group of users fewer posts containing words believed to evoke positive emotions, such as "love," "nice" and "sweet," while showing another group fewer posts with negative words, like "hurt," "ugly" and "nasty."
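For the curious, the mechanics amount to little more than word-list matching. The Python sketch below illustrates the general idea only; the word lists, function names and fixed omission rate are my own illustrative assumptions, not Facebook's actual code (the published paper says it used LIWC word categories and omission probabilities that varied per user).

    import random

    # Hypothetical word lists standing in for the LIWC categories the study relied on.
    POSITIVE_WORDS = {"love", "nice", "sweet"}
    NEGATIVE_WORDS = {"hurt", "ugly", "nasty"}

    def contains_any(post, words):
        """True if any target word appears in the post (case-insensitive)."""
        return any(token.strip(".,!?") in words for token in post.lower().split())

    def filter_feed(posts, suppress, omission_rate=0.5):
        """Randomly omit a fraction of posts containing the suppressed category."""
        return [p for p in posts
                if not (contains_any(p, suppress) and random.random() < omission_rate)]

    feed = ["I love this!", "What an ugly day.", "Meeting at noon."]
    print(filter_feed(feed, suppress=POSITIVE_WORDS))  # may drop the "love" post

That such a crude filter could be framed as a psychological intervention is, in itself, part of why I find the publication so troubling.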
As it turns out, the experiment suggests the Internet is just like real life: People who were shown fewer positive words in their News Feeds tended to write posts of their own that contained fewer positive words and more negative words. The reverse was also true. 
What irked the user community was that Facebook conducted the experiment without informing users of their involvement, either before or afterward. What irks me is that the findings were published in the Proceedings of the National Academy of Sciences.
Decide for yourself. Was the activity Facebook carried out actually that outrageous for a commercial entity? Or did the real offense occur when the Proceedings of the National Academy of Sciences agreed to publish the findings as "research"? See what others think and add your opinion here.