For one week in January 2012, data scientists skewed what almost 700,000 Facebook users saw when they logged into the service. Some people were shown content with a preponderance of happy and positive words; some were shown content analyzed as sadder than average. And when the week was over, these manipulated users were more likely to post especially positive or negative words themselves. This tinkering was just revealed as part of a new study, published in the prestigious Proceedings of the National Academy of Sciences.
People are supposed to be, under most circumstances, told that they’re going to be participants in research and then agree to it and have the option not to agree to it without penalty.
Susan Fiske, the professor of psychology at Princeton University who edited the study
Many previous studies have used Facebook data to examine “emotional contagion,” as this one did. The difference is that, where those studies merely observed user data, this one set out to manipulate it. Though the experiment was blessed as legal, the question now is whether it was ethical. As Adrienne LaFrance writes, even Susan Fiske, the editor of the study, had some serious concerns about it.
The level of outrage that appears to be happening suggests that maybe it shouldn’t have been done…I’m still thinking about it and I’m a little creeped out, too.