Christopher X J. Jensen
Associate Professor, Pratt Institute

When Facebook performs a manipulative experiment on its users, the results are interesting, the methods disturbing

Posted 03 Aug 2014

Did you know that Facebook performs scientific research? If I told you that Facebook is constantly analyzing the activity of its users, that would probably not surprise you. But does Facebook go the next step by performing manipulative experiments on its users? A recent publication in the prestigious Proceedings of the National Academy of Sciences USA (PNAS) entitled “Experimental evidence of massive-scale emotional contagion through social networks” sets the record straight: Facebook feels perfectly entitled to manipulate you in the name of science.

Previous studies have shown that people are very sensitive to the emotional states of their social partners. Humans are not just aware of the emotional states of others (empathy); we also tend to adopt the emotional states of those we interact with. This phenomenon is called “emotional contagion” and has tremendous implications for the social ecology of human beings. If we do indeed have the tendency to adopt the emotional states of our fellow social group members, we will have to explain how such a tendency evolved. I am particularly interested in the implications of emotional contagion for human cooperation.

The results of this Facebook experiment are pretty interesting. Facebook manipulated the balance of positive and negative posts in users’ newsfeeds and then looked at the emotional content of those manipulated users’ subsequent posts. Given that people have demonstrated emotional contagion in face-to-face interactions, it is important to know whether similar interactions in the virtual world of social networks can produce similar behavioral changes. The results of this study suggest that the overall emotional content of one’s newsfeed affects one’s subsequent social behavior on Facebook. The figure below, taken from the article, summarizes these results.

Nearly 700,000 people participated unwittingly in this study, a phenomenal sample size. Statistically speaking, this large sample size is a double-edged sword: on the one hand it means that the results are much more likely to represent the population of Facebook users as a whole, but on the other it means that even very small effects will be detected as statistically significant. At this sample size, statistical significance and effect size need to be considered separately, as the actual impact of the manipulations was rather small.
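To see why significance and effect size come apart at this scale, here is a minimal sketch in Python. It uses made-up numbers rather than the paper’s data: I assume two conditions of 350,000 users each (roughly half of 700,000 per group) and a true difference of just two-hundredths of a standard deviation.

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 350_000  # hypothetical: ~700,000 subjects split into two conditions

# Two groups whose true means differ by only 2% of a standard deviation
control = rng.normal(loc=0.00, scale=1.0, size=n)
treated = rng.normal(loc=0.02, scale=1.0, size=n)

# Standard two-sample t-test
t_stat, p_value = stats.ttest_ind(treated, control)

# Cohen's d: the difference in means expressed in standard deviations
pooled_sd = np.sqrt((treated.var(ddof=1) + control.var(ddof=1)) / 2)
cohens_d = (treated.mean() - control.mean()) / pooled_sd

print(f"p = {p_value:.1e}")           # overwhelmingly "significant"
print(f"Cohen's d = {cohens_d:.3f}")  # yet the effect is minuscule

The test declares the difference highly significant (p on the order of 10^-17), yet the effect itself amounts to a shift of two-hundredths of a standard deviation. With samples this large, a tiny p-value tells you almost nothing about whether the effect matters in practice.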

Research of this kind can only happen in this manner on Facebook. Although it is true that Facebook provides a unique platform on which to perform this kind of experiment, that is not what I am referring to. It is not just that Facebook is uniquely positioned to pull off this kind of experiment logistically: it is also uniquely unencumbered by the ethical checks that generally limit the scope of this sort of research. If I want to conduct studies on the academic achievement of my students and publish those results, I need to get the consent of my students before conducting them. The gold standard for scientific ethics is to have both my research plan and the consent agreement I distribute to my test subjects reviewed and approved by an Institutional Review Board (IRB). All indications are that the Facebook experiment was not subject to IRB review or approval (for more nuance on this issue, see these articles in The Atlantic and The Guardian). The paper now comes with a rather bizarre “Editorial Expression of Concern” that wiggles around these ethical issues. Apparently Facebook broke no laws when it conducted this study, and the Cornell University researchers who helped analyze the results were not required to submit their plans to their IRB because the research had already been conducted. And yet, taken as a whole, the research seems to violate the ethical standards of PNAS publication.

You can almost see the dialogue surrounding this Facebook experiment being framed around the phrase “oh, but look at the scientific potential”. Although there are severe logistical limits on what kinds of experiments one could perform from behind the dashboard of Facebook, those questions — like the one asked in this article — that are amenable to testing on social media can be tested with unprecedented thoroughness. No other researcher can compete with Facebook in terms of number of test subjects and access to those test subjects. Based on how Facebook conducted this study, if you have accepted their user agreement, you are a potential test subject. That is a pretty big scientific fringe benefit of running the world’s largest social networking site.

Unfortunately, the judgment of history is not on the side of the “scientific potential” argument for conducting research on unwitting test subjects. Both medical and psychological experiments have been conducted in ways that did not fully inform participants of the potential risks of participating, and there is broad consensus that these experiments were unethical. The only thing that differentiates those experiments from the Facebook experiment is the severity of their impact. Arguably, seeing more grumpy posts in your Facebook newsfeed is pretty tame compared to what other unwitting test subjects have been exposed to. But this argument is more than a bit self-serving: if the paper demonstrates that the emotional tenor of Facebook posts affects the mood and subsequent behavior of test subjects, that means that there was a real effect on those exposed to these manipulations. Does it matter if the effect is not as severe?

If you are worried about the ethical ramifications of this study, please realize that you are only seeing the tip of the iceberg. If this kind of research is deemed worthy of publication, imagine what kinds of experiments the Facebook lab is performing but not publishing. If you think that the “research” that Facebook (or Google or Yahoo or any of the other major information-gathering companies) is conducting is based solely on analyzing your unmanipulated behavior, this paper should give you pause. Apparently Facebook is more than comfortable placing you into a manipulative experiment without your explicit consent. And — in case you were wondering — it was all part of the user agreement you did not read but accepted anyway.


If you are looking to hear more commentary on this new Facebook research publication, check out this segment on WNYC’s The Brian Lehrer Show (in which I make a very brief cameo as a caller).

There’s also an interesting pair of articles in The Chronicle of Higher Education that present viewpoints sympathetic to and in favor of this sort of research.

