In 2012, Facebook ran a study that manipulated the News
Feeds of almost 700,000 users. The researchers wanted to see whether what
appeared in a person’s News Feed could affect how that person felt and what
they subsequently posted. The study centered on emotional contagion, the
theory that exposure to other people’s positive or negative emotions can make
one’s own mood correspondingly more positive or negative.
To test this, the researchers adjusted Facebook’s News Feed
algorithm so that test subjects saw different proportions of positive and
negative content. For example, some participants saw primarily neutral-to-happy
posts from friends, while others saw feeds weighted toward sad or angry
content.
The manipulation was, in effect, an attempt to induce emotional
contagion, and the research raised concerns among Facebook users as well as
researchers who study social media and research ethics. It also prompted many
users to ask how Facebook seeks consent for its research.
The question of consent comes down to whether the research
was ethically sound. The clearest way to ensure this is to obtain explicit,
informed consent from all participants before the experiment and to debrief
them afterward. However, getting informed consent comes with tradeoffs for
Facebook.
First, obtaining consent can be difficult. Some people may be
unwilling to participate because they worry the study will affect their
well-being or make them uncomfortable; others may not want to be included
because they fear being targeted by advertisers or other parties. Every
refusal shrinks, and potentially biases, the pool of participants, which is
the tradeoff Facebook would face in asking.
For this reason, it’s important that an experiment like the
Facebook Experiment be conducted in a way that honors the trust users place in
the site by using it every day. This can be done through the formal process
known as “informed consent,” in which people are told what the study involves
and what implications it might have for them before they agree to take part.
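To make the idea concrete, here is a minimal sketch of how consent could gate
enrollment in an experiment. It is a hypothetical illustration in TypeScript:
ConsentRecord, mayEnroll, and assignToCondition are invented names for this
example and do not correspond to any real Facebook interface.

```typescript
// Hypothetical sketch: enroll a user in an experimental condition only after
// explicit, recorded opt-in consent. All names here are placeholders for
// illustration, not a real Facebook API.

interface ConsentRecord {
  userId: string;
  studyId: string;
  consentedAt: Date | null; // null means the user never opted in
  withdrawnAt: Date | null; // consent can be withdrawn at any time
}

function mayEnroll(record: ConsentRecord): boolean {
  // Only users who explicitly opted in and have not withdrawn are eligible.
  return record.consentedAt !== null && record.withdrawnAt === null;
}

function assignToCondition(
  record: ConsentRecord
): "treatment" | "control" | "excluded" {
  if (!mayEnroll(record)) {
    return "excluded"; // no consent, no manipulation of this user's feed
  }
  // Simple random assignment among users who consented.
  return Math.random() < 0.5 ? "treatment" : "control";
}
```

The key design choice in this sketch is that a missing or withdrawn consent
record excludes the user outright, so no one’s feed is altered without an
explicit opt-in on record.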
It’s worth noting that Facebook’s defense rested on its terms
of service, which ask users to agree to the use of their data for data
analysis, testing, and research; critics countered that accepting a blanket
agreement is not the same as giving informed consent to a specific experiment.
Obtaining genuine informed consent would therefore be an important step in the
right direction for the company’s research ethics.
To improve the Facebook Experiment, Facebook could ask a
small number of users to privately rate how they felt at the moment of posting.
This would be a more direct and confidential way to measure emotional contagion
in the News Feed than inferring mood from the words people use in their posts.
This approach has been used in other studies, and it would
be straightforward to implement on Facebook. It could be as simple as a pop-up
window asking people to rate their emotions at the moment of posting, or as
elaborate as a full survey integrated with the status update box itself.
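As a rough illustration of the pop-up version, the sketch below records an
optional 1-to-5 self-report alongside a post. It is hypothetical TypeScript:
MoodReport and promptForMood are invented for this example, and the prompt
wording and scale are assumptions, not any real Facebook feature.

```typescript
// Hypothetical sketch: an optional mood self-report collected when a user
// posts. The prompt wording, 1-5 scale, and MoodReport shape are illustrative.

interface MoodReport {
  postId: string;
  rating: number | null; // 1 (very negative) to 5 (very positive); null if skipped
  reportedAt: Date;
}

function promptForMood(postId: string): MoodReport {
  // A real implementation would render a small dialog next to the status box;
  // window.prompt stands in for that UI here, and the user may simply cancel.
  const answer = window.prompt(
    "Optional: how do you feel right now? (1 = very negative, 5 = very positive)"
  );
  const parsed = answer === null ? NaN : Number(answer);
  return {
    postId,
    // Keep the rating only if it is a whole number on the 1-5 scale.
    rating: Number.isInteger(parsed) && parsed >= 1 && parsed <= 5 ? parsed : null,
    reportedAt: new Date(),
  };
}
```

Because the rating is optional and stored as null when skipped, the
self-report stays voluntary even for users who see the prompt.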