Question
In 2014 it was learned that Facebook had been experimenting on its own users' emotional manipulability, by altering the news feeds of almost 700,000 users to see whether Facebook engineers placing more positive or negative content in those feeds could create effects of positive or negative 'emotional contagion' that would spread between users. Facebook's published study, which concluded that such emotional contagion could be induced via social networks on a "massive scale," was highly controversial, since the affected users were unaware that they were the subjects of a scientific experiment, or that their news feed was being used to manipulate their emotions and moods.[12]
Facebook's Data Use Policy, which users must agree to before creating an account, did not include the phrase "constituting informed consent for research" until four months after the study concluded. However, the company argued that its activities were still covered by the earlier data policy wording, even without the explicit reference to 'research.'[13] Facebook also argued that the purpose of the study was consistent with the user agreement, namely, to give Facebook the knowledge it needs to provide users with a positive experience on the platform.
Critics objected on several grounds, claiming that:
A) Facebook violated long-held standards for ethical scientific research in the U.S. and Europe, which require specific and explicit informed consent from human research subjects involved in medical or psychological studies;
B) That such informed consent should not in any case be implied by agreements to a generic Data Use Policy that few users are known to carefully read or understand;
C) That Facebook abused users' trust by using their online data-sharing activities for an undisclosed and unexpected purpose;
D) That the researchers seemingly ignored the specific harms to people that can come from emotional manipulation. For example, thousands of the 689,000 study subjects almost certainly suffer from clinical depression, anxiety, or bipolar disorder, but were not excluded from the study by those higher risk factors. The study lacked key mechanisms of research ethics that are commonly used to minimize the potential emotional harms of such a study, for example, a mechanism for debriefing unwitting subjects after the study concludes, or a mechanism to exclude participants under the age of 18 (another population especially vulnerable to emotional volatility).
Questions:
1. Of the eight types of ethical challenges for data practitioners that we listed in Part Two, which two types are most relevant to the Facebook emotional contagion study? Briefly explain your answer.
2. Were Facebook's users justified and reasonable in reacting negatively to the news of the study? Was the study ethical? Why or why not?
3. To what extent should those involved in the Facebook study have anticipated that the study might be ethically controversial, causing a flood of damaging media coverage and angry public commentary? If the negative reaction should have been anticipated by Facebook researchers and management, why do you think it wasn't?
4. Describe 2 or 3 things Facebook could have done differently, to acquire the benefits of the study in a less harmful, less reputationally damaging, and more ethical way.
5. Who is morally accountable for any harms caused by the study? Within a large organization like Facebook, how should responsibility for preventing unethical data conduct be distributed, and why might that be a challenge to figure out?