There are now hundreds of articles and blogs on the Facebook emotions experiment. Yesterday’s non-apology apology from Facebook COO Sheryl Sandberg only intensifies the discussion. She said that she’s sorry we got upset about it. I suspect that she’s sorry we found out about it. She hasn’t said that she’s sorry they did the experiment.
*
The history of science is filled with researchers who had good intentions and interesting research questions but conducted research that we have later judged to be unethical. As researchers, no matter our intentions, we are pretty bad at assessing ourselves.
The history of science is also filled with pretty awful people doing terrible things. Scientists have psychologically damaged patients, given them dangerous drugs, and even infected them with syphilis, all in the interest of increasing knowledge.
In response to such terrible abuses, and to the challenge of objectively reviewing even much less dangerous work, scientists of all types collaborated over the second half of the 20th century to develop robust systems for assessing the ethics of experiments.
Recent revelations about a Facebook experiment designed to manipulate its users’ emotions have sparked considerable outrage and debate among scientists and researchers. The episode raises questions about how well our review systems are functioning today, especially given the ease and power with which companies like Facebook can intrude into our private lives and thoughts.
In academia, experiments on human subjects must pass through Institutional Review Boards (IRBs). IRBs ensure that researchers don’t expose their subjects to undue risks, consider whether informed consent is necessary, and otherwise protect subjects (i.e., us) from risky experimentation. It’s bureaucratic but important work.
To be honest, I find getting IRB approval to be really annoying. Instead of filling out paperwork, I just want to collect data and do my research (I study student learning, among other things). I know I’m a good guy and not going to hurt anyone.
But even well-intentioned researchers make mistakes and expose research subjects to risks. Research subjects are much safer with IRB protocols in place, and so are researchers, really, if we don’t want to go down in history as unethical.
This should matter to you, because the people studying you right now, who might even be manipulating you, may not have gone through any such process.
They may simply assume you’ve agreed to be a guinea pig when you click “I accept” on an End User License Agreement (EULA), no matter how long, no matter how obscurely written.
In January 2012, Facebook data scientist Adam Kramer teamed up with two academic researchers to run an experiment on Facebook’s users. For one week, 689,003 users had their news feeds deliberately skewed toward either more negative or more positive posts. The researchers wanted to see how emotions transferred from one person to another online.
If you saw more positive posts, were you more likely to write happy posts? If you saw negative posts, would you, likewise, go negative? The answer to both questions is yes. The effect was modest but statistically significant.
Companies constantly tweak what we see in order to test how we react. Are we more likely to click on the funny ad or the serious ad? Are we more likely to share something with a friend when the page has a light background or a dark one? But by deliberately making users sadder, Facebook did something that feels different.
Facebook worried that people might stop using the site if their news feeds were too negative, so they deliberately made some feeds more negative to find out. We only learned about it because Kramer collaborated with two academic researchers and sought publication in an academic journal, producing a piece on emotional transfer.
Yes, the effect was tiny, but I do know people who are depressed or experiencing suicidal ideation who do not need even the most modest nudge. It’s also an ethical problem that Facebook didn’t screen minors out of the experiment. I hope we can all agree that psychological manipulation experiments on children require some pretty severe oversight. Facebook didn’t get any.
There’s been considerable confusion about who reviewed the experiment. The academic journal cited Cornell’s IRB, which released a statement saying that because its researchers had access only to the results, not to individual data, no IRB approval was needed.
Facebook has said that the study went through an internal review process, but has not specified the details. Facebook maintains that it has come a long way since 2012 and now puts any research project through comprehensive internal review. My request for clarification of that internal review process was not returned.
There are two basic ways to ethically experiment on human subjects. First, you can get informed consent. While some have argued that agreeing to Facebook’s Terms of Service counts, let’s be clear: there’s nothing “informed” about clicking “I agree.” As technology ethicist Catherine Flick notes, modern EULAs do not resemble the kind of informed consent that either good research or good medical practice mandates.
Moreover, as Kashmir
Hill discovered, Facebook only added “research” to its terms in mid-2012,
four months after the emotion study.
Second, a properly constituted review board can determine that risks are minimal and that consent isn’t necessary. That’s just what happened with other experiments that used Facebook and generated no outrage. Before Timothy Ryan, a political scientist, studied “what makes us click,” he got IRB approval.
His experiment probably wasn’t dangerous. It did, however, involve deliberately manipulating people in order to see what happened next.
Companies like Facebook, whether they collaborate with academia or not, need to embrace more rigorous and independent ethical oversight of their experimentation. The history of science and the depth of access that these companies have to our innermost lives demand it.