Facebook Announces Stricter Guidelines For Research And Experiments On Its Users

After a study on whether emotional manipulation of the News Feed could make people sad turned into a PR disaster, Facebook has now set up a formal review process for pre-approving research on its users, as well as a Research At Facebook website to centralize all the academic work done on its enormous data set.

Facebook admits to screwing up with the emotions study, explaining, “We were unprepared for the reaction the paper received when it was published and have taken to heart the comments and criticism. It is clear now that there are things we should have done differently. For example, we should have considered other non-experimental ways to do this research. The research would also have benefited from more extensive review by a wider and more senior group of people. Last, in releasing the study, we failed to communicate clearly why and how we did it.”


Facebook is not changing how it obtains consent from users to experiment on them, and it doesn’t mention any external auditors for research, which could alarm the academics calling for both. Facebook bakes consent into its long, legalese Terms Of Service statement, so users automatically surrender to experimentation just by using the social network. They don’t have to sign or agree to anything about specific experiments, nor are they even usually aware of them.

Facebook’s new framework for internal and external research now has clear guidelines. Facebook says if the research focuses on specific populations or demographics, or relates to content “considered deeply personal (such as emotions)”, the study will have to undergo an enhanced review process before being pre-approved. A panel of senior researchers spanning subject areas like privacy, legal, research, policy, and engineering will determine whether a study meets the guidelines. Facebook will also train its engineers during their six-week introductory bootcamp on how research should be conducted. And veteran employees will get education on proper research methods during annual security and privacy training sessions.


Facebook’s CTO Mike Schroepfer writes, “We believe in research, because it helps us build a better Facebook. Like most companies today, our products are built based on extensive research, experimentation and testing.” Yet the announcement is sympathetic to the public, which felt the emotional manipulation study crossed an ethical line.

At the time of the uproar, I made three suggestions for how to win back public trust:

  • Opt-Outs – Facebook should get clearer consent or allow people to opt out of studies with potential for serious negative impact. For example, Facebook seeing whether it could make people sad was potentially dangerous, as the study likely included people who were clinically depressed, and may have warranted an opt-out. An A/B test of whether users prefer one design of the Facebook profile to another is innocuous enough to fly without explicit consent (see the sketch after this list). Facebook didn’t budge here.
  • External Review – Facebook should convene an external review board of academics and experts to assess the potential harm to users of any risky study. This would ensure Facebook’s business motives didn’t cloud the judgement of the auditors. Facebook now has an internal review board, but it may be susceptible to company groupthink, or to the pressure on Facebook to accomplish its mission and succeed financially.
  • Training – Facebook should teach all employees conducting experiments how to do so with minimal risk to user well-being. Here Facebook is making strong progress by adding research practice education to its new employee and yearly training sessions. More transparency about what its guidelines are and the rules for following them would boost the confidence of Facebook’s userbase.
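
To make the opt-out distinction concrete, here is a minimal sketch of how an experiment-assignment layer could honor opt-outs for high-risk studies while still bucketing everyone into innocuous A/B tests. Facebook hasn’t published any such API, so every name here (OPTED_OUT_USERS, assign, and so on) is hypothetical:

```python
import hashlib
from typing import Optional

# Hypothetical per-user opt-out registry; in practice this would live in
# a user-settings store rather than a hard-coded set.
OPTED_OUT_USERS = {"user_123"}

def bucket(user_id: str, experiment: str, num_variants: int = 2) -> int:
    """Deterministically hash a user into a variant, so the same user
    always sees the same version of an experiment."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % num_variants

def assign(user_id: str, experiment: str, high_risk: bool) -> Optional[int]:
    """Return a variant index, or None if the user is excluded.

    High-risk studies (e.g. anything touching emotions) honor the
    opt-out list; routine UI A/B tests don't require explicit consent.
    """
    if high_risk and user_id in OPTED_OUT_USERS:
        return None  # excluded from potentially harmful research
    return bucket(user_id, experiment)

# An innocuous profile-design test buckets everyone...
print(assign("user_123", "profile_redesign", high_risk=False))  # 0 or 1
# ...but the same opted-out user is excluded from an emotions study.
print(assign("user_123", "emotion_study", high_risk=True))      # None
```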


If we’ve gained anything from the emotional manipulation study backlash, it’s that more of Facebook’s research will now be out in the open. Before, it was buried in academic journals and often lacked comprehensible explanations of what Facebook was doing and why. That both made Facebook seem shadily secretive and left research open to sensationalist interpretation. While the concept of the emotional manipulation study was troubling, in execution it only led people to post 0.1% fewer positive words. That’s a minuscule emotional change that was trumped up as Facebook hurling people into depression.

With more research transparency and a public conversation started about what’s ethical when experimenting on users, any happiness sacrificed by subjects of the emotions study has gone toward the greater good.
