Facebook’s Head Of Policy On Emotion Experiment: “That’s Innovation”

While Facebook COO Sheryl Sandberg today apologized for the company’s controversial emotion manipulation experiment being “poorly communicated”, another executive said this kind of research makes the product better, and that “it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation.”

Facebook’s head of global policy management Monika Bickert showed little remorse when she spoke at the Aspen Ideas Festival yesterday, but did say “What I think we have to do in the future is make sure we’re being transparent, both to regulators and to people using the product about exactly what we’re doing.”

Facebook Head Of Global Policy Management Monika Bickert speaking at the Aspen Ideas Festival, 7/1/2014

Tempers flared this week when mainstream media picked up on an experiment Facebook ran in 2012 on 689,003 of its own users. You can read my detailed take on it here. For a week, Facebook showed people fewer positive posts in the News Feed, and found they posted 0.1% fewer positive words in their own posts. A more depressing feed led people to publish very slightly more depressed updates. This teaches Facebook that emotions are contagious, that seeing happy updates might not make you sad as some suggest, and that showing them could make you use Facebook more. Anger arose for a number of reasons:

  • Facebook didn’t get express opt-in permission or even offer an opt-out, claiming the Data Use Policy users automatically opt into upon sign-up includes consent to “data testing, analysis, research” (though that line wasn’t added until four months after the study was conducted)
  • Facebook purposefully tried to depress people, rather than just testing to see what made them engage more with the service
  • Facebook didn’t have an independent ethics board pre-approve the test
  • One of the authors of the study has also received U.S. government funding to research “Modeling Discourse and Social Dynamics in Authoritarian Regimes”, including how revolutions start.
  • Hiding negative posts to increase engagement is akin to self-serving censorship that makes Facebook a success theater where people can’t get help for issues in their lives
  • European regulators are looking into whether the experiment broke privacy laws or settlements Facebook has entered.
  • And a general fear that big tech companies have enormous power to influence society with very little transparency or control given to users

This has led others and me to call for more ethical experimentation and a bigger discussion of the morality of influence by these companies. On the opposite side, some of the arguments supporting the experiment are:

  • These types of A/B tests are conducted all the time by companies, advertisers, politicians, charities, and more to find out how best to make us use, buy, vote, or donate. I mean, I said we were Facebook’s product testing guinea pigs way back in 2012.
  • Opt-ins would complicate the site and screw up experiment results. You can simply stop using products like Facebook if you don’t want to be a guinea pig
  • Regulation would slow down innovation and give Facebook a bad rap since people detest the NSA
  • The backlash against Facebook’s study will reduce transparency about this kind of research, leading companies to keep conducting it but not share their findings.

The biggest support comes from the utilitarian perspective held by supporters including Bickert: these tests make Facebook, as well as other products and services, better for consumers. If 690,000 people were part of an experiment and some were purposefully depressed, that’s acceptable if it teaches Facebook to show posts that make all of its 1.28 billion users happier in the long run. When asked in Aspen what she thought about the experiment and whether she foresaw regulation against it, Bickert said (emphasis added, video above):

You’ve pointed out a couple interesting issues and one is the tension between legislation and innovation.  It is — in the specific incident that you’re referring to, although I’m not really the best expert and probably our public statements are the best source for information there, I believe that was a week’s worth of research back in 2012. And most of the research that is done on Facebook — if you walk around campus and you listen to the engineers talking, it’s all about ‘how do we make this product better’, ‘how do we better suit the needs of the population using this product’, and ‘how do we show them more of what they want to see and less of what they don’t want to see’.

And that’s innovation. That’s the reason that when you look at Facebook or YouTube you’re always seeing new features. And that’s the reason if you have that annoying friend from high school that always posts pictures of their toddler every single day, that’s the reason that you don’t see all those photos in your news feed.

So it’s concerning when we see legislation that could possibly stifle that sort of creativity and that innovation. At the same time, if we want to make sure we don’t see that legislation, it’s incumbent upon us to make sure we’re transparent about what we’re doing and that people understand exactly why we’re doing what we do.

The idea that this is all “innovation” is sure to rub some people the wrong way. There’s already widespread distrust of Facebook’s power.

There’s little likelihood that Facebook and others will stop this kind of A/B testing. Still, there’s hope that these companies will follow Bickert’s suggestion and become more transparent. That could include having an independent ethics board review riskier or more controversial tests, or giving users some way to find out if they’ve quietly been placed into an experiment.

Facebook and other companies could also give outside researchers some open access to privacy-protected, anonymized data. There are insights about humanity locked in the data of these big tech companies that may never be studied if they lack profit potential. Luckily, a source tells me there is a small contingent of engineers and other employees inside Facebook who are advocating for more basic science research — not just experiments to make the service more addictive.