The Morality Of A/B Testing


We don’t use the “real” Facebook. Or Twitter. Or Google, Yahoo, or LinkedIn. We are almost all part of experiments they quietly run to see if different versions with little changes make us use more, visit more, click more, or buy more. By signing up for these services, we technically give consent to be treated like guinea pigs.

But this weekend, Facebook stirred up controversy because one of its data science researchers published the results of an experiment on 689,003 users that tested whether showing them more positive or negative sentiment posts in the News Feed would affect their happiness levels, as deduced from what they posted. The experiment’s measurable impact on emotions was tiny, but it raises the question of where to draw the line on what’s ethical in A/B testing.

First, let’s look at the facts and big issues:

The Experiment Had Almost No Effect

Check out the study itself or read Sebastian Deterding’s analysis for a great breakdown of the facts and reactions.

Essentially, three researchers, including one of Facebook’s core data scientists, Adam Kramer, sought to prove whether emotions are contagious via online social networks. For a week, Facebook showed some people fewer positive or fewer negative posts in their News Feeds, and then measured how many positive or negative words those people included in their own posts. People shown fewer positive posts (a more depressing feed) used 0.1% fewer positive words in their posts — their status updates were a tiny bit less happy. People shown fewer negative posts (a happier feed) used 0.07% fewer negative words — their updates were a tiny bit less depressed.
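The study’s core measurement — the rate of positive and negative words in what people posted, compared across conditions — can be sketched in a few lines. This is a deliberately simplified illustration: the tiny word lists here are hypothetical stand-ins for the full sentiment lexicon the actual study used.

```python
# Simplified sketch of the study's measurement: the percentage of
# positive and negative words in a group's posts. The word sets below
# are hypothetical stand-ins for a real sentiment lexicon.
POSITIVE = {"happy", "great", "love", "fun"}
NEGATIVE = {"sad", "awful", "hate", "lonely"}

def word_rates(posts):
    """Return (% positive words, % negative words) across a list of posts."""
    words = [w.strip(".,!?").lower() for post in posts for w in post.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 100 * pos / len(words), 100 * neg / len(words)

# Hypothetical post samples from two experimental conditions:
control = ["Had a great day", "Feeling happy and love this"]
reduced_positive = ["Had a day", "Feeling sad and lonely today"]

print(word_rates(control))          # higher positive-word rate
print(word_rates(reduced_positive)) # higher negative-word rate
```

The study reported differences between conditions on exactly this kind of metric — fractions of a percentage point in word usage, not direct measures of mood.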

As the study’s charts show, the experiments only reduced the usage of positive or negative words by a tiny amount.

News coverage has trumpeted that the study was harmful, but it only made people ‘sad’ by a minuscule amount.

Plus, that effect could be attributed not to an actual emotional change in the participants, but to them simply mirroring the trends they saw on Facebook. Success theater may be self-perpetuating: seeing fewer negative posts might lead you to manicure your own sharing so your life seems perfect too. Notably, the study did not find that exposure to happy posts on Facebook makes you sad because your life isn’t as fun — but again, the findings measured what people posted, not necessarily how they felt.

Facebook Didn’t Get Consent Or Ethics Board Approval

Facebook only did an internal review to decide if the study was ethical. A source tells Forbes’ Kashmir Hill it was not submitted for pre-approval by an Institutional Review Board, an independent ethics committee that requires scientific experiments to meet strict safety and consent standards to protect the welfare of their subjects. I was IRB certified for an experiment I developed in college, and can attest that the study would likely fail to meet many of the prerequisites.

Instead, Facebook holds that it manipulates the News Feed all the time to test what types of stories and designs generate the most engagement. It wants to learn how to get you to post more happy content and spend more time on Facebook. It saw this as just another A/B test, the kind that most major tech companies, startups, news sites, and others run all the time. Facebook technically has consent from all users, as its Data Use Policy, which people automatically agree to when they sign up, says “we may use the information we receive about you…for…data analysis, testing, research and service improvement.”
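A/B tests like the ones described here are commonly implemented by deterministically bucketing each user into a variant, for example by hashing their user ID. This is a generic sketch of that common technique, not a description of Facebook’s actual system; the experiment name and variant labels are hypothetical.

```python
import hashlib

def assign_variant(user_id, experiment, variants=("control", "treatment")):
    """Deterministically assign a user to a variant by hashing their ID.
    Salting with the experiment name keeps different tests independent."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# The same user always lands in the same variant for a given experiment,
# and roughly half of users fall into each bucket:
assign_variant("user_42", "feed_sentiment_test")
```

Because assignment is a pure function of the user ID, no one has to be asked — which is exactly why these tests are invisible to the people in them.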

Many consider that a very weak form of consent: participants didn’t know they were in the experiment, its scope or intent, its potential risks, or whether their data would be kept confidential, and they weren’t provided any way to opt out. Some believe Facebook should ask users for consent and offer an opt-out for these experiments.

Everyone Is A/B Testing

So the negative material impact of this specific study was low and likely overblown, but the controversy vaults the ethics question into a necessary public discussion.

Sure, there are lots of A/B tests, but most push for business-oriented results like increasing usage, clicks, or purchases. This study purposefully sought to manipulate people’s emotions, positively and negatively, for the sake of proving a scientific theory about social contagion. Affecting emotion for emotion’s sake is why I believe the study has triggered such charged reactions. Some people don’t think the experimenter’s intention matters, since there’s no way to know what a big for-profit company really wants. But I think intent is an important factor in distinguishing what may need oversight.


Either way, there is some material danger in experiments that depress people. As Deterding notes, the National Institute of Mental Health says 9.5% of Americans have mood disorders, which can often lead to depression. Some people at risk of depression were almost surely among the participants shown a more depressing feed, which could be considered dangerous. Facebook will endure a whole new level of backlash if any of those participants are found to have committed suicide or suffered other depression-related outcomes after the study.

That said, every product, brand, politician, charity, and social movement is trying to manipulate your emotions on some level, and they’re running A/B tests to find out how. They all want you to use more, spend more, vote for them, donate money, or sign a petition by making you happy, insecure, optimistic, sad, or angry. There are many tools for discovering how best to manipulate these emotions, including analytics, focus groups, and A/B tests — and often, people aren’t given a way to opt out.


Facebook may have acted unethically here. While its constant testing to increase engagement falls into a grayer area, this experiment tried to directly sway emotions.

A brand manipulating its own content to change someone’s emotions to complete a business objective is simpler and expected. A portal manipulating the presence of content shared with us by friends to depress us for the sake of science is different.

You might guess that McDonald’s, with its slogan “I’m Lovin’ It,” is trying to make you feel less happy without it, and that a politician is trying to make you feel more optimistic if you vote for them. But many people don’t even understand the basic concept of Facebook using a relevancy-sorting algorithm to filter the News Feed to be as engaging as possible. They probably wouldn’t suspect Facebook might show them fewer happy posts from friends, making them sadder, in order to test a theory of social science.

In the end, an experiment with these intentions and risks may have deserved its own opt-in, which Facebook should consider offering in the future. No matter how you personally perceive the ethics, Facebook made a big mistake with how it framed the study and now the public is seriously angry.

But while Facebook has become the lightning rod, the issue of ethics in A/B testing is much bigger. If you believe toying with emotions is unethical, most major tech companies, as well as those in other industries, are guilty too.

Regulation, Or At Least Safeguards

So what’s to be done? The companies that run these tests range from large to small, and the risks of each test fall on a highly subjective spectrum from innocuous to gravely dangerous. Banning any testing that “manipulates emotions” would cause endless arguments about what qualifies, be nearly impossible to enforce, and could often slow down innovation or degrade the potential quality of the products we use.

But there are still certain companies with outsized power to impact people’s emotions in ways that are tough for the average person to understand.


That’s why a good start would be for companies running significant tests that manipulate emotions to offer at least an opt-out. Not for every test, but for ones with some real risk, like showing users a more depressing feed. Just because everyone else isn’t doing it doesn’t mean big tech companies can’t be pioneers of better ethics. Volunteering to provide a choice as to whether people want to be guinea pigs could bolster confidence among users. Let people opt out of the experiments via a settings page and give them the standard product that evolves in response to those experiments. Not everyone has to be put on the front lines to find out what works best. Consent is worth adding a little complexity to the product.
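An opt-out like the one proposed above could be honored at assignment time: users who have opted out simply get the standard product, while everyone else is bucketed as usual. This is a hypothetical sketch under those assumptions; the function, experiment name, and opt-out set are all illustrative.

```python
import hashlib

def assign_with_opt_out(user_id, experiment, opted_out_users,
                        variants=("control", "treatment")):
    """Honor an opt-out setting during experiment assignment.
    Opted-out users always receive the standard product ("control");
    everyone else is bucketed deterministically by hashing their ID."""
    if user_id in opted_out_users:
        return "control"
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

opted_out = {"user_7"}  # e.g. loaded from a settings page
assign_with_opt_out("user_7", "feed_sentiment_test", opted_out)  # always "control"
```

The design cost is small: one membership check per assignment, in exchange for users who never wanted to be experimented on getting the standard experience.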

As for providing users some independent protection against harmful emotional manipulation on a grand scale, the Federal Trade Commission might consider auditing these practices. The FTC already has settlements with Facebook, Google, Twitter, Snapchat, and other companies that let it audit their privacy practices for ten to twenty years. The FTC could layer on ethical oversight of experimentation and product changes with the same goal of protecting consumer well-being. Unfortunately, those same FTC settlements, which bar companies from taking away existing privacy controls, also incentivize them not to offer any new ones.

At the very least, the tech companies should educate their data scientists and others designing A/B tests about the ethical research methods associated with having experiments approved by the IRB. Even if the tech companies don’t actually submit individual tests for review, just being aware of best practices could go a long way to keeping tests safe and compassionate.

The world has quickly become data-driven. It’s time ethics caught up.
