OKCupid on Monday published a viral blog post admitting to experimenting on human beings, comparing the tests it runs to improve its dating algorithms to the Facebook study that dominated the news cycle earlier this month. Today Reuters reported that, like Facebook’s experiment, these tests could violate Federal Trade Commission regulations.
“But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site,” OKCupid Founder Christian Rudder wrote in the tongue-in-cheek post. “That’s how websites work.”
This actually seems a lot worse than the Facebook experiment: http://t.co/e7Z9KN7cvw
— Ezra Klein (@ezraklein) July 28, 2014
OKCupid has been conducting experiments on you for funsies http://t.co/N8DEvIysRg
— Jezebel (@Jezebel) July 28, 2014
As Josh Constine noted, when we use Facebook, Google, Twitter, Yahoo or LinkedIn, we agree to be part of experiments that alter our experiences in an attempt to make us visit a site longer or click more things.
That’s what OKCupid was doing.
In Monday’s post, Rudder described three “of the more interesting” experiments the company has run. In the first, OKCupid removed all the photos from its website as it was rolling out a blind dating app, to see how their absence affected usage. In the second, it tested how much a user’s picture affects viewers’ perceptions of that user’s personality. In the third, it told users they had a 90 percent compatibility rate with people with whom they actually shared a 30 percent rate.
By removing photos from its website, OKCupid learned information it could apply to its blind dating app. In its second test, it found that users treated personality and looks as the same thing, so instead of rating people on both personality and looks, users now give a single overall rating. The third test seems the most controversial, but essentially it confirmed that OKCupid’s dating algorithm actually works: matches don’t succeed just because OKCupid suggests them.
All of these tests or experiments were done to improve users’ experience on the OKCupid website. When people sign up for OKCupid, they’re signing up for a service that is going to connect them with strangers based on data they enter.
Manipulating that data and finding best practices is just OKCupid doing its job. At the basic level, all social networks are altering what you see in your feed to make the time you spend on them better.
But Facebook’s study went beyond that. Facebook manipulated content in users’ feeds to see if the emotional tone of their News Feeds impacted the tone of their own posts on the social network, deliberately making people sad. After conducting the test on almost 700,000 users, it published those results in an academic journal.
Unlike OKCupid, Facebook didn’t alter users’ experiences simply to improve an algorithm for a business purpose. In this study, the company essentially conducted a psychological experiment that many consider unethical.
People sign up for Facebook to interact with their friends and read the content they share — both good and bad. As many before me have noted, Facebook shouldn’t mess with that for the sake of a study, especially one conducted without its users’ consent.
Some of OKCupid’s experiments were obvious to users; it’s hard to miss when no one on the site has a photo, for example. Rudder also told Reuters that users were notified of the tests after the fact, and that they consent to such testing under the “diagnostic research” provision in the site’s Terms of Service agreement.
OKCupid’s comparison of its own tests to the Facebook study blows a common practice employed by many web services out of proportion. The world of big data is new, and this is certainly not the last time we’re going to see concerns flare up about tech companies employing trial and error to improve their services. But OKCupid is just being transparent about something most companies are doing. Facebook took it a step too far, and OKCupid shouldn’t defend Facebook’s study by disclosing its own experiments.