Why don't OKCupid's experiments bother us like Facebook's did? TIM CARMODY · JUL 28 2014
Hi, everybody! Tim Carmody here, guest-hosting for Jason this week.
OK Cupid's Christian Rudder has responded to the outcry over Facebook's experiments with user emotions by... publishing a list of experiments that the dating site has run on its users, along with their results.
And it's not little stuff either! To test its matching algorithms, OKC has selectively hidden users' profile images, their profile text, and even told pairs of users they were a good match when the algo said they weren't, and vice versa.
In short, Facebook may have hidden stuff from you, but OK Cupid might have actually lied to you.
But... nobody's really upset about this. Or if they are, they're mostly just upset (or dryly observing, it's hard to tell) that other people aren't upset.
Why? I have some theories:
- It's early yet. It took the Facebook story some time to steep before it really picked up steam.
- OKC users are less likely to be troubled by this sort of thing than Facebook users are. And people get more upset when they feel like they personally might have been messed with. Hilary Parker pointed out that non-online daters are less likely to get upset on online daters' behalf: even if you don't actively look down on OKC users (and many do), you might be more likely to think they got what they deserved. OK Cupid has a history of disclosing these kinds of numbers, and there's a laissez-faire attitude towards users gaming accounts for their own purposes.
- We trust Facebook in a way we don't trust OKC. Facebook is the safe baby internet, with our real friends and family sending us real messages. OKC is more internet than the internet, with creeps and jerks and catfishers with phony avatars. So Facebook messing with us feels like a bigger betrayal.
- OKC's matching algorithm may be at least as opaque as Facebook's news feed, but it's clearer to users that site matches and views are generated using an algorithm. Reportedly, 62 percent of Facebook users weren't aware that Facebook's news feed was filtered by an algorithm at all. (That study has a small sample size, but still, we can infer that lots of Facebook users have no idea.)
- The results of OKC's experiments are less troubling. Facebook's study showed that our posting behavior (and maybe our feelings) were pretty susceptible to manipulation without a whole lot of effort. OKC's results seemed more complimentary. Sure, lots of people on dating sites are shallow, and sometimes you may have ended up in longer conversations than you might like with incompatible people, but good matches seem to find a way to connect no matter what OKC tells us! So... the algorithm works and I guess we can trust what they tell us? My head hurts. (Jess Zimmerman adds that part of the Facebook intervention was deliberately designed to cause harm, by making people unhappy, at least as mediated through their posts. The difference here depends on whether you think trying to match you up with someone incompatible might be causing you harm.)
- The tone of the OKC post is just so darned charming. Rudder is casual, self-deprecating. It's a blog post! Meanwhile, Facebook's "emotional contagion" scholarly paper was chillingly matter-of-fact. In short, the scientism of the thing just creeped us the fuck out.
- This is related to the tone issue, but OKC seems to be fairly straightforward about why it performed the experiment: they didn't understand whether or how their matching algorithm was working, and they were trying to figure that out to make it better. Facebook seemed to be testing users' emotional expressions partly to solve a scholarly dispute and partly just to see if they could. And most of the practical justifications folks came up with for the Facebook study were pretty sinister: trick folks into posting more often, into clicking on ads, into buying stuff. (Really, both experiments are probably a mix of product testing and shooting frogs for kicks, but the perception seems to be different.)
- The Facebook study had an added wrinkle in that academics were involved in designing the study and writing it up. This raised all sorts of factual and ethical issues about university institutional review boards and the responsibility of the journal's editors and publishers that don't seem to be relevant here. I mean, maybe SOMEbody should be verifying that experiments done on human subjects are ethical, whether it's in a university, medical, or government context or not, but it's not like someone may have been asleep at the switch. Here, there is no switch.
- Maybe we're all just worn out. Between Facebook, this, Uber ratings, and god knows what, even if you're bothered by this kind of experimentation, it's more difficult to stay angry at any one company. So some people are jaded, some people would rather call attention to broader issues and themes of power, and some people are just tired. There's only so many times you can say "see? THIS! THIS is what I've been telling you about!" or "I can't believe you're surprised by this" before you're just like, ¯\_(ツ)_/¯.
I don't agree with all of these explanations, and all of them feel a little thin. But maybe for most of us, those little scraps of difference are enough.
Update: Here's a tenth reason that I thought of and then forgot until people brought up variations of it on Twitter: Facebook feels "mandatory" in a way that OKCupid doesn't. It's a bigger company with a bigger reach that plays a bigger part in more people's lives. As Sam Biddle wrote on Twitter, "Facebook is almost a utility at this point. It's like ConEd fucking with us."