The problem with OKCupid is the problem with the social web

Hi, everybody! Tim Carmody here, guest-hosting for Jason this week.

On Monday, I tried to list some reasons why OKCupid’s self-acknowledged experiments on its users didn’t seem to be stirring up the same outrage that Facebook’s had. Here at the end of the week, I think I was largely right: fewer people are upset, the anger is more tempered, and that has a lot to do with the reasons I gave. But one reaction I didn’t expect is that some people took that post to mean that I wasn’t upset by what OKCupid did, or that people shouldn’t be as upset by it.

What OKCupid did has actually made me madder and madder as the week’s gone on, but for reasons that are different from other people’s. I think this is pretty important, so I’m going to try to explain why.

Let’s start with the Facebook “social contagion” study. Most Facebook critics focused on the people who were the subjects of the study, for good reasons. Did these users give consent? Can terms of service count as consent for an academic study? Should they have been informed of the study afterwards? Is Facebook responsible for any harm these users might have suffered? Is an increase or decrease in engagement really a sign that users’ emotions were affected? How else has Facebook attempted to influence its users, or might try in the future? These are all good questions.

But what if you flip it around? What if you weren’t one of the subjects whose moods Facebook was trying to study, but one of their friends or family? What if you were one of the people whose posts were filtered because your keywords were too happy, too angry, or too sad?

I think there’s no way to know whether the Facebook study harmed people who weren’t being studied. And even though the TOS basically says that users give Facebook permission to do whatever they want not only with the users’ data, but all of their friends’ too, you can’t call that consent with a straight face. (This is just another reason that software terms of service are a rotten legal and ethical basis for research. They just weren’t built for that purpose, or to solve any of those problems.)

So Facebook didn’t just mess around with some of its users’ feeds, hoping to see if it might mess around with their feelings. It used some of its users’ posts in order to do it. Arguably, it made them complicit.

To be clear, filtering posts, giving preference to some and not others, is how Facebook’s newsfeed algorithm always works. Facebook users have been complaining about this for a long time, especially brands and news organizations and other companies who’ve built up big subscriber counts only to find that hardly anybody sees their posts unless they pay off Facebook’s ad department. And Facebook makes no guarantees, anywhere, that they’re going to deliver every message to every user who’s subscribed to it. Readers miss posts all the time, usually because they’re just not looking at the screen or reading everything they could see. Facebook isn’t certified mail. It’s not even email. All this is known.

However.

We all buy in to Facebook (and Twitter, and OKCupid, and every other social media network), giving them a huge amount of personal data, free content, and discretion on how they show it to us, with the understanding that all of this will largely be driven by choices that we make. We build our own profiles, we select our favorite pictures, we make our own friends, we friend whatever brands we like, we pick the users we want to block or mute or select for special attention, and we write our own stories.

Even the filtering algorithms, we’re both told and led to assume, are the product of our choices. Either we make these choices explicitly (mute this user, don’t show me this again, more results like these) or implicitly (we liked the last five baby pictures, so Facebook shows us more baby pictures; we looked at sites X, Y, and Z, so we see Amazon ads for people who looked at X, Y, and Z). It’s not arbitrary; it’s personalized. And it’s personalized for our benefit, to reflect the choices that we and the people we trust have made.
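
To make that distinction concrete, here’s a toy sketch in Python of what a purely choice-driven feed ranker might look like. Everything in it (the names, the weights, the structures) is invented for illustration, not Facebook’s actual code; the point is that every signal in the score traces back to something the user did.

    # A toy feed ranker driven entirely by the user's own choices.
    # All names, weights, and structures here are hypothetical
    # illustrations, not Facebook's (or anyone's) real algorithm.
    from dataclasses import dataclass, field

    @dataclass
    class UserPrefs:
        muted: set = field(default_factory=set)             # explicit: "mute this user"
        hidden_topics: set = field(default_factory=set)     # explicit: "don't show me this again"
        topic_affinity: dict = field(default_factory=dict)  # implicit: likes counted per topic

    @dataclass
    class Post:
        author: str
        topic: str

    def score(post: Post, prefs: UserPrefs) -> float:
        if post.author in prefs.muted or post.topic in prefs.hidden_topics:
            return 0.0  # explicit choices are absolute
        # Implicit choices: five liked baby pictures means baby
        # pictures rank higher from now on.
        return 1.0 + prefs.topic_affinity.get(post.topic, 0)

    def rank_feed(posts: list, prefs: UserPrefs) -> list:
        visible = [p for p in posts if score(p, prefs) > 0]
        return sorted(visible, key=lambda p: score(p, prefs), reverse=True)

What matters is what the sketch doesn’t contain: there’s no input that says “suppress happy posts for users in test group B.” An experiment like Facebook’s adds exactly that kind of term, one that traces back to the company’s choices instead of ours.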

This is what makes the user-created social web great. It’s the value it adds over traditional news media, traditional classified ads, traditional shopping, everything.

We keep copyright on everything we write and every image we post, giving these services a broad license to use it. And whenever the terms of service seem to be saying that these companies have the right to do things we would never want them to do, we’re told that these are just the legal terms that the companies need in order to offer the ordinary, everyday service that we’ve asked them to do for us.

This is why it really stings whenever somebody turns around and says, “well actually, the terms you’ve signed give us permission to do whatever we want. Not just the thing you were afraid of, but a huge range of things you never thought of.” You can’t on the one hand tell us to pay no attention when you change these terms on us, and on the other insist that this is what we’ve really wanted all along. I mean, fuck me over, but don’t tell me that I really wanted you to fuck me over all along.

Because ultimately, the reason you needed me to agree in the first place isn’t just because I’m using your software, but because you’re using my stuff. And the reason I’m letting you use my stuff, and spending all this time working on it, is so that you can show it to people.

I’m not just a user of your service, somebody who reads the things that you show me: I’m one of the reasons you have anything that you can show to anyone at all.

Now let’s go back to the OKCupid experiment. Facebook kept some of its users from seeing posts that their friends wrote. But at least it was a binary thing: either your post was shown, just as you wrote it, or it wasn’t. OKCupid actually changed the information it displayed to users.

You can pick nits and say OKC didn’t change it, but rather just selectively suppressed parts of it, deleting photos on some profiles and text on others. But if you’ve ever created a profile on any web site, you know that it’s presented as a whole ensemble, the equivalent of a home page. The photos, the background, the description, the questions you answer: taken together, that’s your representation of yourself to everyone else who may be interested. It’s the entire reason why you are there.

Now imagine you’re an OKCupid user, and you strike up a conversation with someone or someone strikes up a conversation with you. You assume that the other person has all of your information available to them if they’re willing to look at it. That’s the basis of every conversation you have on that site. Except they don’t. The profile that OKCupid has implicitly promised to show to everyone who looks at it has been changed. The other person either doesn’t know what you look like (and assumes you can’t be bothered to post a photo) or doesn’t know anything else about you (and assumes you can’t be bothered to write anything about yourself). Both of you have been deceived, so the site can see what happens.

This is why I question the conclusion that OKC users who were only shown profiles with pictures are shallow just because their conversations were almost as long as those of users shown full profiles. This is how I imagine those conversations going:

Rosencrantz: So what do you do?
Guildenstern: Um I work in marketing?
Rosencrantz: That’s great! Where did you go to school?
Guildenstern: I went to UVA
Guildenstern: Wait a minute are you some kind of bot?
Rosencrantz: What makes you say that?
Guildenstern: You keep asking me questions that are in my profile, did you even read it
Rosencrantz: I’m looking at it right now, why didn’t you answer any of the questions
Guildenstern: lol I guess you can’t read nice pic though goodbye

That’s a high-value interaction by the OKC researchers’ standards, by the way.

This is also why I don’t have much patience with the idea that “The worst thing could have happened [with the OkCupid testing] is people send a few more messages, and maybe you went on a date you didn’t like.” (Rey Junco told this to ReadWrite to explain why he thought Facebook’s study was worse than OKCupid’s, but you see versions of this all over.)

First, going on “a date you didn’t like” isn’t a frivolous thing. It definitely incurs more material costs than not seeing a Facebook status. And bad (or good) messages, or a bad (or good) date, can have a bigger emotional impact as well.

More importantly, though, don’t make this just a question about dates or feelings, about what somebody did or didn’t read and what its effect on them was. I don’t care if you think someone making a dating profile is a frivolous thing. Somebody made that. They thought the company hosting it could be trusted to present it honestly. They were wrong.

So this is the problem I see not just with Facebook and OKCupid’s experiments, but with most of the arguments about them. They’re all too quick to accept that users of these sites are readers who’ve agreed to let these sites show them things. They don’t recognize or respect that the users are also the ones who’ve made almost everything that those sites show. They only treat you as a customer, never a client.

And in this respect, OKCupid’s Christian Rudder and the brigade of “and this surprises you?” cynics are right: this is what everybody does. This is the way the internet works now. (Too much of it, anyway.) It doesn’t matter whether the site you use is performing interventions on you or not, let alone publishing them: too many of these companies have accepted this framework.

Still, for as long as the web does work this way, we are never only these companies’ “products,” but their producers, too. And to the extent that these companies show they aren’t willing to live up to the basic agreement (we make these things and give them to you so you will show them to other people; that’s the engine that makes this whole world wide web business go), I’m not going to have anything to do with them any more. What’s more, I’ll get mad enough to find a place that will show the things I write to other people, and tell them they shouldn’t accept it either. Because, ultimately, you ought to be ashamed to treat people and the things they make this way.

It’s not A/B testing. It’s just being an asshole.

Update: OKCupid’s Christian Rudder (author of the “We Experiment On Human Beings” post) gave an interview to Alex Goldman and PJ Vogt for On the Media’s TLDR podcast.

Rudder says some of the negative response “is my own fault, because, y’know, the blog post is sensationally written, for sure.” But he doesn’t back off of that tone one bit. In fact, he doubles down.

Alex Goldman: Have you thought about bringing in, say, like an ethicist to, to vet your experiments?

Christian Rudder, co-founder of OkCupid: To wring his hands all day for a hundred thousand dollars a year?… This is the only way to find this stuff out. If you guys have an alternative to the scientific method, I’m all ears.

I think he maybe should have just written the blog post and left it alone.

Update: University of Maryland Professor of Law James Grimmelmann says that not only were OKCupid’s and Facebook’s studies unethical, but they were illegal.

Most of the resulting discussion has treated this as a story about ethics. Which it is, and the lapses of ethical judgment shown by Facebook and OkCupid are scandalous. But the ethics are only half of the story. What Facebook and OkCupid did wasn’t just unethical. It was illegal. A common assumption is that even if research laws ought to apply to private companies, they don’t. But that assumption is false. Facebook and OkCupid are bound by research laws, and those research laws quite clearly prohibit what they did.


Why don’t OKCupid’s experiments bother us like Facebook’s did?

Hi, everybody! Tim Carmody here, guest-hosting for Jason this week.

OKCupid’s Christian Rudder has responded to the outcry over Facebook’s experiments with user emotions by… publishing a list of experiments that the dating site has run on its users, along with their results.

And it’s not little stuff either! To test its matching algorithms, OKC has selectively hidden users’ profile images and profile text, and even told pairs of users they were a good match when the algo said they weren’t, and vice versa.

In short, Facebook may have hidden stuff from you, but OKCupid might have actually lied to you.
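
For a sense of how little machinery that kind of lying takes, here’s a hypothetical sketch in Python of the three interventions Rudder describes: hiding photos, hiding profile text, and flipping match scores. Every name and detail below is invented; OKCupid hasn’t published its code.

    import hashlib

    # Hypothetical reconstruction of the interventions Rudder describes.
    # All function names, field names, and bucket counts are invented,
    # not OKCupid's actual code.

    def bucket(key: str, experiment: str, arms: int = 2) -> int:
        # Stable assignment: the same user lands in the same arm every time.
        digest = hashlib.md5(f"{experiment}:{key}".encode()).hexdigest()
        return int(digest, 16) % arms

    def profile_for_viewer(profile: dict, viewer_id: str) -> dict:
        shown = dict(profile)
        if bucket(viewer_id, "hide-photos") == 1:
            shown["photos"] = []     # arm 1 never sees your pictures
        if bucket(viewer_id, "hide-text") == 1:
            shown["essays"] = ""     # arm 1 never sees what you wrote
        return shown

    def displayed_match(true_score: int, pair_id: str) -> int:
        if bucket(pair_id, "flip-match") == 1:
            return 100 - true_score  # tell a bad match they're great, and vice versa
        return true_score

A few dozen lines like these, sitting between the profile you wrote and the person reading it, are enough to make the two sides of a conversation see entirely different things.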

But… nobody’s really upset about this. Or if they are, they’re mostly just upset (or dryly observing, it’s hard to tell) that other people aren’t upset.

Why? I have some theories:

  1. It’s early yet. It took the Facebook story some time to steep before it really picked up steam.
  2. OKC users are less likely to be troubled by this sort of thing than Facebook users are. And people get more upset when they feel like they personally might have been messed with. Hilary Parker pointed out that non-online daters are less likely to get upset on online daters’ behalf: even if you don’t actively look down on OKC users (and many do), you might be more likely to think they got what they deserved. OKCupid has a history of disclosing these kinds of numbers, and there’s a laissez-faire attitude towards users gaming accounts for their own purposes.
  3. We trust Facebook in a way we don’t trust OKC. Facebook is the safe baby internet, with our real friends and family sending us real messages. OKC is more internet than the internet, with creeps and jerks and catfishers with phony avatars. So Facebook messing with us feels like a bigger betrayal.
  4. OKC’s matching algorithm may be at least as opaque as Facebook’s news feed, but it’s clearer to users that site matches and views are generated using an algorithm. Reportedly, 62 percent of Facebook users weren’t aware that Facebook’s news feed was filtered by an algorithm at all. (That study has a small sample size, but still, we can infer that lots of Facebook users have no idea.)
  5. The results of OKC’s experiments are less troubling. Facebook’s study showed that our posting behavior (and maybe our feelings) was pretty susceptible to manipulation without a whole lot of effort. OKC’s results seemed more complimentary. Sure, lots of people on dating sites are shallow, and sometimes you may have ended up in longer conversations than you might like with incompatible people, but good matches seem to find a way to connect no matter what OKC tells us! So… the algorithm works and I guess we can trust what they tell us? My head hurts. (Jess Zimmerman adds that part of the Facebook intervention was deliberately designed to cause harm, by making people unhappy, at least as mediated through their posts. The difference here depends on whether you think trying to match someone up with an incompatible date might be causing them harm.)
  6. The tone of the OKC post is just so darned charming. Rudder is casual, self-deprecating. It’s a blog post! Meanwhile, Facebook’s “emotional contagion” scholarly paper was chillingly matter-of-fact. In short, the scientism of the thing just creeped us the fuck out.
  7. This is related to the tone issue, but OKC seems to be fairly straightforward about why it performed the experiment: they didn’t understand whether or how their matching algorithm was working, and they were trying to figure that out to make it better. Facebook seemed to be testing users’ emotional expressions partly to solve a scholarly dispute and partly just to see if they could. And most of the practical justifications folks came up with for the Facebook study were pretty sinister: tricking folks into posting more often, into clicking on ads, into buying stuff. (Really, both experiments are probably a mix of product testing and shooting frogs for kicks, but the perception seems to be different.)
  8. The Facebook study had an added wrinkle in that academics were involved in designing the study and writing it up. This raised all sorts of factual and ethical issues about university institutional review boards and the responsibility of the journal’s editors and publishers that don’t seem to be relevant here. I mean, maybe SOMEbody should be verifying that experiments done on human subjects are ethical, whether they happen in a university, medical, or government context or not, but with OKC it’s not a case of someone being asleep at the switch. Here, there is no switch.
  9. Maybe we’re all just worn out. Between Facebook, this, Uber ratings, and god knows what, even if you’re bothered by this kind of experimentation, it’s more difficult to stay angry at any one company. So some people are jaded, some people would rather call attention to broader issues and themes of power, and some people are just tired. There’s only so many times you can say “see? THIS! THIS is what I’ve been telling you about!” or “I can’t believe you’re surprised by this” before you’re just like, ¯\_(ツ)_/¯.

I don’t agree with all of these explanations, and all of them feel a little thin. But maybe for most of us, those little scraps of difference are enough.

Update: Here’s a tenth reason that I thought of and then forgot until people brought up variations of it on Twitter: Facebook feels “mandatory” in a way that OKCupid doesn’t. It’s a bigger company with a bigger reach that plays a bigger part in more people’s lives. As Sam Biddle wrote on Twitter, “Facebook is almost a utility at this point. It’s like ConEd fucking with us.”