The problem with OKCupid is the problem with the social web TIM CARMODY · AUG 01 2014
Hi, everybody! Tim Carmody here, guest-hosting for Jason this week.
On Monday, I tried to list some reasons why OKCupid's self-acknowledged experiments on its users didn't seem to be stirring up the same outrage that Facebook's had. Here at the end of the week, I think I was largely right: fewer people are upset, the anger is more tempered, and that has a lot to do with the reasons I gave. But one reaction I didn't expect is that some people took it as saying that I wasn't upset by what OKCupid did, or that people shouldn't be as upset by it.
What OKCupid did has actually made me madder and madder as the week's gone on, but for reasons that are different from other people's. I think this is pretty important, so I'm going to try to explain why.
Let's start with the Facebook "social contagion" study. Most Facebook critics focused on the people who were the subjects of the study, for good reasons. Did these users give consent? Can terms of service count as consent for an academic study? Should they have been informed of the study afterwards? Is Facebook responsible for any harm these users might have suffered? Is an increase or decrease in engagement really a sign that users' emotions were affected? How else has Facebook attempted to influence its users, or might try in the future? These are all good questions.
But what if you flip it around? What if you weren't one of the subjects whose moods Facebook was trying to study, but one of their friends or family? What if you were one of the people whose posts were filtered because your keywords were too happy, too angry, or too sad?
It's a small thing, but I haven't seen anybody discuss the Facebook emotion study from the perspective of authors of the filtered posts. — Tim Carmody (@tcarmody) June 29, 2014
You had good news; maybe your child was born. You had bad news; maybe a call for help. Your friends never saw it, bc of an involuntary study — Tim Carmody (@tcarmody) June 29, 2014
The emotions study shows definitively that this opacity of what posts do or don't get delivered by Facebook is universal and without limit. — Tim Carmody (@tcarmody) June 29, 2014
I think there's no way to know whether the Facebook study may have harmed people who weren't being studied. And even though the TOS basically says that users give Facebook permission to do whatever they want not only with the users' data, but with all of their friends' too, you can't call that consent with a straight face. (This is just another reason that software terms of service are a rotten legal and ethical basis for research. They just weren't built for that purpose, or to solve any of those problems.)
So Facebook didn't just mess around with some of its users' feeds, hoping to see if it might mess around with their feelings. It used some of its users' posts in order to do it. Arguably, it made them complicit.
To be clear, filtering posts, giving preference to some and not others, is how Facebook's newsfeed algorithm always works. Facebook users have been complaining about this for a long time, especially brands and news organizations and other companies who've built up their subscriber counts and complain that hardly anybody ever sees their posts unless they pay off Facebook's ad department. And Facebook makes no guarantees, anywhere, that they're going to deliver every message to every user who's subscribed to it. Readers miss posts all the time, usually because they're just not looking at the screen or reading everything they could see. Facebook isn't certified mail. It's not even email. All this is known.
We all buy in to Facebook (and Twitter, and OKCupid, and every other social media network), giving them a huge amount of personal data, free content, and discretion on how they show it to us, with the understanding that all of this will largely be driven by choices that we make. We build our own profiles, we select our favorite pictures, we make our own friends, we friend whatever brands we like, we pick the users we want to block or mute or select for special attention, and we write our own stories.
Even the filtering algorithms, we're both told and led to assume, are the product of our choices. Either we make these choices explicitly (mute this user, don't show me this again, more results like these) or implicitly (we liked the last five baby pictures, so Facebook shows us more baby pictures; we looked at sites X, Y, and Z, so we see Amazon ads for people who looked at X, Y, and Z). It's not arbitrary; it's personalized. And it's personalized for our benefit, to reflect the choices that we and the people we trust have made.
This is what makes the user-created social web great. It's the value it adds over traditional news media, traditional classified ads, traditional shopping, everything.
We keep copyright on everything we write and every image we post, giving these services a broad license to use it. And whenever the terms of service seem to be saying that these companies have the right to do things we would never want them to do, we're told that these are just the legal terms that the companies need in order to offer the ordinary, everyday service that we've asked them to do for us.
This is why it really stings whenever somebody turns around and says, "well actually, the terms you've signed give us permission to do whatever we want. Not just the thing you were afraid of, but a huge range of things you never thought of." You can't with one hand tell us to pay no attention when you change these things on us, and with the other insist that this is what we've really wanted to do all along. I mean, fuck me over, but don't tell me that I really wanted you to fuck me over all along.
Because ultimately, the reason you needed me to agree in the first place isn't just because I'm using your software, but because you're using my stuff. And the reason I'm letting you use my stuff, and spending all this time working on it, is so that you can show it to people.
I'm not just a user of your service, somebody who reads the things that you show me: I'm one of the reasons you have anything that you can show to anyone at all.
Now let's go back to the OKCupid experiment. Facebook didn't show some of its users the posts that their friends wrote. But at least it was a binary thing: either your post was shown, just as you wrote it, or it wasn't. OKCupid actually changed the information it displayed to users.
You can pick nits and say OKC didn't change it, but rather, just selectively repressed parts of it, deleting photos on some profiles and text on others. But if you've ever created a profile on any web site, you know that it's presented as being a whole ensemble, the equivalent of a home page. The photos, the background, the description, the questions you answer: taken altogether, that's your representation of yourself to everyone else who may be interested. It's the entire reason why you are there.
Now imagine you're an OKCupid user, and you strike up a conversation with someone or someone strikes up a conversation with you. You assume that the other person has all of your information available to them if they're willing to look at it. That's the basis of every conversation you have on that site. Except they don't. The profile that OKCupid has implicitly promised they'll show to everyone who looks at it has been changed. The other person either doesn't know what you look like (and assumes you can't be bothered to post a photo) or doesn't know anything else about you (and assumes you can't be bothered to write anything about yourself). Both of you have been deceived, so the site can see what happens.
This is why I question the conclusion that OKC users are shallow just because conversations on picture-only profiles ran almost as long as conversations on full profiles. This is how I imagine those conversations going:
Rosencrantz: So what do you do?
Guildenstern: Um I work in marketing?
Rosencrantz: That's great! Where did you go to school?
Guildenstern: I went to UVA
Guildenstern: Wait a minute are you some kind of bot?
Rosencrantz: What makes you say that?
Guildenstern: You keep asking me questions that are in my profile, did you even read it
Rosencrantz: I'm looking at it right now, why didn't you answer any of the questions
Guildenstern: lol I guess you can't read nice pic though goodbye
That's a high-value interaction by the OKC researchers' standards, by the way.
This is also why I don't have much patience with the idea that "The worst thing [that] could have happened [with the OkCupid testing] is people send a few more messages, and maybe you went on a date you didn't like." (Rey Junco told this to ReadWrite to explain why he thought Facebook's study was worse than OKCupid's, but you see versions of this all over.)
First, going on "a date you didn't like" isn't a frivolous thing. It definitely incurs more material costs than not seeing a Facebook status. And bad (or good) messages or a bad or good date can definitely have a bigger emotional impact as well.
More importantly, though, don't make this just a question about dates or feelings, about what somebody did or didn't read and what its effect on them was. I don't care if you think someone making a dating profile is a frivolous thing. Somebody made that. They thought the company hosting it could be trusted to present it honestly. They were wrong.
So this is the problem I see not just with Facebook and OKCupid's experiments, but with most of the arguments about them. They're all too quick to accept that users of these sites are readers who've agreed to let these sites show them things. They don't recognize or respect that the users are also the ones who've made almost everything that those sites show. They only treat you as a customer, never a client.
And in this respect, OKCupid's Christian Rudder and the brigade of "and this surprises you?" cynics are right: this is what everybody does. This is the way the internet works now. (Too much of it, anyway.) It doesn't matter whether any given site is performing interventions on its users or not, let alone publishing them. Too many of them have accepted this framework.
Still, for as long as the web does work this way, we are never only these companies' "products," but their producers, too. And to the extent that these companies show they aren't willing to live up to the basic agreement that we make these things and give them to you so you will show them to other people -- the engine that makes this whole world wide web business go -- I'm not going to have anything to do with them any more. What's more, I'll get mad enough to find a place that will show the things I write to other people and tell them they shouldn't accept it either. Because, ultimately, you ought to be ashamed to treat people and the things they make this way.
It's not A/B testing. It's just being an asshole.
Rudder says some of the negative response "is my own fault, because, y'know, the blog post is sensationally written, for sure." But he doesn't back off of that tone one bit. In fact, he doubles down.
Alex Goldman: Have you thought about bringing in, say, like an ethicist to, to vet your experiments?
Christian Rudder, founder of OkCupid: To wring his hands all day for a hundred thousand dollars a year?... This is the only way to find this stuff out. If you guys have an alternative to the scientific method, I'm all ears.
I think he maybe should have just written the blog post and left it alone.
Update: University of Maryland Professor of Law James Grimmelmann says that not only were OKCupid's and Facebook's studies unethical, they were also illegal.
Most of the resulting discussion has treated this as a story about ethics. Which it is -- and the lapses of ethical judgment shown by Facebook and OkCupid are scandalous. But the ethics are only half of the story. What Facebook and OkCupid did wasn't just unethical. It was illegal. A common assumption is that even if research laws ought to apply to private companies, they don't. But that assumption is false. Facebook and OkCupid are bound by research laws, and those research laws quite clearly prohibit what they did.