Today, Facebook's gained an ally in the fight for data mining and social experiments. OkCupid, the (mostly) free online dating service, offered a succinct corroboration: "We Experiment On Human Beings!"
While that may be true, that's not what has users frustrated with the site. Inputting the data, uploading pictures, editing a profile, and answering enough questions to choke a 1950s calculator take a lot of time and effort.
And on a site plagued with some pretty abysmal partner choices, moving percentages around just because you can doesn't sit well with many.
Women have complained and created communities about some of the less-than-great options. Those interactions included a backlash of MRA memes, slang, and 'beware' warnings in an online world where people have no problem stalking and sending creepy messages. Mention a fedora and you'll find multiple people asking which OkCupid-er you mean.
Rudder's admission of capriciously changing data has a pretty dangerous edge, since all you know about the person is what they say online. While that would still be true if the numbers hadn't changed, at least a user could walk into the date with some idea of who they're talking to.
And the boasting, the cavalier attitude, doesn't sit well with many people. How can you recommend the site to someone else, even if you found your partner on OkCupid, when you can't trust the system to give you a better clue? It's essentially responding to a Craigslist personal, since the profiles seemingly aren't relevant.
Rudder defended the company's decisions and policies to Buzzfeed.
"I understand a lot of the issues and why the anger is there. It’s confusing. But people also need to understand that every website, every part of modern web development — nobody launches a redesign without testing on different users." Ignoring the paternalistic edge to the comments, he openly admitted to the company’s decisions.
"It’s just not unusual at all and I can’t remember a time we launched a significant feature and didn’t test it on 10, 20, or 30% of users."
There's something like playing god in this, too.
Rudder openly admits to using the power of suggestion. "When we tell people they are a good match, they act as if they are." But isn't the idea to find someone who is a good match and is compatible? According to the Dataclysm author, the answer is yes. "As you can see, the ideal situation is the lower right: to both be told you’re a good match, and at the same time actually be one."
However, those four messages can also be really short "how are you" or "wanna go get a beer" without a lot of actual interaction. "And if you have to choose only one or the other, the mere myth of compatibility works just as well as the truth."
Anyone who's ever been in a long-term relationship knows that's a lie.
Or anyone who has used the site and actually read some of the barebones profiles, featuring little beyond pics of killer abs meant to distract. Ask women about the disappointment of finding out those high numbers mean nothing when there's no basis for conversation. Or about the messages asking for naked body shots and/or sex without even a hello first. Or the catfishers.
The list could go on, really.
He implied to Buzzfeed that the manipulation is beneficial because "if the algorithm changes, yeah, they go on different dates, discover different people, maybe even marry somebody different." So while proclaiming "that's not me playing god," he seems to be doing just that. "Any decision the site makes has those implications because people are really using these services in their lives."
Online users aren't stupid. They know about data mining, stats, and selling ad space, but it's different when a site shows no remorse. Pro-experiment comments on the internet have included "but it's free." Well, that's not entirely true. Pay some cash and you're ahead of the line, getting premium services.
So are those paying for the service also being duped? That's beyond ethical debates about filtering out "unattractive" people and into financial repercussions. Someone's paying for a service not being provided.
And in the meantime, Rudder wanted to make everything crystal clear about why he's defending Facebook.
"They’re going to get blowback simply because people just hate them." Rudder defended Facebook's actions by insinuating the consumer just doesn't understand the value in experimentation. “Facebook has that relationship with the internet commentariat."
"They’re public enemy number-one when it comes to anything to do with data.” Yet right after saying that publishing in an academic journal makes the emotional manipulation okay, the OkCupid founder does a 180-degree turn. “In some ways they deserve it, though, given the way they advertise, just jamming information into the feed.”
Admitting to Newsweek that he's not "a fan of Facebook or anything," he somehow compared Facebook's plight to "how Fox News treats Obama." The former Creative Director of TheSpark.com added more about the unfair criticisms of Mark Zuckerberg's invention. "And I feel like if you put the words Facebook, data, users, and privacy in a sentence, you’re dead. It doesn’t even matter what it says."
He seems to think OkCupid's safe from the same level of criticism because "OkCupid is obviously much lower profile." Plus, as he helpfully points out, the manipulation's totally cool since they're upfront from the beginning: "Our terms and conditions permit this stuff explicitly."
He gets a little lost, though, in explaining what the company did after these supposedly consensual experiments. "We notified the users who were most affected by email after the fact."
There's a difference in finding a better algorithm and lying to customers and users.
Forbes reports the company was purchased for $50 million three years ago. Interestingly, that's the same time Rudder stopped openly posting about these social experiments.
If OkCupid's choices were a true academic study, the review board would yank all results so fast the beaker would be spinning for days. That whole pesky ethics thing again. And consumer questions about what they're really paying for.