This weekend, the Internet discovered a study published earlier this month in an academic journal that recounted how a Facebook data scientist, along with two university researchers, turned 689,003 users’ News Feeds positive or negative to see if it would elate or depress them. The purpose was to find out if emotions are “contagious” on social networks. (They are, apparently.) The justification for subjecting unsuspecting users to the psychological mind game was that everyone who signs up for Facebook agrees to the site’s “Data Use Policy,” which has a little line about how your information could be used for “research.” Some people are pretty blasé about the study, their reaction along the lines of, “Dude. Facebook and advertisers manipulate us all the time. NBD.” Others, especially in the academic community, are horrified that Facebook thinks the little clause in the 9,045-word ToS counts as “informed consent” from a user to take part in a psychological experiment, and that an ethics board reportedly gave that interpretation a thumbs up. The larger debate is about what companies can do to their users without asking them first or telling them about it after.
I asked Facebook yesterday what the review process was for conducting the study in January 2012, and its response reads as a bit tone-deaf. The focus is on whether the data use was appropriate rather than on the ethics of emotionally manipulating users to have a crappy day for science. That may be because Facebook was responding to a privacy reporter:
“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”
It’s particularly fascinating to me that Facebook puts this in the “research to improve our services” category, as opposed to the “research for academic purposes” category. That makes me wonder what other kinds of psychological manipulation users are subjected to that they never learn about because the results aren’t published in an academic journal. This gives more fodder to academic Ryan Calo, who has argued that companies need to get their psychological studies of users vetted in some way that echoes what happens in the academic context. When universities conduct studies on people, they have to run them by an ethics board first to get approval — ethics boards that were mandated by the government in the 1970s because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience, for example. Interestingly, the Facebook “emotional contagion” project had funding from the government — the Army Research Office — according to a Cornell profile of one of the academic researchers involved. And the professor who edited the article said the study was okayed by an Institutional Review Board. That approval has led most academic commentators’ jaws to hit the floor.
Before this story broke, Betsy Haibel wrote a relevant post that linguistically elevated the stakes by describing companies’ assumption of consent from users as corporate rape culture. “The tech industry does not believe that the enthusiastic consent of its users is necessary,” wrote Haibel. “The tech industry doesn’t even believe in requiring affirmative consent.”
When I signed up for 23andMe — a genetic testing service — it asked if I was willing to be part of “23andWe,” which would allow my genetic material to be part of research studies. I had to affirmatively check a box to say I was okay with that. As I suggested when I wrote about this yesterday, I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied — and while that expectation may be warranted, given that “research” is one of the 9,045 words in the data use policy — they don’t expect that Facebook will actively manipulate their environment in order to see how they react. That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish, and it’s why people are flipping out about this.