
Facebook Doesn't Understand The Fuss About Its Emotion Manipulation Study

Jun 29 2014, 1:26pm CDT, in News


This weekend, the Internet discovered a study published earlier this month in an academic journal that recounted how a Facebook data scientist, along with two university researchers, turned 689,003 users’ News Feeds positive or negative to see if it would elate or depress them. The purpose was to find out whether emotions are “contagious” on social networks. (They are, apparently.) The justification for subjecting unsuspecting users to the psychological mind game was that everyone who signs up for Facebook agrees to the site’s “Data Use Policy,” which has a little line about how your information could be used for “research.”

Some people are pretty blasé about the study, with reactions along the lines of, “Dude. Facebook and advertisers manipulate us all the time. NBD.” Others, especially in academia, are horrified that Facebook thinks the little clause in the 9,045-word policy counts as “informed consent” from a user to take part in a psychological experiment, and that an ethics board reportedly gave that interpretation a thumbs up. The larger debate is about what companies can do to their users without asking them first or telling them about it after.

I asked Facebook yesterday what the review process was for conducting the study in January 2012, and its response reads as a bit tone-deaf. The focus is on whether the data use was appropriate rather than on the ethics of emotionally manipulating users into having a crappy day for science. That may be because Facebook was responding to a privacy reporter:

“This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account,” says a Facebook spokesperson. “We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. We carefully consider what research we do and have a strong internal review process. There is no unnecessary collection of people’s data in connection with these research initiatives and all data is stored securely.”

It’s particularly fascinating to me that Facebook puts this in the “research to improve our services” category, as opposed to the “research for academic purposes” category. That makes me wonder what other kinds of psychological manipulation users are subjected to that they never learn about because the results aren’t published in an academic journal. This gives more fodder to academic Ryan Calo, who has argued that companies need to get their psychological studies of users vetted in some way that echoes what happens in the academic context. When universities conduct studies on people, they have to run them by an ethics board first to get approval — ethics boards that were mandated by the government in the 1970s because scientists were getting too creepy in their experiments, getting subjects to think they were shocking someone to death in order to study obedience, for example. Interestingly, the Facebook “emotional contagion” project had funding from the government — the Army Research Office — according to a Cornell profile of one of the academic researchers involved. And the professor who edited the article said the study was okayed by an Institutional Review Board. That approval has led most academic commentators’ jaws to hit the floor.

Before this story broke, Betsy Haibel wrote a relevant post that linguistically raised the stakes by describing companies’ assumption of consent from their users as corporate rape culture. “The tech industry does not believe that the enthusiastic consent of its users is necessary,” wrote Haibel. “The tech industry doesn’t even believe in requiring affirmative consent.”

When I signed up for 23andMe — a genetic testing service — it asked if I was willing to be part of “23andWe,” which would allow my genetic material to be part of research studies. I had to affirmatively check a box to say I was okay with that. As I suggested when I wrote about this yesterday, I think Facebook should have something similar. While many users may already expect and be willing to have their behavior studied — and while that may be warranted with “research” being one of the 9,045 words in the data use policy — they don’t expect that Facebook will actively manipulate their environment in order to see how they react. That’s a new level of experimentation, turning Facebook from a fishbowl into a petri dish, and it’s why people are flipping out about this.

 


Forbes
Forbes is among the most trusted resources for the world's business and investment leaders, providing them the uncompromising commentary, concise analysis, relevant tools and real-time reporting they need to succeed at work, profit from investing and have fun with the rewards of winning.


