If Facebook's Secret Study Bothered You, Then Quit Now

There are more than 1 billion Facebook users—I’m one of them—and the reality is that all of us are test subjects for the website’s not-so-grand experiments.

Is there a better way to get users to watch videos or convince them to buy something? Is there anything to learn from where a user’s cursor lands?

Facebook has done all of these tests—and likely hundreds more—in the past few months alone. And almost no user made a peep.

But for one week in January 2012, nearly 700,000 Facebook users were part of an A/B experiment: to see whether skewing a user’s News Feed toward a slightly higher number of positive or negative status updates would affect their behavior.

We know this now because, in a first, Facebook published the results in the Proceedings of the National Academy of Sciences. It turns out that tweaking the News Feed algorithm had a mildly contagious effect on users, who were a bit more likely to post their own positive or negative status updates.

And the response to the study has been overwhelming—overwhelmingly negative for Facebook, that is.

“Anger Builds Over Facebook’s Emotion-Manipulation Study,” Colin Daileda reports at Mashable. “Even the Editor of Facebook’s Mood Study Thought It Was Creepy,” Adrienne LaFrance writes at The Atlantic.

(Kashmir Hill, who’s been all over this story for Forbes, has a great summary of the study and captured Facebook’s nonplussed reaction.)

Did Facebook mistreat its users?

Facebook’s study was almost certainly legal. Users agree to terms of service that permit the company to do these sorts of experiments. And Facebook isn’t the only Web platform testing all kinds of interventions to see how customers behave. (More on that later.)

The more interesting debate: Was it ethical?

And that question has struck a nerve, with passionate arguments on both sides.

Why Facebook acted unethically

In a scathing piece for Slate, Katy Waldman writes that the study’s “methodology raises serious ethical questions. The team may have bent research standards too far, possibly overstepping criteria enshrined in federal law and human rights declarations.”

Waldman adds that Facebook didn’t sufficiently warn users that their data might be used for research; that acknowledgment is buried in the fine print of the company’s Data Use Policy.

Why Facebook’s behavior was defensible

In response to Waldman and others, researcher Tal Yarkoni offers a cogent defense of Facebook on his blog. (Yarkoni’s a PhD who runs UT-Austin’s Psychoinformatics Lab; he was unaffiliated with the study.)

According to Yarkoni:

  • Manipulating a News Feed isn’t so different from what Facebook does all the time, given its ongoing tests.
  • Many firms already experiment on customer behavior—in fact, the tactic is often celebrated—and this isn’t necessarily a bad thing if it improves the product.
  • We shouldn’t criticize Facebook for sharing its data and tests. We should celebrate its decision to share its findings.

Many critics are condemning Facebook because users weren’t given “informed consent”—essentially, they didn’t know that they were test subjects. Informed consent is a key element of clinical research studies, and an essential prerequisite if researchers are trying to get approval from an institutional review board.

But some of the criticism around informed consent has been, well, misinformed. Take this reader comment on Yarkoni’s blog:

Experimenting on people requires informed consent. That’s not optional, even for [F]acebook…it doesn’t matter how interesting you think the results are, they should have had informed consent from study participants. They broke the law.

Here’s why that’s not true: even in a clinical study, informed consent can be waived if the risks of the study are minimal. And contrary to early reports, it turns out that Facebook didn’t actually go through an institutional review board, according to Hill, which means the formal informed-consent requirement never applied in the first place.

Why there’s been a firestorm of criticism

Even if the study was legal, that doesn’t dispel the concerns, many of them legitimate, that the test was unnerving. And the massive critical response seems to be driven by a few factors, including:

1. Facebook users don’t like being messed with: Facebook’s algorithm-powered News Feed is no secret. But getting reminded that it’s an artificial environment, where Facebook’s software can radically customize your experience, can be unsettling—especially when it’s revealed that a test involved provoking an emotional response.

2. The test potentially put users at risk: There’s evidence that Facebook is already a depressing place to visit, and some critics say that making it a slightly more negative experience could have harmed vulnerable users.

“I wish someone were able to sue [Facebook] over this,” New Yorker writer Emily Nussbaum tweeted. For example, “if it triggered any significant depressive behavior” in users, she added.

3. Mucking around with users’ emotions goes against Facebook’s mission: It’s one thing if Facebook tweaked the News Feed to sell more ads. But the company wants to be more than an ordinary Web business—Facebook aspires to be like the plumbing of the Internet, a service that you use constantly without even thinking about it. And it’s not ethical to intentionally turn off the hot water for some of your customers.

4. Media members feel threatened: Much of the negative response has been powered by journalists, but there’s a potential subtext to their criticism, reporter Darius Tahir points out: Facebook’s News Feed has become an essential tool for disseminating their work.

“It’s media members who know how much power Facebook has over their publications and hence working lives,” Tahir writes. And in theory, any change to the Facebook algorithm could be professionally troubling.

How to guard yourself moving forward

In response to the criticism, Facebook data scientist Adam Kramer put out a statement on Sunday, and perhaps the company will change its policy on testing in the future.

But let Facebook’s study serve as a wake-up call: If you’re actively surfing the Web, you’re a high-tech lab rat. At least 15% of the top 10,000 websites conduct A/B testing at any given time, the MIT Technology Review reported earlier this year.
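To make that statistic concrete, here is a minimal, purely illustrative sketch of how many A/B frameworks silently split visitors into experiment groups. The function name and parameters are hypothetical, not any site’s actual code; the point is that a stable hash of your user ID is all it takes to enroll you in a test without your noticing:

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    """Deterministically assign a user to 'control' or 'treatment'.

    Hashing the user id together with the experiment name gives every
    user a stable bucket without storing any per-user state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    # Map the first 8 hex digits to a float in [0, 1).
    score = int(digest[:8], 16) / 0x100000000
    return "treatment" if score < treatment_share else "control"
```

Call `assign_bucket("user42", "feed_ranking_v2")` on every page load and the same user always lands in the same group, which is exactly why an experiment like Facebook’s can run for a week with no visible trace.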

Facebook’s just the only site to publish an academic paper about it.

Meanwhile, don’t expect the company to stop this sort of testing, no matter how loud the outcry over this study. There are too many business-driven reasons for Facebook to keep tweaking its platform and learning more about how its users respond to different triggers.

And if that concept bothers you—if you find Facebook’s artificial environment somehow less friendly today than yesterday—there’s a simple solution: Quit Facebook.

 

Forbes
Forbes is among the most trusted resources for the world's business and investment leaders, providing them the uncompromising commentary, concise analysis, relevant tools and real-time reporting they need to succeed at work, profit from investing and have fun with the rewards of winning.


