
Facebook under fire for manipulating users' emotions


Text by FRANCE 24

Latest update: 2014-06-30

Facebook has come under fire for attempting to manipulate the feelings of users as part of a study on "emotional contagion", in which the social networking giant tampered with the content shown in users' Facebook "news feeds".

For one week in 2012, Facebook altered the algorithm normally used to place posts into users' "news feeds" to study how the content affected the moods of some 700,000 users.

Researchers sought to determine whether the number of positive or negative words in "news feed" messages led users, in turn, to post more positive or negative content.
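In outline, such an experiment works in two steps: count emotion words to classify each candidate post as positive or negative, withhold a share of posts of one valence from the feed, and then measure the valence of what the affected users post afterwards. The short Python sketch below is purely illustrative and is not the researchers' code: the word lists, example posts and the filter_feed helper are invented for this example.

```python
import random

# Toy emotion-word lists (hypothetical; the real study used far larger lexicons)
POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "angry"}

def valence(post):
    """Classify a post as +1 (net positive), -1 (net negative) or 0 (neutral)."""
    words = post.lower().split()
    score = sum(w in POSITIVE_WORDS for w in words) - sum(w in NEGATIVE_WORDS for w in words)
    return (score > 0) - (score < 0)

def filter_feed(posts, suppress, rate):
    """Randomly withhold roughly `rate` of the posts whose valence equals `suppress`."""
    return [p for p in posts if valence(p) != suppress or random.random() > rate]

# Example: suppress ~30% of positive posts, then compare the average valence
# of the filtered feed with the unfiltered one.
feed = [
    "I love this wonderful sunny day",
    "Traffic was awful and I am angry",
    "Meeting moved to noon",
]
reduced_positive = filter_feed(feed, suppress=1, rate=0.3)
print("original feed valence:", sum(valence(p) for p in feed) / len(feed))
print("filtered feed valence:", sum(valence(p) for p in reduced_positive) / max(len(reduced_positive), 1))
```

In the actual study, the measured outcome was the emotional content of the users' own subsequent posts rather than of the feed itself; the sketch only shows the classification-and-filtering step.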

The study, entitled "Experimental Evidence of Massive-Scale Emotional Contagion Through Social Networks" and conducted by researchers affiliated with Facebook, Cornell University and the University of California at San Francisco, appeared in the June 17 edition of the Proceedings of the National Academy of Sciences.

"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the study authors wrote.

"These results indicate that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks."

While other research has used metadata to study trends, this experiment appears to be unique in that it manipulated what users saw in order to test for a particular reaction.

News of the study prompted anger when online magazines wrote about it over the weekend, with Slate calling the study "unethical" and The Atlantic also questioning the ethics of the study while saying it was "almost certainly legal".

'Consistent' with privacy policy

The social network, which counts more than one billion active users, said in a statement that "none of the data used was associated with a specific person's Facebook account".

"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible," it said.

"A big part of this is understanding how people respond to different types of content, whether it's positive or negative in tone, news from friends or information from pages they follow. We carefully consider what research we do and have a strong internal review process."

In the paper, the researchers said the study "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook".

Facebook's privacy policy states that it may use the information it collects about users "for internal operations, including troubleshooting, data analysis, testing, research and service improvement".

Susan Fiske, a Princeton University professor who edited the report for publication, said the researchers assured her the study had been approved ahead of time by an ethics review board.

"They approved the study as exempt, because it is essentially a pre-existing dataset, part of FB's ongoing research into filtering users' news feeds for what they will find most interesting," she told AFP in an email.

"Many ethical issues are open to debate, and this one seems to have struck a nerve."

Katherine Sledge Moore, a psychology professor at Elmhurst College, said the study was fairly standard overall, especially for so-called "deception studies", in which participants are given one stated purpose for the research when they provide initial consent and are told afterwards what the study was really about.

In this case, however, the study's subjects did not know they were taking part.

"Based on what Facebook does with their newsfeed all of the time and based on what we've agreed to by joining Facebook, this study really isn't that out of the ordinary," Moore said.

"The results are not even that alarming or exciting."

(FRANCE 24 with AFP)

 

Date created: 2014-06-30
