
Appeared on: Monday, June 30, 2014
Facebook Says It Cares About the Emotional Impact of The Social Network

The researchers behind a controversial psychology experiment on Facebook users defended the research, saying it was aimed at investigating a common concern that seeing friends post positive content on the social networking website leads people to feel negative or left out.

According to researcher Adam D.I. Kramer, the researchers were also concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. "We didn't clearly state our motivations in the paper," Kramer wrote on his Facebook page Sunday.

"The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product," Kramer wrote.

The comments follow protests on the Internet over Facebook manipulating its users' emotions in an experiment that doctored the content on the news feeds of close to 700,000 users.

The research, published in the Proceedings of the National Academy of Sciences of the United States of America, described the use of an algorithm to manipulate content in the News Feed of 689,003 users in two parallel experiments. In one, exposure to friends' positive emotional content in the feed was reduced; in the other, exposure to negative emotional content was reduced.
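To make the setup concrete, here is a minimal, purely illustrative Python sketch of how such a content-reduction step could work in principle. The function name, the sentiment labels, and the omission probability are assumptions made for illustration; they are not Facebook's actual algorithm or figures.

```python
import random

# Purely illustrative sketch: this is NOT Facebook's algorithm.
# Each post is assumed to carry a precomputed sentiment label
# ("positive", "negative", or "neutral"); the omission rate is made up.

def filter_feed(posts, condition, omission_rate=0.1, rng=None):
    """Return a feed load with some emotionally charged posts left out.

    condition: "reduce_positive" or "reduce_negative", mirroring the
    two parallel experimental conditions described in the paper.
    Omitted posts are not deleted; they simply do not appear in this
    particular load of the feed.
    """
    rng = rng or random.Random()
    target = "positive" if condition == "reduce_positive" else "negative"
    shown = []
    for post in posts:
        if post["sentiment"] == target and rng.random() < omission_rate:
            continue  # skip this post on this feed load only
        shown.append(post)
    return shown

# Example usage with made-up posts
feed = [
    {"id": 1, "sentiment": "positive", "text": "Great day!"},
    {"id": 2, "sentiment": "negative", "text": "Feeling down."},
    {"id": 3, "sentiment": "neutral",  "text": "New phone arrived."},
]
print(filter_feed(feed, "reduce_positive", omission_rate=0.5,
                  rng=random.Random(42)))
```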

"Emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness," the research concluded.

The response to the experiment was negative.

Kramer wrote in his post that the experiment, conducted in early 2012, involved "minimally deprioritizing" a small percentage of content in the News Feed for about 0.04 percent of users, or 1 in 2,500, for a period of one week, which is still a large group of people given the size of Facebook's user base. Nobody's posts were "hidden"; they simply did not appear on some loads of the News Feed. "Those posts were always visible on friends' timelines, and could have shown up on subsequent News Feed loads," he added.

In the research, the authors noted that the experimental procedure "was consistent with Facebook's Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research."

It is obvious that Facebook users are test subjects for the company's experiments, and that is hardly surprising. The social network gauges user behavior, and there are plenty of business-driven reasons for Facebook to keep tweaking its platform and learning how its users respond to different triggers. This is nothing new, don't you agree? And if you have just realized that this policy bothers you, you can still close your Facebook account.


