How does Facebook decide what to show in my news feed?

Not so secretly, actually. There is controversy this week over the social network’s research project manipulating nearly 700,000 users’ news feeds to understand whether it could affect their emotions. But Facebook has been much more open about its general practice of filtering the status updates and page posts that you see in your feed when logging on from your various devices. In fact, it argues that these filters are essential.
http://www.theguardian.com/technology/2014/jun/30/facebook-news-feed-filters-emotion-study
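As a rough illustration of the headline question: Facebook’s feed filtering is often explained with the old “EdgeRank” heuristic, in which each candidate story gets a score built from the viewer’s affinity for its author, a weight for the story type, and a time decay, and only the highest-scoring stories make the feed. The Python sketch below is a toy model of that heuristic; all names and numbers are assumptions for illustration, not Facebook’s actual algorithm, which by 2014 reportedly weighed far more signals than these three.

from dataclasses import dataclass
import time

@dataclass
class Story:
    text: str
    author_affinity: float  # assumed 0..1: how often the viewer interacts with this author
    type_weight: float      # assumed: e.g. photos weighted above plain status updates
    created_at: float       # Unix timestamp of the post

def edgerank_score(story, now=None):
    """Toy EdgeRank-style score: affinity * type weight * time decay."""
    now = time.time() if now is None else now
    age_hours = max((now - story.created_at) / 3600.0, 0.0)
    decay = 1.0 / (1.0 + age_hours)  # assumed decay curve
    return story.author_affinity * story.type_weight * decay

def rank_feed(stories, limit=10):
    """Keep only the highest-scoring stories; everything else is filtered out."""
    return sorted(stories, key=edgerank_score, reverse=True)[:limit]

The point of the toy model is simply that the feed is a ranked subset, not a chronological firehose, which is why Facebook calls the filters essential.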
Also see: Facebook Says It’s Sorry. We’ve Heard That Before.

Sometimes, being wrong on the Internet means having to say you’re sorry. And by now, Facebook is very, very good at saying sorry.

Facebook offered up an apology to its users on Sunday, after it came to light that the company had manipulated the news feeds of more than half a million people so it could change the number of positive and negative posts that appear from their friends. Facebook’s in-house data science team carried out the project, it said, as a way to examine the “emotional impact of Facebook” on its users. Along with two university researchers, the team published the results of the study in an academic journal.
http://bits.blogs.nytimes.com/2014/06/30/facebook-says-its-sorry-weve-heard-that-before/

Facebook faces criticism amid claims it breached ethical guidelines with study
Facebook’s experiment, in which it tweaked the words used in nearly 700,000 users’ news feeds to see if it affected their emotions, breached ethical guidelines, say independent researchers.

“This is bad, even for Facebook,” said James Grimmelmann, professor of law at the University of Maryland, who says that the lack of informed consent in the experiment, carried out on Facebook users for a week during January 2012, is a “real scandal”.
http://www.theguardian.com/technology/2014/jun/30/facebook-internet

Why the “everybody does it” defense of Facebook’s emotional manipulation experiment is bogus
For one week in 2012, Facebook manipulated the emotional content of the news feeds of more than 600,000 of the social media network’s users to see how they would respond. For one group of users, the frequency of posts that contained words like “love” or “nice” decreased. For another, the number of posts containing words like “nasty” or “hurt” declined.

The researchers discovered that the filtering resulted in a very small — but observable — effect. Moods are contagious! Users who experienced more negativity in their news feeds tended to express themselves more negatively in their own posts. As a zillion Facebook posts reported over the weekend: Facebook made some of its users sad, on purpose.
www.salon.com/2014/06/30/why_the_everybody_does_it_defense_of_facebooks_emotional_manipulation_experiment_is_bogus/
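The Salon excerpt above also sketches how the experiment itself worked: posts were matched against lists of positive and negative words, and a fraction of the matching posts was withheld from each group’s feed. Here is a minimal sketch of that mechanism in Python; the word lists, omission probability, and function names are assumptions for illustration, while the actual study reportedly matched posts against the much larger LIWC dictionaries inside Facebook’s ranking system.

import random

# Illustrative word lists, taken from the examples quoted in the excerpt.
POSITIVE_WORDS = {"love", "nice"}
NEGATIVE_WORDS = {"nasty", "hurt"}

def contains_any(text, words):
    """True if any target word appears in the post text."""
    tokens = (token.strip(".,!?") for token in text.lower().split())
    return any(token in words for token in tokens)

def filter_feed(posts, suppress, omit_prob=0.5):
    """Probabilistically omit posts containing words from `suppress`,
    reducing (but not eliminating) their frequency in the feed.
    The omission probability here is made up; the study varied it per user."""
    return [post for post in posts
            if not (contains_any(post, suppress) and random.random() < omit_prob)]

feed = ["I love this!", "What a nasty day.", "Meeting at noon."]
less_positive = filter_feed(feed, POSITIVE_WORDS)  # one experimental group's feed
less_negative = filter_feed(feed, NEGATIVE_WORDS)  # the other group's feed

Even a modest omission rate shifts the aggregate tone of the feed, which is the small-but-observable effect on users’ own posts that the researchers measured.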
