How does Facebook decide what to show in my news feed?
Facebook is secretly filtering my news feed? I'm outraged!
Not so secretly, actually. There is controversy this week over the social network's research project manipulating nearly 700,000 users' news feeds to understand whether it could affect their emotions.
But Facebook has been much more open about its general practice of filtering the status updates and page posts that you see in your feed when logging on from your various devices. In fact, it argues that these filters are essential.
Essential? Why can't Facebook just show me an unfiltered feed?
Because, it argues, the results would be overwhelming. "Every time someone visits news feed there are on average 1,500 potential stories from friends, people they follow and pages for them to see, and most people don’t have enough time to see them all," wrote Facebook engineer Lars Backstrom in a blog post in August 2013.
"With so many stories, there is a good chance people would miss something they wanted to see if we displayed a continuous, unranked stream of information."
Bear in mind that this is just an average. In another blog post, in June 2014, Facebook advertising executive Brian Boland explained that for heavier users the risk of story overload is greater.
"For people with lots of friends and Page likes, as many as 15,000 potential stories could appear any time they log on,” he explained. "For people with lots of friends and page likes, as many as 15,000 potential stories could appear any time they log on,” he explained.
How many stories is Facebook filtering out, and how?
Backstrom explained in August that Facebook's news feed algorithm boils down the 1,500 posts that could be shown a day in the average news feed into around 300 that it "prioritises".
How does this algorithm work? Backstrom explained that factors include: how often you interact with a friend, page or public figure; how many likes, shares and comments individual posts have received; how much you have interacted with that kind of post in the past; and whether it’s being hidden and/or reported a lot.
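To make that concrete, here is a minimal sketch in Python of how a ranking score might combine those kinds of signals. The field names, weights and formula below are invented for illustration only; Facebook has never published its actual scoring function.

```python
# Illustrative sketch only: the signals mirror the factors Backstrom described,
# but the field names, weights and formula are invented, not Facebook's.

from dataclasses import dataclass


@dataclass
class Story:
    author_affinity: float    # how often you interact with this friend or page (0 to 1)
    likes: int                # engagement the post has already received
    shares: int
    comments: int
    type_affinity: float      # how much you engage with this kind of post (0 to 1)
    hides_and_reports: int    # how often users have hidden or reported it


def score(story: Story) -> float:
    """Combine the signals into one ranking score (hypothetical weights)."""
    engagement = story.likes + 2 * story.shares + 3 * story.comments
    boost = (1 + story.author_affinity) * (1 + story.type_affinity) * (1 + engagement)
    penalty = 1 + story.hides_and_reports
    return boost / penalty


def prioritise(candidates: list[Story], limit: int = 300) -> list[Story]:
    """Boil roughly 1,500 candidate stories down to the ~300 highest-scoring ones."""
    return sorted(candidates, key=score, reverse=True)[:limit]
```

In this toy version the hide-and-report count simply divides the score, which is one crude way to express the "hidden and reported" signal discussed next.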
Hidden and reported?
You may have spotted the little downward arrow that appears next to stories in your Facebook news feed: you can use that to hide them, and optionally tell Facebook why you don't want to see them.
It's a feature as useful for banishing people whose status updates don't interest you but who you don't want to unfriend, as it is for cleaning your feed of social quizzes, Candy Crush requests or nightly set lists from that band you liked a few years ago, but who've gone off the boil now.
But these actions also influence how many other people are shown those updates, via the news feed algorithm.
Why is this emotion study controversial?
Facebook has the right to filter your news feed, including for research: it's right there in the terms and conditions that very few of us read when signing up to the social network.
There is a debate about whether agreeing to these Ts and Cs counts as "informed consent" to take part in a research study – if not, the study has breached US ethical guidelines on human subjects research.
There is also unease at the thought of Facebook deliberately manipulating our emotions, rather than simply using its filters for its traditional goal of making sure we see the stories that are most interesting to us.
One of the researchers, Adam Kramer, has posted a defence of the work, claiming Facebook had users' interests at heart: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
Is this the first such row about Facebook filters?
No, although it's looking like the one most likely to be heard about by mainstream Facebook users. The last year has seen a debate rumbling around the "organic reach" of the Facebook pages created for brands, musicians and other public figures, and then "liked" by Facebook users.
Organic reach – the percentage of people who've liked a page who'll see any given post by it – has been falling, leading to accusations that Facebook is deliberately showing fewer page posts in our news feeds, in order to force their creators to pay to promote them.
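As a rough, back-of-the-envelope illustration of what that percentage measures – the numbers below are made up, not real Facebook figures:

```python
# Hypothetical example of the organic reach calculation described above:
# a page with 20,000 likes whose latest post was shown to 1,200 of those fans.
page_likes = 20_000
fans_who_saw_post = 1_200

organic_reach = fans_who_saw_post / page_likes * 100
print(f"Organic reach: {organic_reach:.1f}%")  # Organic reach: 6.0%
```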
Some marketers argue that they shouldn't have to pay to reach their own fans who've signed up to receive their updates by tapping a "like" button at some point. Others argue that Facebook has built this massive marketing platform (yes, it's that, not just a social network for our benefit) and has the right to charge marketers.
It's the arguments about organic reach that have nudged Facebook into revealing more information about its news feed algorithms from time to time, including Backstrom and Boland's blog posts.
Who else is worried about this?
Media companies (the publisher of the Guardian included) are spending more and more time thinking about the algorithms of Facebook, Google and other internet companies, with their sheer scale making them powerful cultural gatekeepers – for better or worse, depending who you speak to.
A sparky session at the SXSW conference in March debated this point, particularly as it relates to younger internet users – "millennials" – as explained by Kelly McBride from journalism school the Poynter Institute:
"If you look at the research on how people get their news now: you often hear this phrase: ‘If news is important, news will find me’ – particularly for millennials. But behind that statement is something really important: if news is going to find you, it’s going to find you because of an algorithm," said McBride."If you look at the research on how people get their news now: you often hear this phrase: ‘If news is important, news will find me’ – particularly for millennials. But behind that statement is something really important: if news is going to find you, it’s going to find you because of an algorithm," said McBride.
"There is this idea that is falsely propagated that we’re in a superior market of ideas because the algorithms are neutral. They’re not neutral: they’re all based on these mathematical judgments that the engineers have made behind the algorithm."
The counterargument to this is that if Facebook is a new gatekeeper, it's no less neutral than the old news gatekeepers – newspapers and TV stations, editors and reporters – it's just that its priorities may not be as obvious.
Can I take back control of my Facebook news feed?
Yes, you can. One way is to change your news feed from "top stories" to "most recent" – the latter being a more traditional reverse-chronological feed of updates from all your friends.
How you do this seems to change regularly – at the time of writing, it's accessed via a downward arrow next to "news feed" at the top left of Facebook's website, and under the "more" menu in Facebook's mobile apps. Some people love it, and others think it shows why Facebook's filters are needed in the first place.
You can also spend a bit of time sorting out your Facebook friends, classifying some as "close friends" to see more of their updates, and others as "acquaintances" to see less.
For pages – whether brands, musicians or other public figures – you can manually choose to "get notifications" when they post something new.
This should take half an hour at most, unless you have thousands of friends (in which case, the manual weeding-out process may be even more necessary). But as Facebook has suggested, your day-to-day activities – how often you interact with specific friends and/or pages – should also help to sculpt your news feed.
Thanks, I'm feeling happier now
Careful now. Your mood may not last if, next time you log into Facebook, your feed is full of your mopiest, angriest and/or most passive-aggressive friends' updates. But at least you'll know that it's all in the name of research. Possibly.