yeah, we're just going to select what posts you see, mmkay.
it's for your own good, mmmkay.
we're just going to delete the ones we don't think you should be seeing, mmmkay.
don't worry about it, there's no need for you to be aware of this, mmmmmkay.
Facebook has hit back at its critics after the social network instructed researchers to meddle with its users' "news feeds" in order to manipulate their emotions. The free-content ad network sparked anger when it emerged its data scientist Adam Kramer gave a green light to researchers to filter out positive and negative posts …
Facebook does this all the freakin' time. They make no attempt to hide it.
The average Facebook user has 338 "friends". If the average user makes 1 post per day, and you see (say) 15 posts when you log in, how do you think it selects those 15 out of the 338 available?
I don't know. My bet is, it's an algorithm that's constantly being tweaked, that takes into account the age of each update, number of likes, and how much you tend to interact with that particular friend on FB, among probably many other factors (such as how much they're paying FB to promote their posts).
And for a trial period, they introduced another factor into this algorithm - the "mood" of each post - and fed a differently-weighted version to each of two different subsets of users.
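To put it concretely, here's a minimal sketch of the kind of weighted ranking I'm describing. Every factor name, weight, and field below is an assumption for illustration only; Facebook has never published its actual algorithm.

```python
import math
import time

# Toy feed ranking -- all factors and weights are invented for illustration;
# Facebook's real algorithm is not public.
def score(post, weights):
    age_hours = (time.time() - post["timestamp"]) / 3600.0
    return (
        weights["recency"] * math.exp(-age_hours / 24.0)   # newer posts decay less
        + weights["likes"] * math.log1p(post["likes"])     # diminishing returns on likes
        + weights["affinity"] * post["interaction_rate"]   # how often you interact with this friend
        + weights["promoted"] * post["promotion_spend"]    # paid promotion
        + weights["mood"] * post["sentiment"]              # the experimental factor: -1 (negative) to +1 (positive)
    )

def build_feed(posts, weights, limit=15):
    # Pick the top `limit` posts out of everything your friends posted.
    return sorted(posts, key=lambda p: score(p, weights), reverse=True)[:limit]

# The experiment then boils down to serving two subsets of users
# the same posts ranked with opposite "mood" weights:
base = {"recency": 1.0, "likes": 0.5, "affinity": 2.0, "promoted": 1.5}
happier_weights = dict(base, mood=+1.0)   # positive posts float up
gloomier_weights = dict(base, mood=-1.0)  # negative posts float up
```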
Seriously, I don't get how this is such a big deal.
So the NSA trawling everybody's data but not actually doing anything with most of it is a problem, yet Facebook actually subjecting millions of users to a psychological test without consent, and you don't get it?
I'll be honest, I've personally been "meh" about it, but only for myself and only because outside of the occasional visit home when my mother wants me to send her some stuff for Farmville, I haven't touched my accounts there in about two years, maybe three. But I DO get that if I were a regular FB user, particularly one who was attempting to use it to stay in touch with Friends and Family, I would be pissed (US, not UK) at this sort of revelation.
I don't use Facebook so I don't know much about it. I would guess that it is in business to make money, which is ok, and to look after its own interests which may or may not be ok. The traditional approach to this situation is free competition and the handicapping of monopolies.
As a disinterested onlooker, may I suggest nuking it from LEO and seeing whether a replacement emerges and, if so, whether it is any different; purely as a sociological experiment, and in the interests of pure science, of course.
From Kramer's Facebook post:
"[...]
The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook. We didn't clearly state our motivations in the paper.
[...]"
In other words, Facebook are prepared to distort reality to preserve their own business.
Adverts are generally obviously trying to sell something.
Most people are clearly ok with seeing adverts a lot of the time, or even deliberately seeking out an advert-laden medium as they watch commercial TV.
However, this is manipulating the users by artificially changing the content, hiding posts they would probably have wanted to see.
It's like broadcasting two versions of Corrie - one where everything went wrong for the characters and one where everything went right - and seeing if it made the viewers happy or sad without their knowledge.
- Except that a week of a soap opera without disaster for someone would be suspicious in itself, which isn't true of Facebook.
If you aren't going to show all news items (which is a challenge in itself), you have to have a method or algorithm to decide which items you are going to show, so it seems reasonable to test the effect of potential biases. Although Kramer doesn't have an academic affiliation, the other authors do, which usually obliges researchers to put such experiments through an ethics committee. There's no evidence in the (pretty lightweight) paper that they did, but that's usually down to the journal's policies. Still, it'd be nice to know.
"...conducted for a single week in 2012"
read: fuck off, two years in Web 2.0 is like a millennium in the old analogue world. Time served, so we're good now!
"none of the data used was associated with a specific person’s Facebook account"
read: we say we anonymize data, so fuck off
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible."
This one is straight from the Book of "fuck-off quotes".
read: we do research to spam our sheep into oblivion
"We carefully consider what research we do..."
read: "we know how to shear the sheep, so fuck off"
"There is no unnecessary collection of people’s data"
read: prove us wrong and have a good day
Oh dear, no sheep?! But it's such a fitting analogy! You give them "free" grass and you shear them and give them more free stuff and shear them more. You shoo them, or give them a kick, and all they do is - bleat. Same with milking cows.
No sheep... well, hamsters maybe? I guess you feed them (...) bit by bit and they just stuff it in their pouches, 'til they can hardly move along. Pretty comical, eh?
"It's hackneyed to the point of uselessness, comes off as smug and superior, and turns away the very people you might hope to sway."
AKA "You catch more flies with honey than vinegar."
And it's amazing how many people these days forget this simple lesson. For example, I myself would probably be more supportive of the ideals of the various "equality" movements, were their proponents less disposed to snottily dismiss anyone who opposes them as an unenlightened overprivileged bigot, instead of logically explaining where they think the opponent is wrong. I wonder how much opposition to their causes results from this vinegar/honey effect, as opposed to genuine political disagreement with their ideas?
I'd have thought that it was a given that the data was anonymised.
My problem with it is that they were deliberately trying to manipulate people's moods.
There are probably a fair few people who rely on seeing positive posts from others (family etc.) to get through their day/week/month/year. This is bloody disgraceful, FB.
"This research was conducted for a single week in 2012 and none of the data used was associated with a specific person’s Facebook account."
We're not telling anyone who we messed with. Including them. So that's ok.
"We do research to improve our services and to make the content people see on Facebook as relevant and engaging as possible. A big part of this is understanding how people respond to different types of content, whether it’s positive or negative in tone, news from friends, or information from pages they follow. "
This is an important step in determining how to make our cattle happy and most likely to click the adverts provided by our real customers. The cattle will eat whatever they're fed.
"We carefully consider what research we do and have a strong internal review process."
We consider carefully if the research is in our interests and if there's money to be made. No, we're not sharing how we decided. We're very comfortable about this, so what's the problem?
"There is no unnecessary collection of people’s data in connection with these research initiatives"
We decide what's necessary.
"and all data is stored securely."
It's our data, isn't it? So you can bet it's secure. The sooner you get this into your thick heads the quicker this discussion will be over.
"Facebook knows you don't want to know **EVERYTHING** your friends do, so using complex algorithms, we carefully select the most miserable posts that we know you won't want to read. Going to work in the morning? Check your Facebook news feed before you leave and feel like shit! Thanks to Facebook, you can be made to feel like a warmed up turd all week"
After all the dicking with the UI and the news feed filters, and people going nuts because of it, this is the one that shows that all Facebook users are Zuck's bitches, and that nobody among the baby-sharing, bacon-snapping morons will care:
Manipulate my mind, but don't mess with the UI!
Hadn't really thought about it before, but yeah, that was around the time I stopped using FB. That and it was synergistically linked to Zynga's free game server performance going to shit. And I mostly hung out for the Zynga games in the first place. I had even just reached the point where I would have been willing to toss them a Hamilton a month for the service.
Farcebook.
You cannot believe ANYTHING you see on this site (Farcebook) because we manipulate it to get the most emotional response from you. Since MOST of you are "sheeple" (meaning, for the obviously ignorant among you: frikking sheep-people who act like herd animals, enjoy getting screwed in the posterior, and are too stupid to make up your own minds), you deserve this crap, along with the people who complain about being labeled "sheeple".
...A peddler of creepy advertising, like a tobacco executive who peddles cigarettes to children in developing countries.... I'd wake up and ask myself: What am I doing with my life...???
Every single day I cloak Facebook in PR spin about 'being social', but secretly I know that Facebook is highly 'addictive' nicotine: an advertising delivery device that tricks people into clicking on ads....
I would drop my head in shame and ask: Why am I not trying to change the world: get us to Mars for example...? Then I'd probably go and eat a bullet...
It may even have provoked a couple of suicides.
It doesn't matter whether the study displayed negative posts, only that it hid them.
If you posted a "cry for help" on Facebook and your friends didn't answer, but instead carried on posting inanities, then what?
You'd have no way of knowing that Facebook had deliberately hidden it from them.
This would never have got past a reasonable ethics committee, because you have to inform the subjects that they are part of a trial and allow them to withdraw if they don't want to be part of it.
"Assumed consent" is bollocks, pure and simple.
Facebook is everywhere. I blocked the domain at router level and was rewarded with slabs of black space on the web pages I visit, reminiscent of old-style newspaper censorship.
A closer examination of such pages often reveals no explicit mention of Facebook (e.g. a "Like" button), so I assume the HTML is peppered with tracking pixels or similar.
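If you want to check that suspicion, a crude way is to scan a page's static HTML for references to Facebook domains. Here's a rough sketch using only the Python standard library; the domain list and the src/href regex are my own guesses rather than an exhaustive tracker catalogue, and anything injected by JavaScript at runtime won't show up:

```python
import re
import sys
import urllib.request

# Domains commonly associated with Facebook widgets and trackers
# (an assumed, non-exhaustive list).
FB_DOMAINS = ("facebook.com", "facebook.net", "fbcdn.net")

def find_facebook_refs(url):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    # Crude pass: pull out src/href attribute values and match against the list.
    # Trackers loaded dynamically via JavaScript won't appear in the static HTML.
    refs = re.findall(r'(?:src|href)=["\']([^"\']+)["\']', html)
    return [r for r in refs if any(d in r for d in FB_DOMAINS)]

if __name__ == "__main__":
    for ref in find_facebook_refs(sys.argv[1]):
        print(ref)
```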
Trying to control people for purposes that suit FB, not necessarily the people being controlled. And then they're surprised at the reaction when people find out.
There's a reason everybody knows about the case of the school kids subjected to the brown-eyed vs blue-eyed psych experiment, even if they are not psych or sociology majors: people get angry when you try to manipulate their emotions without their consent. Yes, this makes it harder for the experimenters to get good data in certain instances. But that doesn't negate the need for FULLY informed consent.
A large part of why I left FB is the poor content filtering. I care about how my friends are doing, not about pseudo-deep captions on stock photos reshared from crap content-creating pages. Maybe my personality and usage patterns didn't conform to those of the majority of users. Whatever the case may be, the algorithms failed me, so I left. If they had done more studies, maybe they could have fixed my feed and I'd still be using the service.
The official line on Facebook's blog, from when they introduced 'Top Stories', was that they were doing it for our benefit, or we'd miss the new baby pictures from our aunt! Then again, back then they also said Top Stories was only for when you hadn't logged in for a while. Now, while the website at least stays on Most Recent more than it used to, the default in the mobile apps is 'News Feed', which is the Top Stories nonsense. It's dead to me because it lists posts out of chronological order, and yes, I know I'm a masochist, but every time I check Facebook on my phone I deliberately go to More > Most Recent.