
OK, I'm late to the party here...
But I think this latest news has finally persuaded me to open a Facebook account:
Facebook let researchers adjust its users' news feeds to manipulate their emotions - and has suggested that such experimentation is routine, which is seemingly how the idea got past the advertising firm's ethics committees. In 2012, researchers led by the company's data scientist Adam Kramer manipulated which posts from their …
One of the most annoying things about this, for me, is that it gives social scientists in general a bad name. I know there is not a lot of love for social scientists on these forums, but this really doesn't help anyone.
I cannot believe that someone signed this off - informed consent is something that every social science undergraduate has drummed into them from day one.
What will be interesting to see is whether any of the US regulators step in; there have been suggestions that this breaks federal law in the US: http://laboratorium.net/archive/2014/06/28/as_flies_to_wanton_boys
"creepy experiment"
0% creepy, rilly. You get more emotional response manglement whenever a politician or a spokesperson appears on the tube and uses newspeak to whip up a frenzy. Extra buffer stuffing when it's an economist working for our top clown court.
Oh, on the contrary: you manipulated people's feeds to show them negative posts in the hope of making them negative (as that was the outline of your research) whilst doing the opposite to the other half - therefore you fully intended to upset half of your research 'volunteers'...
@Sampler
Yeah - that's the way I read it too.
Studies like this are always problematic, as there's no real way to conduct them without keeping people in the dark. When a new drug is being tested it will go through double-blind trials against a placebo. Participants are fully aware that they may be taking a placebo, but it works because there is no way to tell whether you are in the control group or not.
With an experiment like this, telling people the parameters of the study would expose the whole thing. Sure, it's possible that all your friends are just sad all the time but if you notice that your feed is now a bit less or more upbeat than normal then you're going to have a pretty good idea which group you are in.
So, this research is near impossible to conduct while still getting informed consent. The question has to be asked, then, whether the research is valuable enough to warrant what has happened. I would suggest not. It's important that science not simply settle for 'common sense' answers but I don't think it would be too detrimental to our understanding of emotions if we just assume that people exposed to predominantly negative information take on some of that negativity themselves.
After all, advertising works so it's really not a stretch at all to assume that 'emotional advertisement' works too.
It could have been done reasonably well. Throw up a notification asking if people are willing to be part of an experiment on social behavior, which may alter their experience of Facebook for the next week. Explain that providing any more details about the experiment would alter people's behavior and invalidate the results, but provide more information about the study and which group people were in after it's over. Not complete information, but enough for reasonably informed consent, and far better than how they provided no information and obtained no consent.
Second comment: Creepy indeed, and it's going to get worse.
Third comment: I propose we name this kind of thing "inverse spam", i.e. "$BIGCO is blocking messages that I explicitly signed up for! What absolute assholes!"
Comment the fourth: Consider Chevy paying Farcebook/Twatter/et alia to block all positive posts about Ford vehicles. Or vice versa.
Comment the fifth: I fear we have passed the point of no return on the slippery slope of personal privacy. George Orwell was only off by a couple of decades.
Sixth comment.
Next it will be the individually crafted reordering of your news feed being monetized for ad-slinging (actually product placement) purposes. Welcome to the world of tomorrow, one that Orwell, Huxley, Ira Levin and Gibson could not even dream of.
I just got more irritated and pissed off with Facebook for not giving me what I want to see in my news feed, which is ALL the posts I've elected to receive, most recent first - in other words, with none of their fancy filtering applied to guess which ones I might want to see. It was long ago that I decided I don't fit anyone's standard profiles, and FB is no exception.
It is an advertising platform.
Regardless of what a few mind-boffins do, advertising is intended to mess with your mind and get you to buy crap you don't want, and is MUCH worse than a slight emotional warping due to the order you receive news in.
Getting uppity about this while placidly sucking on the advertising teat makes no sense at all.
A benign advertising platform Facebook isn't:
"They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. "
http://www.bbc.co.uk/news/technology-28051930
This post has been deleted by its author
The _Atlantic_ article states, in part: "The backlash [against the research methodology] in this case, seems tied directly to the sense that Facebook manipulated people -- used them as guinea pigs -- without their knowledge, and in a setting where that kind of manipulation feels intimate." The outrage, then, seems to hinge on people thinking that fecebook does NOT manipulate people - that a company whose revenues derive in no small part from adverts would not manipulate its users. Hokay. Not sure how this is different from measuring the effectiveness of said adverts (using illustration A, 24% of viewers clicked the link, while illustration B only got 15% of viewers to click) - viewer sees stimulus, viewer takes action, fecebook keeps track, and correlations are hypothesized. As someone posted above, if a service is free you are not the customer, you are the product. Having said that, if fecebook does manage to get its hands slapped this time, hallelujah.
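To make the adverts comparison concrete, here is a minimal sketch, with made-up sample sizes, of how one would check whether illustration A's 24% click rate genuinely beats illustration B's 15%. This is a standard two-proportion z-test, not anything Facebook has documented:

```python
from math import sqrt

def two_proportion_z(clicks_a: int, views_a: int, clicks_b: int, views_b: int) -> float:
    """How many standard errors apart are the two observed click rates?"""
    rate_a, rate_b = clicks_a / views_a, clicks_b / views_b
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    return (rate_a - rate_b) / std_err

# Hypothetical sample sizes: 24% vs 15% on 1,000 views each gives z ≈ 5.1,
# far beyond the usual 1.96 cutoff, so the gap is very unlikely to be chance.
print(two_proportion_z(240, 1000, 150, 1000))
```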
Tracking your response to adverts is one thing. Going out to purposely manipulate people's news feeds to see if you can make them angry or happy or depressed, without telling anyone that they have been opted into an experiment they have no way of opting out of, is a completely different thing.
Kramer: "We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
So, this was actually a brand building exercise. Just like a toothpaste company's research, except that it made you and your kids not so cheery.
1. Set your News Feed to show "Most Recent" rather than what some algorithm wants to promote.
2. Go through all your friends and individually set who you want to follow (there will be some people you want to keep on your friend list but don't feel the need to read about their breakfast every day).
That's it. Now you get the stuff you want.
Also, 'most recent' still has a scrambled order: random posts are selected to be ordered by the date of their last comment, not the date they were posted. How this order differs from the top-stories order I'm not sure, but it's not what I'd expect from 'most recent'.
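A toy sketch of the two orderings being described - the field names here are hypothetical, not Facebook's actual data model:

```python
from dataclasses import dataclass

@dataclass
class Post:
    name: str
    posted_at: int        # when the post was created (epoch seconds)
    last_comment_at: int  # when its newest comment arrived

feed = [
    Post("A", posted_at=100, last_comment_at=500),  # old post, fresh comment
    Post("B", posted_at=300, last_comment_at=310),
    Post("C", posted_at=200, last_comment_at=200),  # no comments since posting
]

# What 'most recent' ought to mean: newest post first -> B, C, A
by_posting = sorted(feed, key=lambda p: p.posted_at, reverse=True)

# What the feed appears to do: bump posts with recent comments -> A, B, C
by_activity = sorted(feed, key=lambda p: p.last_comment_at, reverse=True)

print([p.name for p in by_posting], [p.name for p in by_activity])
```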
Adam Kramer of Facebook is reported by BBC.co.uk as saying:
"At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook."
The guy is missing the point. It's Facebook, with its arrogant attitude to people, and the fact that not all of us feel the need to plaster the minutiae of our lives on the internet, that leads many to avoid visiting Facebook - a fact I am relieved to see is dawning on some of the Facebook users I know, who are cutting down on or ceasing their use of the data-mining service. Many of us can see NO value in joining. For my part, I feel that joining would lead to a deficit in privacy and in the quality of contact with friends; I prefer to relate to my friends individually - I feel they are worth that time and effort, rather than the electronic scattergun approach of spraying the dull details of the everyday across t'interwebz.
Simply put, Mr Kramer: Facebook really isn't as compelling as you feel it should be, and - happily - people are starting to question their participation in it. I have never been and never will be part of it. Companies that increasingly demand a Facebook or Twitter account in order to interact with them should take note of the fact that I am part of a sizeable section of web users, one that may well grow in the future.
This is as sinister and underhanded as subliminal advertising, and it is time this "service" was more strictly controlled - those running it clearly have no morals when it comes to messing with other people's heads.
So they acknowledge direct psychological manipulation of unknowing users? That's a pretty stupid thing to do considering that amongst the vast number of FB users there must be at least some who are at one time or another in a mentally fragile state.
The Ts & Cs may refer to use of data for research purposes, but this amounts to use of the users' minds for research purposes - and that sure isn't covered.
It doesn't surprise me that FB doesn't see any difference, or care if there is, but it worries the hell out of me that the researchers didn't see any need to consider the ethical implications of what they were doing.
OK, I signed up and accepted that they could use data I provided for research.
Where is it written that that actually means I'm signing up as a test subject? This is an experiment I could have been involuntarily involved in. If a drugs company did this there would be hell to pay.
Doesn't this actually contravene human rights legislation? Maybe the EU needs to get involved and find out how many EU citizens were used as human guinea pigs without consenting.
As many here have pointed out before, if you get it for free you are not the customer you are the product.
What Farcebook are forgetting is that their members are not only the product, they are also the resource, and any company that hopes to have any kind of longevity in its chosen business knows that managing and shepherding its prime resources is fundamental to success.
When you treat resources as a limitless, maintenance-free supply of product that will always be there, someday you are going to wake up to a nasty shock: it will all be gone.
Let's Hope Eh!
.. Simply refuse to have any dealings with a company that advertises on FB. If it's your bank or power supplier, then switch - and tell them that you are switching because, by running adverts on Facebook, they have supported extremely dubious social engineering experiments.
No one will of course... Such is the ennui
Makes you wonder how many idiots (people already suffering from manic depression, etc.) went out and injured others (the usual trolls, in many cases) or themselves because they were being manipulated. It sounds as if they were 'practicing' (researching) psychology/psychiatry, which to the best of my knowledge, information and belief requires a license.
Our government has been known to 'experiment' with what it seems to think is ITS property (people) for years.
...A peddler of creepy advertising, like a tobacco executive who peddles cigarettes to children in developing countries.... I'd wake up and ask myself: What am I doing with my life...???
Every single day I cloak Facebook in PR spin about 'being social', but secretly I know that Facebook is highly 'addictive' nicotine, an advertising delivery device, that tricks people into clicking on ads....
I would drop my head in shame and ask: Why am I not trying to change the world: get us to Mars for example...? Then I'd probably go and eat a bullet...
you DO realize Zuckerberg et al fund and support the DNC in all things, and are a whole lot more likely to be doing this already for THEIR preferred politicians.
Who was it that was able to make a direct phone call to the President to complain? Someone paid a LOT of money and favors to do that. Who helped organize and support the $5K a plate "fundraiser" here in San Jose a couple months ago?
It's funny watching people cast aspersions on one side while ignoring the actual observed facts about the other. What you think the Republicans "might" do is a lot less critical than what the Democrats ARE doing RIGHT NOW.
Worry about the GOP if it ever gets "in charge" again.
Barring the deletion: there's an entity called the Army Research Laboratory, but there is no Army Research Office - at least not in the United States Army, which, since this study involves American institutions, is who I figure it is supposed to be. I have my doubts, though, unless a certain circumstance happened.
But then again, the Army doesn't often fund the trick cyclists unless there's a specific battlefield advantage to be gained or a weakness to be exploited; DARPA is more for far-out weird/creepy shit like this. Unless the Psychological Operations branch and USACAPOC managed to be very convincing that the trick cyclists could figure out a way to degrade enemy morale using social media at low cost. Civil Affairs and Psychological Operations has a functional sub-command under US Army Special Operations Command, which means they get quite a bit of money of their own that they can use for research if they so choose. They still have to demonstrate a need for the Army to do whatever research, though; it's not like DARPA, where they come up with crazy ideas that sometimes work.
They probably can indeed hurt enemy morale with social media, but broadcasting that we're going to kill specific commanders or rank-and-file Soldiers over the radio and on TV, and then doing it, works better to scare the enemy if you ask me. So buying a few more EC-130E Commando Solo aircraft (which aren't very expensive - $90 million apiece when they were under development; it's just a C-130 with special telecommunications equipment) and their crews, which come from four of the five services, would be a better investment in my opinion.
Cornell withdrew its assertion that the "research" was partially funded by the Army. It now appears that it was funded partly by Facebook and partly by money stolen from state and federal tobacco settlement funds - money that was intended for research on tobacco use cessation.
Cornell did not withdraw the assertion because it was false, but because, if the statement were true, the "research" would have been subject to the rules and regulations of the federal Office for Human Research Protections. So they told a lie to cover up another lie.
This is far from over. Now UCSD and Yale are involved, as well as Cornell and UCSF. It appears that Facebook is very active in using these institutions to launder illegal and unethical research and get it into established journals such as PNAS and "pay-to-play" phony journals such as PLOS ONE. All this supports Facebook's ability to market political products based on "shadow profiles" and phony research that tells political campaigns that, if they give Facebook enough money, Facebook can decide/alter the outcomes of elections.
...that they got a statistically significant, measurable effect from that methodology, if all it did was assess emotional content on the basis of the words contained. I'm fond of quoting my Second Law of Information Retrieval: "The set of words in a document does NOT tell you what the document is about."
How would a lexicon approach distinguish the emotional content of "I'm off down town to see 'Cry Freedom'" (at least two "negative" words) from "The sweet-talking git would never have made me happy, I should be glad that he's gone" (at least two "positive" words)?
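As a rough illustration of that objection, here is a minimal sketch of a bag-of-words scorer of the kind being questioned. The tiny word lists are invented stand-ins for whatever lexicon the paper used (reportedly LIWC), so treat this as a toy, not the study's actual method:

```python
import re

# Invented stand-in word lists; a real lexicon like LIWC has thousands of entries.
POSITIVE = {"sweet", "happy", "glad"}
NEGATIVE = {"cry", "sad", "hate"}

def lexicon_score(text: str) -> int:
    """Naive sentiment: positive-word count minus negative-word count, grammar ignored."""
    words = re.findall(r"[a-z]+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

# The cheerful cinema trip scores negative (-1, from "cry"), while the bitter
# break-up line scores strongly positive (+3, from "sweet", "happy", "glad").
print(lexicon_score("I'm off down town to see 'Cry Freedom'"))
print(lexicon_score("The sweet-talking git would never have made me happy, "
                    "I should be glad that he's gone"))
```

Exactly the inversion the examples above predict: no grammar, no negation, no context.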
I suppose I could go to the PNAS and read the paper, but I don't feel like giving them the satisfaction, really.
Icon: says it all without saying it. Did the trick-cyclists categorise emoticons, too?
then manipulation of user perception.
Nope, the two cannot possibly be linked.
Cuz, they're internet, and totally not like those "other" rich guys who manipulate the system in order to stay on top.
(you probably think Net Neutrality is supported by these guys for "good" reasons too - not solely because their business success requires making as much use as possible of somebody else's paid-for infrastructure)
In standing behind this ethically bankrupt study, Cornell has essentially announced to the world that its Institutional Review Boards are available to "launder" any sort of morally reprehensible study that comes along.
Initially, Cornell maintained that the study was approved because Facebook did the actual manipulation and data collection, even though it also claimed that the study was partially funded by the Army Research Office and would therefore have fallen squarely under the rules of the Office for Human Research Protections. Then, when caught in a lie, it decided that IRB approval was not really necessary because the study received no Federal funding. Apparently ethical lapses are okay at Cornell. But then the study turns out to have been funded by tobacco use cessation money, which comes mostly from Federal funding. Come on, Cornell, at least get your story straight.
This is a complete "cluster f++k" on Cornell's part. Any research that is "experimental" rather than "observational" requires IRB approval, on both legal and ethical grounds, at any serious research institution. No exceptions! This requirement kicks in if the institution receives even a penny of Federal funding, not just if the project itself receives Federal funding.
Feeding people false information to gauge their reactions is one of the most ethically challenged types of social research. It is nothing more than a thinly laundered version of the grossly discredited Milgram experiment. You have to realize that causing people in the Milgram experiment to do unethical and immoral things to other people, even if only imaginary, can cause serious and lasting damage to those individuals. Much of the IRB protection regime is based on preventing repetitions of the Milgram fiasco.
But even more than this, the Cornell IRB, the editors at PNAS, and even much of the press missed another major issue: serious and disabling conflicts of interest, and major lapses in proper scientific rigor. The research was apparently funded by Facebook and by money "laundered" illegally from state and federal tobacco settlement funds. No one asked the question Cicero always asked: "Cui bono?" Who benefits? Follow the money!
Facebook has been touting to political services its intention of marketing a "Facebook campaign package" that "goes a step beyond political polls" by actually using "scientific methods" to "alter the public perception of candidates and issues in real time". One observation that readily supports the idea that this was groundwork for such a political intervention: most of the "news" that was manipulated concerned the ACA (Obamacare).
So, Cornell, come clean! You helped launder illegal research funds for an unethical and illegal study that used public funds to benefit Facebook in its political activities. And you stand behind that process still!
At this point it is time to consider serious research sanctions against PNAS (or its editors), Cornell, UCSF, UCSD, and Yale, all of whom participated in this or other illegal and unethical Facebook "research".