Epic fail, Facebook: FTC complaint against creepy mind games filed

The Electronic Privacy Information Center (EPIC) has filed an official complaint with the US Federal Trade Commission (FTC) over an experiment in which Facebook "purposefully messed with people’s minds." "Facebook altered the News Feeds of Facebook users to elicit positive and negative emotional responses," the complaint …

  1. FormerKowloonTonger
    Trollface

    Facin' IT

    Looks as if that hooded, oh-so-clever Zuckerberg might now be "hoist on his own petard".

    1. dan1980

      Re: Facin' IT

      I doubt very much that anything will really come of this to disturb the status quo. Maybe some more stringent requirements around disclosure, maybe a fine, but nothing that will upset the shareholders or trouble Zuckerberg's yacht fund too much.

      I do award you a point, though, for getting through a whole post (short though it was) without mentioning:

      * Islam/Muslims/al-Qaeda/jihad

      * Wanking/genitals

      * Zuckerberg's age (at least not directly)

      If possible, I'd give you a second point for making it through without a single ellipsis, pun or gasp!

      1. Tom 38
        Headmaster

        Re: Facin' IT

        "Facin' IT" is surely a pun.

  2. Winkypop Silver badge
    Devil

    You had me at: "EPIC FACEBOOK FAIL"

    That is all.

  3. Anonymous Coward
    Anonymous Coward

    Even now I'm not sure that "We can use your data for research" means "we can manipulate your news feed just to see if we can make you happy or sad, or suicidal or mad enough to go out and shoot people" (and no, I don't think those last two happened from their manipulation... but we don't know whether they did, and neither do Facebook).

    I still think they're playing with us all - my FB feed keeps flicking back to "Top Stories" from "Most Recent" and my privacy settings keep putting my posts back to "public"... so I wonder if this is another "let's fuck with their minds" experiment.

    1. dan1980

      @AC

      Absolutely - "research" is one thing but surely this falls under the definition of "experiment".

      1. Anonymous Coward
        Anonymous Coward

        Ethics in Psychology

        > "research" is one thing but surely this falls under the definition of "experiment".

        Yes, "research" is broad to the point where this doesn't look like informed consent (the crucial word being "informed").

        I'm more disappointed by the universities and academics for getting involved. I'm a psychologist at a UK university and I doubt our ethics committee would approve this (nor would I try). I don't think the British Psychological Society's code of ethics would permit this. Any kind of deception, or lack of informed consent, would have to be very well justified in a cost/benefit analysis. Afterwards you'd need to fully debrief people and give them an opportunity to withdraw their data from the study. I'm less familiar with the American Psychological Association's research guidelines and code of conduct, but if any of the academics are APA members then aggrieved parties might consider filing a complaint with the APA and the relevant universities.

        Even if this hasn't caused much actual harm, it has arguably brought psychology and institutions into disrepute. It saddens me because my colleagues and I have spent years trying to clean up psychology. Psychological research is very reliant upon people volunteering their time. How can we expect anyone to participate in psychological research if confidence in ethical standards is eroded?

        1. Anonymous Coward
          Anonymous Coward

          Re: Ethics in Psychology

          According to an academic in Texas, consent was not needed. Mind you, he did seem to be back-pedalling at great speed last time I looked at the comments on his blog posts.

    2. Anonymous Coward
      Anonymous Coward

      ...and my privacy settings keep putting my posts back to "public"... so I wonder if this is another "let's fuck with their minds" experiment.

      No, that's simply turning you back into the advertising medium that you are.

    3. Anonymous Coward
      Anonymous Coward

      I don't think it matters. They were trying to manipulate people full stop.

      That's a company, sold for countless billions, with another supposed billion or two users, that is trying to control people.

      Even if it fails, it's still disconcerting to know they tried it.

      1. dan1980

        @TechnicalBen

        "That's a company, sold for countless billions, with another supposed billion or two users, that is trying to control people."

        To be fair, that's kind of what advertising is.

        The troubling part here is how they went about it. Presumably the idea is not to manipulate people's information in the future, but, now that they have confirmed that positive/negative posts can change someone's mood, they can look at writing software to process posts, judge whether a person is seeing predominantly positive or negative stuff, and then advertise accordingly.
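
        As a rough illustration of the kind of pipeline that would imply (purely hypothetical; nothing here reflects Facebook's actual code, and the word lists, thresholds and ad categories are invented), a minimal Python sketch:

        # Hypothetical sketch only: score each post's tone with a crude word list,
        # then pick an ad "tone" to match the prevailing mood of a user's feed.
        POSITIVE = {"great", "love", "happy", "won", "congratulations"}
        NEGATIVE = {"sad", "angry", "lost", "terrible", "hate"}

        def post_sentiment(text: str) -> int:
            """Crude lexicon score: +1 per positive word, -1 per negative word."""
            words = [w.strip(".,!?") for w in text.lower().split()]
            return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

        def choose_ad_tone(feed: list[str]) -> str:
            """Match the ad creative to the overall mood of the feed."""
            total = sum(post_sentiment(post) for post in feed)
            if total > 0:
                return "upbeat"       # e.g. holidays, gadgets, treats
            if total < 0:
                return "reassuring"   # e.g. insurance, comfort purchases
            return "neutral"

        if __name__ == "__main__":
            feed = [
                "Congratulations on the new job, so happy for you!",
                "Terrible news about grandpa, we're all very sad.",
            ]
            print(choose_ad_tone(feed))  # one positive post, one negative -> "neutral"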

        TV advertisers can do a similar thing simply by choosing which programs their ads are shown alongside.

        I don't think it's a good thing, but it's not right to think that what FB is doing is really that far out.

    4. Anonymous Coward
      Anonymous Coward

      Precisely!

      Much of the debate seems to be missing the point that this is *not at all* the usual "passing on harvested data to third parties" privacy issue (bad enough as that is), but that it lifts the game onto a whole *new* level by actually *changing* data which do not belong to FB, and without either sender or recipient even being made aware of it.

      IANAL but I can't imagine that the boilerplate "It's OK that we sell your data" T&C language covers that level, too. Even "Use for research" of data cannot possibly include their clandestine *modification*.

  4. Anonymous Coward
    Anonymous Coward

    Not the apology I'm looking for

    "We were wrong and we're sorry that we upset you. We won't do it again."

    That would be an apology.

    Telling us that they meant no harm and that they don't think they caused any harm doesn't quite do it for me.

    1. Tom 38

      Re: Not the apology I'm looking for

      Yep, "whoops, too bad" is about the worst apology you can give.

      A couple of days ago a company I've bought e-cigs from decided that the best way to market their crap to me was to give Twitter my email address and full name, so that Twitter could invite me to register and subsequently follow my retailer... it took multiple email exchanges before they figured out that I was upset that they had spaffed my personal details to a third party, not that they had sent me a marketing email every two days.

      Their subsequent "apology" was along the lines of "Well, we've done it now, can't really take it back". Fortunately, they are a UK subsidiary of a US company and I only dealt with the UK company, so I'm seeing how toothless the ICO actually is in dealing with idiots like this. Accidental data losses are one thing; this was wilful.

  5. Anonymous Coward
    Trollface

    "We were wrong and we're sorry that we upset you" -- Wrong, that is exactly what they wanted to do (and for the record, they also tried to make some others happy).

    Do we honestly think that Facebook is the only organisation that deliberately attempts to manipulate our feelings? Manipulation like this is at the core of how our culture works and (especially) how our economy does. Think about those charity adverts on TV which talk about suffering children in Syria or sick children in Great Ormond Street Hospital: they wouldn't get any (or at best very little) financial support if their TV slots were full of laughing and fun times. They need to make people feel sad about what we see on the screen (I'm not being cynical about this; we need bad news in life in order to make decisions for the better).

    However, the government depends on invoking fear in order to get away with removing certain freedoms. Much of the green movement depends on fear (of the consequences of our carbon-vomiting actions) and guilt (for being such carbon-vomiting gits).

    Businesses, government and all manner of (non-government) organisations depend on being able to manipulate our feelings; all that's happened here with Facebook is that we found out about it.

    1. Anonymous Coward
      Anonymous Coward

      But I know that adverts are trying to manipulate me. I know politicos in their broadcasts are trying to manipulate me.

      What FB did was hide good news stories from people's friends just to see if those people reacted positively or negatively. This wouldn't be a problem if they had asked for consent and offered an opt-out, but they didn't.

      1. Gav
        Unhappy

        These things cascade

        But it goes further than that. By "hiding" your friends' news from you, your friend doesn't know that you did not see it. So they are now wondering why you didn't "like" or comment on their new arrival/exam pass/engagement. Or they are disappointed that you have nothing comforting to say about grandpa dying/their divorce/redundancy/horrible car accident.

        So they were also messing with their victims' friends' minds, and their relationships, too. You'd think that if Facebook wanted to become a key means of staying in touch with people, they'd ensure that these things didn't get "lost".

    2. Stoneshop

      Think about those charity adverts on TV which talk about suffering children in Syria or sick children in Great Ormond Street Hospital:

      In most cases those are recognisable as adverts. News items, with footage (partially) provided by the organisation that would benefit from heightened awareness, less so. News providers themselves (sites, newspapers and broadcasters) can and do filter in accordance with their worldview, but one could say that the viewer has the choice to select their news provider, or use several to get a less biased view.

      Facebook surreptitiously manipulating exposure of certain messages to certain viewers is something not at all like the above.

      1. Anonymous Coward
        Anonymous Coward

        Newspapers choose which stories to publish to match their own agenda. That's a problem, but we can mitigate it by choosing what to read. It's a problem that's already there, but we wouldn't want it made worse.

        Communications companies deciding to change, amend and/or block communications between users to match their own agenda is not an existing problem, because until now no one did it (AFAIK the post office does not re-write your mail to "persuade" your voting decisions, etc.) due to difficulty, cost, lack of technology or just plain morals... now it seems that might be changing?

  6. Anonymous Coward
    Anonymous Coward

    ....What Else Is Facebook Doing?...

    Some more interesting studies by the Zucker-Stasi :-

    http://www.bloombergview.com/articles/2014-07-03/what-else-is-facebook-doing

    1. Anonymous Coward
      Anonymous Coward

      Re: ....What Else Is Facebook Doing?...

      And this is also very interesting and a bit scary:

      http://socialmediacollective.org/2014/06/26/corrupt-personalization/

  7. Boris Winkle
    Stop

    Meh.. That explains it then...

    Why I kept receiving racist political bollocks non-stop in my feed. Good to have a cull of people that are too lazy to read what they share, though.

  8. Velv
    Black Helicopters

    Since the IPO took place in 2012 as well, it will not be long before a class action is launched by those who bought shares: this experimentation was not declared in the prospectus and is something that is likely to have a material impact on the share price.

    1. Anonymous Coward
      Black Helicopters

      *Whole Fleet of Helicopters*

      I'm guessing it's why the share price was so high... how much would people pay to be able to control all of FB's customer base?

      If you own McD's you gain some customers and income, you choose their diet perhaps... if you own FB, you choose what?

  9. Anonymous Coward
    Anonymous Coward

    At last, a positive aspect of US gun laws

    It permits Zuckie to shoot himself in the foot. Repeatedly.

  10. Slacker@work
    Coat

    What this site needs....

    ...is LIKE buttons for stories of this ilk.

  11. Anonymous Coward
    Anonymous Coward

    Total non-issue

    People gave Facebook permission to do this the moment they signed up.

    If they don't like it, then they shouldn't use Facebook - it's not compulsory.

    But no, crying all the way to mommy because the big bad mans did something they gave him explicit consent to do.

    Wise up, cretins. On Facebook **YOU** are the product.

    1. Anonymous Coward
      Anonymous Coward

      Re: Total non-issue

      No, I did not. I gave fb permission to use my data for research. That covers things like seeing if my posts change frequency or attitude around external events such as the World Cup. It will cover seeing what sort of news feed messages I block, ignore or respond to.

      It does NOT mean they can go round messing with what I see in my news feed just to see how I react.

      1. Anonymous Coward
        Anonymous Coward

        Re: Total non-issue

        "No I did not.. I gave fb to use my data for research."

        So...yes you did. By your own admission.

        "It does NOT mean they can go round messing with what I see in my news feed just to see how I react."

        Yes it does, if that's what the research is about. End of discussion.

    2. Anonymous Coward
      Anonymous Coward

      Re: Total non-issue

      Completely agree - a total non-issue. Besides, they weren't changing actual 'news', just which stories you receive. And if your sole source of news is Facebook, then you pretty much deserve to be manipulated, as you've already signed away all requirements for impartiality and truthfulness.

      Besides, adverts attempt to do the same sort of thing, and, as we all know, adverts are the arterial blood of the WWW. Every site uses them extensively after all.

      So stop whining. Don't like Facebook or what it does with you? Then take charge of your life and don't use it.

  12. IanW

    One rule for Facebook, 1000x okay for every other media outlet known to man?

    http://www.ianwaring.com/2014/07/04/facebook-mood-research-whos-really-not-thinking-this-through/

    1. Anonymous Coward
      Anonymous Coward

      Re: One rule for Facebook, 1000x okay for every other media outlet known to man?

      Facebook is not a media outlet.

  14. HardCoded

    Think of the kids?

    Anyone a parent to moody teenagers or 20-somethings who spend inordinate amounts of their time online?

    I don't need someone else deliberately messing with our heads. It's unethical, perhaps illegal. I use FB to communicate about business, to chat and to keep abreast of social events. It's a good tool, but when it's manipulated to make you feel negative, that's a problem.

    I'm OK though, my head is already messed up, and I don't need FB to feel "complete".

    The effect may not be significant, it's just something we need to be aware of.
