Another reminder that bias, testing, diversity is needed in machine learning: Twitter's image-crop AI may favor white men, women's chests

Twitter says its AI that automatically crops images in tweets didn't exhibit racial or gender bias when it was developed – even though in production it may prefer to crop out dark-skinned people and focus on women's chests. The social network acknowledged it has more work to do to address concerns. "Our team did test for bias …

  1. Steve K
    Coat

    Rebrand opportunity?

    They could make a clean breast of it and rebrand as Titter?

    1. Version 1.0 Silver badge

      Re: Rebrand opportunity?

      Or maybe they'll drop the "titter" letters and rework the remaining letter into a shapely logo? It is a good deed to forget a poor joke - Brendan Behan.

      But seriously, what does this sort of programmatic error tell us about the code writers? If women stood a better chance of getting a job as a programmer, would this stupidity vanish?

      1. Robert Grant

        Re: Rebrand opportunity?

        No one wrote code that explicitly does this. An AI was trained on a set of images. You're a million miles off.

        1. This post has been deleted by its author

        2. sabroni Silver badge

          Re: No one wrote code that explicitly does this.

          So the humans responsible can wash their hands of it?

          The fact it doesn't do what they built it to do isn't something to be proud of.

    2. Anonymous Coward

      Re: Rebrand opportunity?

      Don't make fun of them. They did their breast.

  2. Pascal Monett Silver badge
    Coat

    "focus on women's chests"

    So it's a teenage AI, then?

    1. Jim Mitchell

      Re: "focus on women's chests"

      At its age, it might still be nursing.

  3. Mike 137 Silver badge

    "the code Twitter uses to select the viewable portion of images displayed in Twitter feeds"

    Whose blooming image is it anyway? What earthly right has the twit to decide which parts of an image to display? By all means suppress the entire image if it doesn't pass "censorship", but displaying a tampered version is an infringement of the originator's right to freedom of expression, as it can change the meaning of the image.

    As a demonstration of how bad this can get, there's a fascinating book, The Commissar Vanishes by David King (Henry Holt, New York, 1997), which shows several series of group photos of Soviet officials in which members of the group are progressively replaced by blank space (or, in one case, a pot plant) as they were eliminated in Stalin's purges.

  4. TheMeerkat

    Why is there that “racism” brigade that always assumes “racism” is somehow involved in everything? They always sound like extremely biased and bigoted people.

    Why not accept that the reason might be that the task of recognising black faces is simply more challenging due to light properties?

    It is a difficult task; it needs to be and will be addressed, but it will take time.

    1. Anonymous Coward

      >>>Why not accept that the reason might be that the task of recognising black faces is simply more challenging due to light properties?

      More challenging than pulling very poor excuses for bad and sloppy ML system design & testing out of one's arsehole?

    2. Anonymous Coward

      It is a difficult task; it needs to be and will be addressed, but it will take time.

      Great phrase. I can just hear someone rolling that out for the abolition of slavery, votes for women, accessibility for the disabled, and so on. To paraphrase: I'm fine with the status quo; we'll get to you lot later. Now imagine how you'd feel when consistently told that you and your needs are a lower priority because 'difficult'.

      1. Anonymous Coward
        Trollface

        Re: It is a difficult task; it needs to be and will be addressed, but it will take time.

        "Now imagine how you'd feel when consistently being told that you and your needs are lower priority because, 'difficult'."

        That's how the rest of the world sees the English, isn't it?

      2. find users who cut cat tail

        Re: It is a difficult task; it needs to be and will be addressed, but it will take time.

        Even if the system had been developed by aliens who had never heard of human races, its performance in poor light conditions (backlight, low light, …) would deteriorate more quickly for dark-skinned faces. That's just how optics works.

        First you have to admit this. Then we can talk about what to do about it. Shouting racism does not help anyone.
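
        As a back-of-envelope illustration (the reflectance numbers are hypothetical, not measurements): photon shot noise gives

        \[ \mathrm{SNR} \propto \sqrt{N_{\mathrm{photons}}} \propto \sqrt{\rho E} \]

        where \( \rho \) is skin reflectance and \( E \) is the illumination reaching the face. Taking, say, \( \rho \approx 0.6 \) for light skin and \( \rho \approx 0.15 \) for dark skin, the darker face starts at \( \sqrt{0.6/0.15} = 2 \) times lower SNR at any given light level, so detection quality falls off sooner as the light gets worse.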

        1. Anonymous Coward

          Re: It is a difficult task; it needs to be and will be addressed, but it will take time.

          Or perhaps we should think before deploying a system guaranteed not to work for a percentage of citizens, paying customers, or whoever the intended audience is, a percentage which might be the majority globally.

    3. sabroni Silver badge

      re: that “racism” brigade

      Oh, the woke brigade. Those who don't like to watch police slowly suffocate black people to death.

      Fucking liberals.

    4. anononononono

      Why not just put a disclaimer on tech that only works for white men? Then you could safely say "my tech only works for white men", and no one could accuse you of being racist or sexist.

  5. Anonymous Coward

    "The question is, who are the test subjects?" she said. "If they are predominantly straight white men who are ogling at women's chests and preferring to look at white skin …"

    Ah, the permitted stereotypes.

  6. parperback parper

    Saliency bias

    Nobody is going to notice if the AI gets it right.

    They certainly will when it gets it wrong.

    There is no safe false-matching rate, and no matter how hard you try, your system is going to end up making visibly racist/sexist decisions.

    Perhaps people should take this into account when deploying an AI system.

    A system that makes obvious but stupid decisions is going to get less flak than one that makes opaque but mostly correct ones.
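
    To make the mechanism concrete, here's a minimal sketch of saliency-driven cropping, assuming some upstream model has already produced a per-pixel saliency map (the function name and the brute-force scan are illustrative, not Twitter's actual code). The point: whatever the saliency model over-weights is exactly what survives the crop.

      import numpy as np

      def best_crop(saliency, crop_h, crop_w):
          # Exhaustively scan crop positions and keep the window with the
          # highest total saliency. Any bias in the saliency map is
          # transferred directly into the chosen crop.
          H, W = saliency.shape
          best_score, best_pos = -1.0, (0, 0)
          for top in range(H - crop_h + 1):
              for left in range(W - crop_w + 1):
                  score = saliency[top:top + crop_h, left:left + crop_w].sum()
                  if score > best_score:
                      best_score, best_pos = score, (top, left)
          return best_pos  # (top, left) of the winning crop window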

  7. Anonymous Coward

    They must have been mistakenly using the data set from the dev machine...

  8. Christoph

    Not surprising

    Look at photos of people in newspapers, magazines, and news websites. Photos of men show the face. Photos of women show the face and upper torso. Any algorithm that uses any of that data is going to show that bias.
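
    A toy illustration of that feedback loop (the numbers are entirely hypothetical): if press-photo crops of men typically end at the chin while crops of women extend to the upper torso, a model fit to those annotations learns the same offset.

      import numpy as np

      # Hypothetical training annotations: bottom edge of the "salient"
      # region as a fraction of image height (0.0 = top of frame).
      crop_bottoms = {
          "men":   np.array([0.40, 0.42, 0.38, 0.41]),
          "women": np.array([0.60, 0.65, 0.58, 0.62]),
      }
      for group, bottoms in crop_bottoms.items():
          # A model that regresses to the training mean reproduces the skew.
          print(group, "learned crop bottom ~", round(float(bottoms.mean()), 2))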

  9. Dwarf

    B oo b

    So named as, apparently, it's the top view, front view, and side view.

    Although if you believe the handle, then it must follow that I'd appreciate the front view bit more than most :-)

  10. maffski

    Surely the most important question is...

    Why prat about with this AI learning to crop an image when you could just scale it to fit?
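
    For reference, scale-to-fit is a one-liner next to a trained cropper. A minimal letterbox sketch with Pillow (the padding colour and centring are arbitrary choices): shrink the whole image to fit the target box instead of letting a model decide what to throw away.

      from PIL import Image

      def scale_to_fit(img, box_w, box_h, fill=(0, 0, 0)):
          # Preserve the aspect ratio, scale to fit, and pad the remainder,
          # so no part of the picture is ever cropped out.
          scale = min(box_w / img.width, box_h / img.height)
          resized = img.resize((int(img.width * scale), int(img.height * scale)))
          canvas = Image.new("RGB", (box_w, box_h), fill)
          canvas.paste(resized, ((box_w - resized.width) // 2,
                                 (box_h - resized.height) // 2))
          return canvas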

  11. J27

    Seeing as they haven't fixed the bias in the camera sensors, it's pretty much impossible to fix the after-effects. They just don't capture the same level of detail in darker skin tones.

    That example photo is pretty bad too. Backlit, with half the man's face in shadow. If they wanted to make a point they should have compared it to a similar shot, not one that was evenly front-lit.

    There are a lot of factors at play here, not just a single algorithm, and this coverage is oversimplified.

    1. Daedalus

      Backlighting

      Yep, backlighting is the curse of imaging at all levels. Too many video conferences are shot against a bright light, resulting in dark faces all round. Never mind the melanin, feel the burn. As usual it's the manglers and sales droids to blame; they don't understand that technical stuff. And as for the ones who call in from their car on the way to the airport..... must control... fist... of death.....

  12. ericsmith881

    Seriously?

    Oh, for crying out loud... the issue here is one of getting proper contrast for facial recognition. Anyone who's done facial recognition knows this. The whole "pale globe" comment is ridiculous. The picture in question shows a dark-skinned man in a very backlit situation, something difficult for *any* algorithm to lock onto. You'd get the same results if a white person were silhouetted. If it can't detect facial features due to poor lighting, it's going to lock onto anything that remotely resembles a face. It's not "choosing" a pale globe because it hates black people.
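
    For what it's worth, a crude check for that backlit failure mode looks something like this (grayscale input; the thresholds are entirely made up for illustration):

      import numpy as np

      def looks_backlit(gray):
          # Crude heuristic: a large share of blown highlights while the
          # typical pixel is dark is the classic backlit silhouette,
          # i.e. exactly where face detectors lose their features.
          g = gray.astype(float) / 255.0
          return np.median(g) < 0.25 and (g > 0.9).mean() > 0.10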

    Yet another example of people getting offended about something simply because they're looking for reasons to get offended.

    1. anononononono

      Re: Seriously?

      If the image detection doesn't work, why use it? And why come here to get all offended about reading an article, then complain about people looking for reasons to get offended?

    2. Anonymous Coward

      Re: Seriously?

      Bullshit. The issue is primarily feeding pictures of white people as training data. You're telling me that all this fancy "AI" can't spot the difference in colours? It's a computer; it is capable of spotting differences the human eye can't pick out. If the pixels are not identical, it can "see" the difference.

  13. IGotOut Silver badge

    Didn't show in testing

    Odd that a mainly white male test team* didn't notice these issues.

    * Based on a biased view of who was doing the testing.

    1. Mark192

      Re: Didn't show in testing

      Go read the article again, and look at the bias testing they've done.

      1. Anonymous Coward

        Re: Didn't show in testing

        "Works on my box" becomes "Works on my face". They clearly haven't done enough of the right testing, hence why it picks the picture of the male US politician with huge fake breasts added in over the same male US politician.

  14. codejunky Silver badge

    Ha!

    Bias is in machine learning. In a neural net there can be bias weights, yes. Oh sorry, you think the computer cares about your feelings? Cry me a river, just not on the important electronics.
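
    For the avoidance of doubt, the only "bias" a neuron actually has is an additive constant (toy numbers, obviously):

      import numpy as np

      x = np.array([0.5, -1.2, 3.0])   # inputs
      w = np.array([0.8, 0.1, -0.4])   # learned weights
      b = 0.7                          # the bias weight: just a learned offset
      y = np.dot(w, x) + b             # no feelings involved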

    There is always one moron who is gonna claim racism. Reminds me of an idiot in a checkout line thinking he was stopped from buying the single can from a multipack (that he drank before reaching the checkout) because he was black. Moron knows no colour.
