Once again, racial biases show up in AI image databases, this time turning Barack Obama white

A new computer vision technique that helps convert blurry photos of people into fake but realistic images has come under fire for being racially biased towards white people. The tool, known as PULSE, was introduced by a group of researchers from Duke University and was presented at the virtual Conference on Computer Vision and …

  1. Anonymous Coward
    Anonymous Coward

    I always knew computers were racist.

    Unlike a piano, all the keys on my keyboard are white.

    1. iron Silver badge

      All the keys on my keyboard are black.

      Your point?

      1. b0llchit Silver badge
        Joke

        Black keys only

        Your point?

        A tune in a limited key, of course.

        Just as limited as neural networks and often sounds just wrong, like the neural networks.

      2. Anonymous Coward
        Anonymous Coward

        No point at all. I was just being a dick.

        Sorry.

    2. monty75

      Mine are a kind of beige/yellow. Think I should probably clean it.

  2. codejunky Silver badge

    Ha

    Didn't Obama use a white man picture of himself to get elected, and make the election about electing a black man?

    Maybe if those offended are really upset they will go and make one that doesn't have such a problem? Maybe? **tumbleweed**

    1. codejunky Silver badge

      Re: Ha

      Is that downvote for mentioning saint Obama's campaign? Or for suggesting those complaining write an AI that doesn't have the issue they complain about?

  3. Cederic Silver badge

    this is not bias

    Results are skewed. That is not indicative of bias.

    People crying wolf about racial bias will lead to real racism being overlooked. Anyway, at the scale included in the article the image of Obama and not Obama are very comparable indeed.

    1. monty75

      Re: this is not bias

      Bias : "systematic error introduced into sampling or testing by selecting or encouraging one outcome or answer over others" https://www.merriam-webster.com/dictionary/bias

      I'd say that definition pretty much covers this case.

      1. Cederic Silver badge

        Re: this is not bias

        By your definition, to be biased the system would have to be rejecting non-white faces in favour of the white ones.

        Is that what's happening?

        I've opened both images in an image editor, used a colour picker to select from the cheek, forehead, chin of both images.

        - Forehead in the light, both are the same colour.

        - Forehead in the shade, both are the same colour.

        - Cheek on the left, the selected image is a darker brown than Obama (but similar).

        - Chin, centre, Obama is a darker brown than the selected image (but similar)

        - Hair, both are the same colour

        Where's the bias? Hell, where's the skew?

        If people think one image is black and the other white, it doesn't look like the problem is in the software.

        1. FeepingCreature Bronze badge

          Re: this is not bias

          Yeah I don't see it either. Both images look equally black.

        2. monty75

          Re: this is not bias

          It's not my definition, it's Merriam-Webster's.

          The bias is in the training. The GAN learns from the training data what a face looks like. If it sees 90% white faces, it will favour white skin in its definition of "face". This isn't a new revelation - it's a well-known phenomenon, as the journal article referred to in the Reg article states. It's basically a manifestation of the old maxim "garbage in, garbage out".
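The skew described above can be sketched numerically. What follows is a toy Bayesian illustration only - an assumption for the sake of the argument, since PULSE itself searches a StyleGAN latent space rather than computing posteriors - but it shows how a 90/10 training imbalance resolves a perfectly ambiguous skin tone to the majority group purely through the prior:

```python
import math

# Toy illustration of dataset imbalance (NOT how PULSE works internally).
# Each face "group" is reduced to a typical 1-D skin-tone value in [0, 1].
prior = {"light": 0.9, "dark": 0.1}   # 90% / 10% training-set imbalance
mean  = {"light": 0.8, "dark": 0.3}   # typical tone per group
sigma = 0.2                           # uncertainty from blur/lighting

def posterior(tone):
    """P(group | observed tone) under a Gaussian observation model."""
    lik = {g: math.exp(-((tone - mean[g]) ** 2) / (2 * sigma ** 2))
           for g in prior}
    evidence = sum(prior[g] * lik[g] for g in prior)
    return {g: prior[g] * lik[g] / evidence for g in prior}

# A tone exactly halfway between the two group means: the likelihoods
# cancel, so the answer is decided entirely by the skewed prior.
p = posterior(0.55)
print(f"P(light | ambiguous tone) = {p['light']:.2f}")  # prints 0.90
```

With a balanced 0.5/0.5 prior the same ambiguous input would come out 50/50; the "garbage in, garbage out" effect lives entirely in the prior.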

          1. Anonymous Coward
            Anonymous Coward

            Re: this is not bias

            You are correct that the training data appears to be biased. But this is not, by itself, proof or an example of that: the tool may also turn white people black, or turn other types of face into something else entirely.

            Proof would require showing the percentage of people it gets wrong. Also, as posted above, I would think lighting plays a part, and may change more than the actual skin tone does!

    2. a_yank_lurker

      Re: this is not bias

      It's not bias per se that is the problem, but that the Artificial Idiocy consistently fails on relatively simple tasks, tasks that a child could easily complete with a much higher accuracy rate.

      1. Anonymous Coward
        Anonymous Coward

        Re: this is not bias

        Children can paint photorealistic mugshots from a blurred image in less than 30 seconds?!

        Which school are you sending your kids to?!

    3. John Brown (no body) Silver badge

      Re: this is not bias

      "at the scale included in the article the image of Obama and not Obama are very comparable indeed."

      If you squint and blur your vision, the general shape is very good. But the skin tones are significantly lighter in the generated image. That's the bias showing. Generally, when upscaling or zooming a poor image to enhance it, you want to create new pixels between the larger pixels which are averages of the general area of the image. How you can interpolate lighter pixels between darker areas such that the average across the whole area becomes lighter is beyond me.

      It'd be interesting to see what it does to a pixelated image of Trump after he's just come off the sunbed and is at his most "orange panda-like" best. It'd probably put glasses on him :-)
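The interpolation argument above can be checked directly: classical upscaling only replicates or averages existing pixels, so the mean brightness of a patch is preserved, and any lightening must come from a generative model's learned prior instead. A minimal pure-Python sketch, using pixel replication as the stand-in for classical upscaling:

```python
def upscale_nearest(img, factor=2):
    """Classic interpolation-free upscaling: replicate each pixel.
    Bilinear would average neighbours instead; either way no new
    brightness is invented, so the patch mean is unchanged."""
    out = []
    for row in img:
        wide = [p for p in row for _ in range(factor)]
        out.extend(list(wide) for _ in range(factor))
    return out

def mean(img):
    """Mean grey value over a 2-D list of pixels."""
    flat = [p for row in img for p in row]
    return sum(flat) / len(flat)

# A small dark-skin-toned patch (8-bit grey values):
patch = [[40, 60],
         [50, 70]]
big = upscale_nearest(patch)
print(mean(patch), mean(big))  # prints 55.0 55.0 - upscaling cannot lighten the average
```

A GAN-based "enhancer" has no such constraint: it synthesizes whole new pixels from its training distribution, which is where a lighter average can creep in.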

  4. gnasher729 Silver badge

    AS - Artificial Stupidity

    What we have right now is not Artificial Intelligence, it is Artificial Stupidity.

    When you see on a computer science site kids asking if some neural network can solve NP-complete problems, then you realise that the problem is magnified by NS (Natural Stupidity).

  5. BazNav

    An end to police racial bias?

    So if the police put in a blurry picture of a BAME suspect then they get a high definition picture of an imaginary white suspect back? Surely this will work to counter-balance any potential discrimination or structural racism and make the world a better place. Or it's just a complete piece of junk that should have been tested properly before being released on the world.

    1. John Brown (no body) Silver badge

      Re: An end to police racial bias?

      From what I read on the Beeb, it was rejected for publication and the university has distanced itself from it.

      1. John Brown (no body) Silver badge

        Re: An end to police racial bias?

        Whoops, sorry. My mistake. The Beeb was talking about facial recognition that claims to tell if someone is a likely criminal by looking at a mugshot. THAT one is in the realms of phrenology!

  6. Whitter
    Meh

    What if the image is only very slightly blurry?

    Maybe it has some legs as a photo sharpening tool rather than trying to patch up a shoddy surveillance cam image?

  7. Henry Wertz 1 Gold badge

    Not racially biased, color-blind

    Honestly, AIs are not racially biased; they can be "color-blind" (ESPECIALLY when photos are taken in varying lighting conditions). They focus on feature recognition, and since they decide on their own what features to look for, they can ENTIRELY miss the point sometimes. So, you look at these photos and obviously it's not the same person. You look at FEATURES, and they are surprisingly similar. The eyes in the photos are not brown and blue; to me they both appear black due to lack of resolution. The ears are very similar, the pose is identical, they have the same hairline (including this triangular bit hanging over the forehead), and in both images the lighting puts a shadow in the top-right corner, on the right side of the forehead.

    Don't get me wrong, it definitely shows a big problem with facial recognition systems; I'm not a fan of them for privacy reasons either.

    One tale of woe regarding AIs: 10 or 15 years back, the military (don't know if it was US or UK?) was going to test a neural network-based "friend or foe" system. They brought out various airplanes onto the tarmac and took photos to feed in. They trained this thing, tested it in the wild, and it DID NOT WORK AT ALL. It turned out most of the friendlies were photographed in the morning and the rest in the evening, so ALL the AI was basing its "friend or foe" decision on was whether the plane was lit up from the left side or the right side; it was not looking at what kind of plane it was, the plane markings, etc. at all.

    1. Anonymous Coward
      Anonymous Coward

      Re: Not racially biased, color-blind

      Not adding skin tone into the AI's system is part of the bias, I guess. But you do seem correct on the other points.

      Plus, at what level of melanin do you become a different race? If we cannot specify that, why expect an AI to know?

  8. RLWatkins

    This is not "AI-based image enhancement".

    Actual intelligence would recognize the image and fill in the details from memory.

    However, an actual intelligence seeing a face which it couldn't recognize, never having seen it before, would do no better than this.

    As much as we'd love to extract from an image details which just aren't there, and aren't anywhere else, it can't be done.

    Another promise, one which was never believable to begin with, broken.

    Yawn.

    1. Mark192

      Re: This is not "AI-based image enhancement".

      RLWatkins said: "Another promise, one which was never believable to begin with, broken. Yawn"

      Their website states "PULSE makes imaginary faces of people who do not exist, which should not be confused for real people. It will not help identify or reconstruct the original image."

      It does what it says on the tin. Any disappointment on your part is based on a fundamental misunderstanding of what they set out to achieve.

  9. dvd

    Skin tones

    I'm sorry, but I'm just not seeing that much difference between the skin tones of the subjects and the skin tones of the images that the computer picked.

    Maybe this is more about the race hyper awareness of the complainants than bias in the AI.

    1. dvd

      Re: Skin tones

      In fact, I'd go so far as to say that the enhancements to the blurred photos of Obama and AOC are pretty good if you don't bring to the test the baggage that Obama's supposed to be black and AOC's supposed to be Hispanic.

      1. Dr Scrum Master
        Meh

        Re: Skin tones

        O'Bama may be supposed to be black, but with a white mother and a black father he's mixed race.

        1. dvd

          Re: Skin tones

          Well exactly. The narrative is that he's black but the truth is that he's pretty light. AOC is basically as white as me; the features that make her look Hispanic are pretty slight and are gone in the pixellated photos.

          The real story here is that some people just want to make everything about race.

          1. cornetman Silver badge

            Re: Skin tones

            TBH, if you didn't know that AOC was Hispanic, a HUMAN would be hard pressed to determine that from the sample photo.

            The truth as I see it, is that the people making these observations might have an actual point, but their examples are so weak that they are not really making much of a case. Given the fact that both AOC and Obama are famous, it is extremely difficult for people to discount their preconceptions when looking at the performance of the process being debated.

            It would have been better to pick the faces of strangers that are more recognisable as dark skinned to make their point.

    2. harmjschoonhoven
      Boffin

      Re: Skin tones

      OpenCV's Haar filter is designed to recognize faces. Trouble is, the analysis is based on grey-scale images, so a kitchen cupboard with two knobs is also recognized as a face. Now compare the colour of the inscribed ellipse of the "face" to human skin colours, which all lie in a well-defined column in RGB space, and you have a DIY face recognition system.
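The colour gate described above can be sketched with one published heuristic, the Peer et al. daylight RGB rule - chosen here as an assumption, since no specific rule is named in the comment. Skin pixels across ethnicities satisfy a handful of simple RGB inequalities that a uniform grey cupboard door fails:

```python
def looks_like_skin(r, g, b):
    """Peer et al.'s classic daylight RGB skin rule: human skin tones,
    across ethnicities, cluster in a narrow region of RGB space."""
    return (r > 95 and g > 40 and b > 20 and
            max(r, g, b) - min(r, g, b) > 15 and
            abs(r - g) > 15 and r > g and r > b)

# The grey-scale Haar stage happily matches a cupboard door with two
# knobs; a colour gate like this would reject it:
print(looks_like_skin(210, 160, 130))  # skin-ish pixel -> True
print(looks_like_skin(128, 128, 128))  # uniform grey   -> False
```

In a real pipeline this check would run on pixels sampled from inside the detected face region, after the grey-scale Haar cascade stage.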

  10. Anonymous Coward
    Anonymous Coward

    grow the dataset

    Can we assume that the dubious actors behind Zoom (and creators of other videoconferencing stuff) are harvesting video of people's faces to sell for testing/training by facial-recognition software companies? See how well they do with various cats/ dogs/ kids crawling over the keyboard in front of the webcams.

  11. cornetman Silver badge
    Stop

    Not really sure that I understand the accusation here.

    In all the examples presented, the selected images are extremely close matches.

    Now, *we* know that the models are "black" (although most of them here are actually fairly light skinned), but the sample photos are taken with very bright lighting.

    > The computer-generated face obviously doesn't look anything like Obama at all.

    I'm not sure what the article author is looking at, but that match is extremely close to the very pixelated image by the side. *We* know that it is Barack Obama, so we know that the selected image is not Obama, but I don't think that the software is going to have that much contextual information.

    Let's see some examples of what it chooses with a photo that is not taken with very bright lighting of someone that really has dark skin.

    I think the complainants are being pretty disingenuous here.

  12. Mark192

    "Race doesn't matter, race doesn't exist..."

    "Race doesn't matter, race doesn't exist..." people keep telling me.

    "OMG a computer program made a fake person and got the race wrong!" scream other people.

    The program appears to try to identify the skin tone and give features of the appropriate race. Light brown in bright light looks the same as "white" skin in poorer/darker lighting.

    Most lighter-brown people in America will have white ancestry as well as African. In the example of Obama, he had a white mother and a black father. Complaining it made him "black and that's not his race" would be equally valid... except he's chosen to identify as black.

    As much as this shows up the issue of data sets causing bias, it also shows up the trend of making race a discrete characteristic (you're either this or that) when it should be a gloriously messy irrelevance.

  13. Neoc

    Not seeing it.

    Literally.

    I am short-sighted, so I did a very simple experiment - I removed my glasses and looked at the Twitter images. And for the life of me, if I didn't know they weren't the same person, I would have said they were two differently-pixelated pictures of the same person.

  14. disgruntled yank

    To point the moral

    Comparisons are invidias.

  15. Anonymous Coward
    Anonymous Coward

    ..there's some rare amusing fun wisdom in this Artificial Stupidity...

    ..there's some rare amusing fun wisdom in this Artificial Stupidity. Nobel Prize recipient Obama was raised in a white family; he committed his own war crimes by covering up W.'s war crimes and by going after whistleblowers like Pfc Manning, Snowden, Winner, Assange etc., rather than qualifying himself for said sad Nobel Prize by going after ALL the war criminals.

  16. Brewster's Angle Grinder Silver badge
    Joke

    Researchers then tested it with a blurry photo of President Trump. It responded with a picture of a black man, leading them to conclude the algorithm was now smart enough to troll us.

    Seriously, while I'd love to see the fireworks, the man doesn't need any more excuses to act the victim.

  17. Anonymous Coward
    Anonymous Coward

    Half right

    Let's keep it real: his mother is white, so he is half and half. His features and skin tone are a mix of both.

    The facial recognition software should completely skip using skin tone as a basis, because most people are neither straight pale nor pure black, and skin can and does change colour dramatically depending on how much sun you get.

    Mathematically it is a disaster to match on skin tone. It is much more accurate to go by the dimensions of facial features.

  18. Claptrap314 Silver badge
    Black Helicopters

    "Enhance"

    End of line.
