Google Cloud's AI recog code 'biased' against black people – and more from ML land

Here's your latest summary of recent machine-learning developments. Google Cloud’s computer vision algos are accused of being biased: An experiment probing Google’s commercial image recognition models, via its Vision API, revealed the possibly biased nature of its training data. Algorithm Watch fed an image of someone with …

  1. Andy Non Silver badge

    ED 209

    Please put down your weapon, you have 20 seconds to comply...

    But I'm a doctor, and this is a temperature gun.

    You now have 15 seconds to comply...

  2. Anonymous Coward

    It is probably more that visible light is biased against black people. And dogs. And cars. And … All of these things are harder to photograph well. So the average photograph generally isn't very good. And if you are using average photographs to train your AI you will get worse results with black people. And dogs. And cars. And....

    1. Giles C Silver badge

A friend's dog is biased against black dogs (he is a white Staffordshire terrier); other dogs are no issue, but black dogs he doesn't like.

They reckon it is because it is harder to read the intent of a black dog as opposed to one with lighter-coloured fur. Presumably, when you see the teeth you can't tell if it is yawning, letting its tongue hang out, or getting ready to fight, and you can't see the eyes against the fur....

      So perhaps the data set was trained using dogs.....

    2. a_yank_lurker

      Photographic Quality

If the Chocolate Factory is using snapshots from Instagram and the like, they are truly idiots. Most people do not know how to operate a camera and thus use whatever settings the camera/phone selects. Also, most do not use a quality camera with a very good sensor and lens. Up to a point, image quality is affected by the quality of the device (sensor size, lens, etc.) as well as the skill of the user. And I have not even addressed lighting conditions and how to compensate for them.

To properly 'train' the artificial idiocy system you need high-quality portraits of a large number of people, taken from different angles. This costs real money. The reason they sort of get away with it on lighter-skinned people is that there is naturally better contrast in the face than with darker-skinned people. Basically, they got lucky. But the poor quality of the original images means more error in the system, and depending on what you need the system to do, that error could be unacceptable.

      1. Alister

        Re: Photographic Quality

        I disagree. If you only use studio quality photographs as your training set, then the AI's performance when faced with real world photos - from CCTV for instance - will be even more error prone.

  3. Flocke Kroes Silver badge

    Be careful when getting some coffee

    Rear view biometric access has already been installed in certain secure facilities for years.

  4. John Brown (no body) Silver badge

    why store data in the cloud?

Why does my arse related info need to be stored in the cloud? I'm guessing this Internet Of Shit device is going to be very expensive and contain expensive sensors and some significant processing power in-situ (see what I did there?), so how hard is it to have some on-board storage (ooo-errr) like an SD card, and maybe a simple webserver to serve up your business to you on demand?

    1. macjules

      Re: why store data in the cloud?

      Anal recognition software: "Please move your anus around until we have recognised it.". "Thank you, would you also like to use your anus to unlock your iPhone?"

      1. Anonymous Coward

        Re: why store data in the cloud?

        Isn't it always an anus that unlocks an iPhone...


        1. baud

          Re: why store data in the cloud?

          At least the onus will still be on the user for this

      2. Morten_T

        Re: why store data in the cloud?

        I reckon that version number 2 will be called Anal Recognition Software Enhanced :D

    2. veti Silver badge

      Re: why store data in the cloud?

A toilet is not exactly a friendly environment for most things; the less you have to put in there the better. And outside the khazi itself is the bathroom, which is not much better from the perspective of electronics.

      What bothers me is the thought of what it will take to keep it clean.

  5. cornetman Silver badge

It seems to me that the researchers are showing their own bias here.

    My (obviously less politically motivated) take is that the model more accurately identified what was actually a gun (albeit a temperature one) when held by someone with dark skin.

    So perhaps the headline should actually say, "Model more accurate when dark skinned person in the scene".

  6. Robert Grant

    The scans — both finger and nonfinger

    Sorry - how does the finger "scan" my bum?

    1. Robert Grant

      "One thumb up" - sounds as though I have my answer.

  7. ExampleOne

    The toilet one sounds like an attempt for an Iggy...

  8. SVV

    as it turns out, your anal print is unique

    This has just given me an idea for a fun art project that everybody can try at home during the lockdown.

  9. LaFiend

    Imagine the final invoice for one of the smart toilet seats if it were procured by NASA.

  10. ibmalone

    However, if we see signs of COVID-19 pneumonia on chest x-ray, which may be picked up by the AI algorithm, we may decide to test patients with RT-PCR who have not yet been tested, or re-test patients who have had a negative RT-PCR test already. Some patients have required 4 or more RT-PCR tests before they ultimately turn positive, even when x-ray or CT already show findings

I can't help but suspect a certain amount of corona-wagon jumping is going on at the minute. A patient turns up in the middle of a pandemic with breathing difficulties, so they get tested. Maybe you retest a few times due to false negatives, particularly if the X-ray shows pneumonia (as the quote suggests already happens); what an ML widget indicating (non-specific) pneumonia adds in that case is not clear to me. Seems like the right answer would be more tests and more accurate tests.

  11. A-nonCoward

    visiting the loo @MIT Media Lab

    Many years ago - I was young and clever and thought I would be given the red carpet - didn't get past the front desk.


    Being generous by nature, I decided to contribute, so visited their "facilities".

When in the proper enclosure and assuming the position, there was this label at eye level: "The cameras are for research purposes only."

    At the time I thought it was just a cute sophomoric prank, go around the toilets with stickers, make lower life forms feel insecure. (I know the feeling, I mean, of making others feel insecure, like when I get nude to change in a swimming pool locker.)

    Later on I learned that some MIT people are subject to raaather flexible ethics requirements when doing research in Africa, but that's another story.

  12. mrobaer

    Recurring trend?

    A previous El Reg article reported similar troubles with racial profiling AI.

  13. tesmith47

Dark-coloured folks identified with a gun, not a banana or some other innocuous object. Yep, AI bias.

  14. cwadamsmith

    Very informative article!!!

  15. Gadbous

    All your gun are belong to us.
