Google says it'd love to release its photorealistic DALL·E 2 rival – but this AI is too prejudiced for you to use

DALL·E 2 may have to cede its throne as the most impressive image-generating AI to Google, which has revealed its own text-to-image model called Imagen. Like OpenAI's DALL·E 2, Google's system outputs images of stuff based on written prompts from users. Ask it for a vulture flying off with a laptop in its claws and you'll …

  1. Juillen 1

    Not the usual tropes again...

    If you look for racism and sexism, you'll find it. Even if it's not there.

    What the training set has done is find the general composition of the set, and render that. If that's what the contribution to the set is, that's what'll be reflected.

    I'm always confounded by AI workers coming up with these exclamations after exposing learning mechanisms to a representative set of data and finding it doesn't come up with what they want. If you want to teach something to come up with the answers you want, you need to put the effort into creating a curated set of information for it to learn from (this is something that every species on the planet learned long ago, which is why they survive and, in cases such as some birds and simians, develop their own actual cultures).

    But, when you select only what you want to see, then you have to understand that it's a product of your own biases. Once you take off the limiters and let it see what's out there, it'll learn things you'd rather it didn't.

    There are, of course, confounding issues (like contrast for some human gene adaptations, which make some faces harder to recognise for example), but those are technical hurdles which you can control for and work with (it's part of learning how to teach a learning system).

    1. Anonymous Coward

      Evolution

      Is all about the fact that, for a given situation, some choices are better than others.

      All it would have taken for you to not be here is for any one of your ancestors, all the way back through your lineage to the first "living" organism, to have made a wrong choice or been in the wrong place at the wrong time before it produced the next link in your lineage.

    2. theOtherJT

      Re: Not the usual tropes again...

      In other words "Damn computer, seeing what's there rather than what we wish was there!"

      That's the uncomfortable thing with AI. It really doesn't care who it upsets. It doesn't have the concept of upsetting. Or who, if it comes to it.

      You get the hard cold facts of your dataset.

    3. Filippo Silver badge

      Re: Not the usual tropes again...

      The sexism and racism are there, in that the training set comes from the real world, the real world has sexism and racism, and the program can do nothing but imitate the training set.

      The weird and stupid thing is that people get surprised by this, when this behaviour is a necessary outcome of how so-called "AI" works. It's just a statistical model with a crapload of parameters. It doesn't buck trends, it follows and reproduces them.

      It's not intelligent, it's a tool, it's as dumb as a hammer. If you hammer your finger, you don't demand a hammer that somehow always magically misses fingers; you learn how to use the hammer.

      You have to use it in a specific way to get the results you want. You most definitely can get a bunch of non-racist, non-sexist pictures out of an image generation model, and you don't even have to curate a set of tens of millions of non-racist, non-sexist pictures. Just make twice the number of pictures you want, and then discard the ones that are racist or sexist.
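
      A rough sketch of that over-generate-and-filter idea in Python, purely illustrative; generate_images() and passes_screening() are hypothetical stand-ins for whatever model and screening step you actually use:

          # Over-generate, then screen: ask the model for roughly twice as
          # many images as are still needed and keep only the ones that
          # pass the filter. Both callables are hypothetical placeholders.
          def generate_acceptable(prompt, wanted, generate_images, passes_screening):
              kept = []
              while len(kept) < wanted:
                  # Request about 2x what is still missing.
                  batch = generate_images(prompt, count=2 * (wanted - len(kept)))
                  kept.extend(img for img in batch if passes_screening(img))
              return kept[:wanted]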

      You want an image generation tool that is both trained on real-world data, and doesn't statistically overrepresent rich white dudes? Yeah, that's the magic finger-missing hammer. You're asking for a technological solution to racism and sexism. It doesn't make sense, and it's not going to happen. Fix society, and then statistics will represent that.

      1. Dave 126 Silver badge

        Re: Not the usual tropes again...

        > There are, of course, confounding issues (like contrast for some human gene adaptations, which make some faces harder to recognise for example

        Also confounding is that cameras and film processing have been calibrated on people with lighter skin. The reference cards that Kodak sent to photo developers only had Caucasian people on them until the late 80s (IIRC, when Kodak brought out a new film). These technical issues could be worked around by a skilled photographer, but often weren't: a school yearbook might have headshots of the white students correctly exposed, and a near silhouette of the darker-skinned students.

        Okay, I assume this training data set didn't use photos from the 1970s, but similar issues exist with digital photography, and with lossy image compression that has finer tonal gradation in lighter pixels (which is why areas of dark mist or space can look banded on lower bit-rate video). Our human eyes (and the invisible-to-us data disposal, post-processing and compositing that our brains do) are very good at getting an 'image' from subtle cues in a scene with high dynamic range when compared to film cameras. But then film cameras don't 'cheat' as our brains do.
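
        A quick way to see that banding effect, as a toy illustration rather than a claim about any particular codec: quantise a smooth dark gradient down to a handful of levels, which is roughly what aggressive compression does to low-contrast shadow areas, and count how few distinct tones survive:

            import numpy as np

            # A smooth dark gradient: brightness from 0.0 to 0.1 across a 1920px strip.
            gradient = np.linspace(0.0, 0.1, 1920)

            # Pretend only ~4 bits of tonal precision are spent on this range.
            levels = 16
            quantised = np.round(gradient * (levels - 1)) / (levels - 1)

            # Only a few distinct values remain across the whole strip, which is
            # exactly what shows up on screen as visible bands.
            print(len(np.unique(quantised)))   # prints 3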

        Sidenote: Dang, I have a Samsung phone, the camera smudges any pixels it thinks are a face, because apparently waxy smooth skin is the fashion in Korea.

        1. Juillen 1

          Re: Not the usual tropes again...

          That was what I was referring to with the adaptation point. It's actually due to the contrast levels of the skin, which are more pronounced in lighter skin tones. It's a feature of any image generation; it was picked up a while ago and branded "racist" (the definition of which is "prejudiced or antagonistic towards a group, often a minority").

          All the school photos I had from back then worked quite nicely for people with all skin tones (and they were from the 70s and 80s).

          While revealing the information that the contrast was harder to process, this is merely a fact (and a "confounding factor"), not discriminatory, prejudiced or antagonistic.

          Did give me a chuckle that your phone smudges pixels like that.. As far as I view things, facial recognition is still in its primitive stages for learning systems, and probably will be for years, if not decades to come.

          Image recognition has come a long way from when I cut my teeth on it in the late 80s though, that's for sure..

      2. Juillen 1

        Re: Not the usual tropes again...

        Do you have stats on that? Everything I see has a fairly representative balance. And where are you obtaining the data about "rich white dudes" (which you seem to use in a disparaging way, thus coming very close to racism and sexism yourself; it's not just about minorities in the actual definition of the word)?

        Now, you're saying "curate a set of images". This is of course subjective. What one person says is racist or sexist is highly unlikely to be what another person says it is. So what you're doing is introducing a bias into a set that previously worked on unfiltered information. Not by a measurable metric (which would be fine), but by an arbitrary subjective opinion.

        You could, of course, end up with a set of everyone in a business suit in a particular environment. Which would then render your set with flawed information to work from, potentially not being able to discern other alterations/environments correctly.

        And as to "Fix society", what would you say needs fixing (there's a lot, yes, as we're fairly primitive still, but I was wondering what you were alluding to in this instance).

        1. Filippo Silver badge

          Re: Not the usual tropes again...

          Read the article. It's the developers themselves that are stating their model has racial and gender biases. If you think they are wrong about their own software, that's a problem between you and them. It would be a somewhat weird position to take, but I've seen weirder.

    4. sabroni Silver badge
      Boffin

      Re: If you look for racism and sexism, you'll find it.

      Because it's there.

      1. Juillen 1

        Re: If you look for racism and sexism, you'll find it.

        Proof?

        1. JDX Gold badge

          Re: If you look for racism and sexism, you'll find it.

          Sorry, are you asking for proof racism and sexism are real?

      2. Anonymous Coward

        Re: If you look for racism and sexism, you'll find it.

        "Everything is racist" - left-wing lunacy, helpfully compiled by Titania McGrath:

        https://twitter.com/titaniamcgrath/status/1281023987242487808

  2. Ken Moorhouse Silver badge

    uncovered a wide range of inappropriate content

    Maybe this is the epitome of AI, because this is what people think, at the rawest, most basic level.

    Convolve the Bible, Qur'an, etc. into the mix and this would help rectify the "rules", to make the output more palatable for "family" consumption, but it would still be limited by who the human beings are that feed it. It also depends on how many texts are fed into the mix (the etc. I mentioned at the beginning of this para).

    1. Anonymous Coward

      Re: uncovered a wide range of inappropriate content

      So if AI is going to follow the rules, it's going to be very interesting in America. For example, the Republican bill to keep transgender women and girls in Louisiana from competing on college and K-12 athletic teams has just won final legislative passage ... so the only way for AI to verify that transgender students are not competing will be to discard all clothing.

      1. Anonymous Coward

        Re: uncovered a wide range of inappropriate content

        My (uninformed) understanding is that post-op trans people look much the same as people who were born "that way", so your ancient Greece style photos wouldn't prove anything.

        1. tiggity Silver badge

          Re: uncovered a wide range of inappropriate content

          @AC There are a few diagnostics that will be clear despite surgery, e.g. the differences between male and female pelvises, the most obvious one visible to anyone without medical knowledge, but there are plenty of more subtle indicators that experts can spot.

          .. When a skeleton is found, it can be "sexed" visually by medical experts (without even needing to check whether any DNA is available).

    2. Youngone Silver badge

      Re: uncovered a wide range of inappropriate content

      I'm not sure about the Qur'an, but having read the Bible I can tell you it is decidedly not "family friendly".

      1. Paul Crawford Silver badge

        Re: uncovered a wide range of inappropriate content

        Or view it in Lego recreations:

        https://thebrickbible.com/

  3. cosmodrome

    Fed with images scraped from the internet. A typical case of "garbage in, garbage out". The vast majority of "images from the internet" are stock photos showing what a certain target group wants to see. So what do you expect from an AI fed with the exhaust from advertising agencies and their underpaid minions? If only Douglas Adams were still among us - he'd certainly find the right words for the kind of "reality" reflected by this kind of input.

  4. Anonymous Coward

    Great new technology

    And we all know what it will be used for

  5. The Bobster

    Not sure what those teddy bears are doing, but there's no 400m butterfly in the Olympics or any other international meet. Those poor teddypaws would be tired out after doing the regulation 200m; making them swim double that would be cruel and unusual punishment.

    1. I ain't Spartacus Gold badge

      Also, they're going to absorb all that water and sink. Plus no teddy ever quite recovers from being dunked in a swimming pool.

  6. JDX Gold badge

    Pretty impressive

    How can I have a go on it?
