Images of women coerced by adult companies poison dataset popularised by deepfake smut creators

Thousands of nude images in a popular dataset designed to train machine-learning models to create AI-generated adult content were taken from porn production companies accused of sexual abuse. The images, reviewed by Vice's Samantha Cole, come from Czech Casting and Girls Do Porn – companies that have been associated …

  1. Mage

    crowdsourcing the job gives Google

    And people are naive enough to do it. This sort of exploitation, and much else from Google and all of Facebook's empire, needs to be banned to protect the majority of humans who don't understand what is happening.

    1. Anonymous Coward

      Re: crowdsourcing the job gives Google

      And now they want you to pay to do it too, as it's no longer unlimited space to store your photos for their training data set.

    2. Anonymous Coward

      Re: crowdsourcing the job gives Google

      "Engineers will no doubt have to clean the data further before it's used to train the machine learning models."

      Not to give anyone ideas or anything, but this suggests a certain response.

      1. Anonymous Coward

        Re: crowdsourcing the job gives Google

        I'd wager that idea crossed most of our minds!

      2. Anonymous Coward

        Re: crowdsourcing the job gives Google

        Are the engineers going to wipe it on the curtains?

    3. Tomato Krill

      Re: crowdsourcing the job gives Google

      The fact it immediately follows the Russian story says everything...

  2. macjules

    Police in Moscow are investigating ..

    Is that their way of saying, "We are busy drinking coffee and will give the matter the urgent attention it requires"?

    1. Chris G

      Re: Police in Moscow are investigating ..

      You would get a similar reaction in the UK, but it would be tea, not coffee; if you're really lucky you might get a crime number for 'stuff'.

      Given the high ratio of cams:people in the UK, I suspect it would be doable there too. If it isn't already.

      As a side note, Telegram is used all over the world and is owned by two Russian brothers who live in Switzerland, so it is not strictly a Russian messaging service. The Russian government is apparently not that happy with it because of the encryption and the lack of rear entry.

      1. Jedit Silver badge

        "The Russian government is apparently not happy because of the lack of rear entry."

        Is that their main concern with Telegram, or with Czech Casting?

    2. You aint sin me, roit

      Re: Police in Moscow are investigating ..

      "We're busy working on a lucrative business tracking people from the photos interested parties send us... it's all legitimate, the fee just covers administrative costs."

    3. Mike 137 Silver badge

      Re: Police in Moscow are investigating ..

      ""We are busy drinking coffee..."

      Tea surely?

      1. Why Not?

        Re: Police in Moscow are investigating ..


  3. Anonymous Coward

    My pics are easy to transcribe.

    That's because they're all the porn they want to identify. Have fun sorting my smut!

  4. MiguelC Silver badge

    "As the faces and bodies are based on real data, it's possible that the deepfakes could resemble a human enough that people mistakenly believe it's someone they've seen in real life."

    You know when you see someone (IRL) that you think you know but in reality don't? No need for deepfakes for that; human phenotypes are not unlimited, and two unrelated people might look alike.

    1. Jason Bloomberg Silver badge

      And, even if someone does look like the deepfake you were busy shuffling over the night before, that doesn't necessarily put them at risk.

      The kind of people who would equate being in a porn film with giving permission for sexual assault are hopefully few and far between, and likely don't care who their victims are anyway.

    2. Anonymous Coward

      I actually have found porn of a married couple that I know. They're a bit younger in the video than they were when I met them, so I'm thinking they made it when they were in college. It looks like it was taken with a webcam. They're both teachers now. I sometimes wonder how many of their students have stumbled across it.

    3. Anonymous Coward


      I went to college with my doppelgänger. It was rather eerie meeting him. The main ways people could tell us apart were that I had a goatee at the time (I guess I'm the evil twin?) and that I have a much deeper voice.

      I was once stopped by a couple of girls who insisted I knew them in high school. I couldn't convince them otherwise: wrong county, never mind wrong school. Even showing my driver's license didn't help; they thought it was a fake ID for buying booze. (Never mind that it said I was underage...)

      1. Anonymous Coward

        Re: Doppelgängers

        When I moved to Atlanta in late 1993, I went into several different restaurants downtown, in Alpharetta, in Duluth, and in Sandy Springs (which are all over the map) and had staff tell me they didn't expect to see me, or ask how I had been doing since _____. As near as I could tell, I looked like three other people in Atlanta*.

        * - and fortunately, they all seemed to be non-serial-killer types...

      2. fiskrond

        Re: Doppelgängers

        A mate of mine once sent me a pic of someone taken in an international airport... even 'I' thought it was me, especially as he was intently looking at what appeared to be my then-current phone (Nokia 7650), with the same haircut, glasses, and similar attire...

  5. TeeCee Gold badge

    You're training an AI algorithm to recognise pr0n pics. You show it a load of pr0n pics.


    Where those pr0n pics came from is of no interest, and nor should it be. You must have a wide selection of realistic and genuine data for training. Somehow "cleaning" the data according to some holier-than-thou ruling merely serves to bias the dataset, with inevitable undesirable effects on the resulting algorithm.

    You may not like the original source of the data, but it already exists to be used. If you want your AI algorithm to recognise that sort of picture, it has to be trained on it.

    1. Falmari Silver badge

      “You must have a wide selection of realistic and genuine data for training.” Even if that data has been obtained or created illegally, through rape, trafficking, etc.?

      So you are saying it is OK to create a product and make money from rape, trafficking, etc., using data you have no right to?

      Also, how are they obtaining this data? Just scraping it without the sites' permission? Stealing it?

      Or buying it from the sites? Funding rape and trafficking.

    2. Anony Mice

      The problem with using data from dubious sources, especially when it has been obtained in bad faith & with questionable consent, is that it gives incentive to the continuation of said practices. It's far easier to harvest data if you have a complete disregard for people's rights.

      The only way to guarantee that data harvesters follow good practice is to make data obtained through questionable practices untouchable.

    3. General Purpose

      If you use images of coerced and raped women to train your AI to make deepfake pr0n, then supply that as AI-generated pr0n saying no humans were harmed in its creation, you're not providing "meat-free" pr0n and arguably you're surreptitiously training people to get off on images of coerced and raped women.

      1. NATTtrash

        Hmmm, I can see an even more disturbing future there. As we all know here, "some" are easily persuaded to believe that some net-propagated opinion/behaviour is the new normal/truth. Deep fakes are limited only by the imagination of the "creator", and can be applied to cater to any individual, highly personal preference. And (IMHO) that's fine; everybody is entitled to her/his own poison. But... The adult industry is there to make money. And the top-shelf days are over, as my granddaughter can show you in 3 mins. So what if this very personal view is rolled out widely because there is money to be made, and becomes regarded as the new normal? After all, deep fakes don't complain about rape/amputating a ball and letting you eat it/any "alternative fact" you can come up with yourself...

  6. fidodogbreath Silver badge

    All of this will make it easier for Google to train its algorithms to sort through your albums when you're looking for photos from events like Christmas parties or Thanksgiving or particular people or objects to track your location and offline contacts in order to show you more-relevant ads for things you bought two months ago.


  7. Anonymous Coward

    Face recognition

    I wish we had that service in the EU (where I live).

    I would then pay to get the data of every MEP who supports TERREG and the various other snooping laws, and send the results to them, see how they like it.

    1. Anonymous Coward

      Re: Face recognition

      Can't you set up a business account on Facebook and then target your advertising selectors to match only them? Maybe one could switch selectors on and off, à la the game "Mastermind", to see if some are more interesting than others.

  8. Anonymous Coward

    Looks like I'm falling behind the times

    In my day only the boobs were fake, not the whole actor.

    1. FlamingDeath Silver badge

      Re: Looks like I'm falling behind the times

      Everything is fake now

      It turns out that humans have a propensity for telling lies

      What I can’t understand is “women who said they were tricked and forced into shooting porn videos”

      How fucking thick were they?

  9. Anonymous Coward

    Uber (sic)

    So what's left of it then? A phone application and the backend to run a short-term temp agency?

    Can't say I'm disappointed, mind.

    1. Sorry that handle is already taken. Silver badge
  10. Anonymous Coward

    Give Google data

    Hmmmm, no.

  11. Ashto5 Bronze badge

    Google have enough data

    Help Google?

    No thanks, I will pass on that.

    Every website I visit is passed to Google, even when I use a different browser.

    Google Analytics is everywhere; it's like some sort of virus, just slurping the data up.

    I now have blocks on Google URLs, but they keep changing them (hardly a reputable action), and none of the GDPR cookie consents EVER turns GA off.

  12. Anonymous Coward

    F*ck crowdsourcing and surveys

    Frankly, Google can stick its attempt to make me work for them where the sun doesn't shine, and the same goes for the new spam: surveys.

    If you want my time and opinion as a commercial outfit, you pay for it. Otherwise I think you should pay for the illegal use of my personal data and my time for (a) zapping your request, (b) setting up a filter that ensures that any further shit doesn't get far and (c) adding the outfit to my greylist, a list of companies that I have to do further benefit/risk analysis before I deal with them again. NONE, and by that I literally mean ZERO of the outfits that sent me a survey ever asked permission for that when they collected my information (and I know this because I read what I give permission for), yet they expect free use of my personal time and computing resources. I now tend to give out temporary addresses so I can track who has been leaking my data to others (as that also happens).

    Oh, and no, I won't unsubscribe. That merely confirms someone uses that address, and thus ups its value for resale. Remember, you're spamming, and unsubscribing from spam is one of the surest ways to ensure the floodgates truly open. You should not have sent it in the first place.

    Dammit, I blew my weekly rant quota on this. Grr.

    Beer. I need beer.

    1. Barking mad

      Re: F*ck crowdsourcing and surveys

      Alternatively, label them incorrectly.

      Picture of a bus -> giraffe

      That sort of thing.

  13. This post has been deleted by its author
