"crowdsourcing the job gives Google"
And people are naive enough to do it. This sort of exploitation, and much else from Google and the whole of Facebook's empire, needs to be banned to protect the majority of humans who don't understand what is happening.
Many of the thousands of nude images in a popular dataset designed to train machine-learning models to create AI-generated adult content were taken from porn production companies accused of sexual abuse. The images, reviewed by Vice's Samantha Cole, come from Czech Casting and Girls Do Porn – companies that have been associated …
You would get a similar reaction in the UK, but it would be tea, not coffee. If you are really lucky you might get a crime number for 'stuff'.
Given the high ratio of cams:people in the UK, I suspect it would be doable there too. If it isn't already.
As a side note, Telegram is used all over the world and is owned by two Russian brothers who live in Switzerland, so it is not strictly a Russian messaging service. The Russian government is apparently not that happy with it, because of the encryption and the lack of rear entry.
"As the faces and bodies are based on real data, it's possible that the deepfakes could resemble a human enough that people mistakenly believe it's someone they've seen in real life."
You know when you see someone (IRL) that you think you know but in reality don't? No need for deepfakes for that, human phenotypes are not unlimited and two unrelated people might look alike.
And, even if someone does look like the deepfake you were busy shuffling over the night before, that doesn't necessarily put them at risk.
The kind of people who would equate being in a porn film with giving permission for sexual assault are hopefully few and far between, and likely don't care who their victims are anyway.
I actually have found porn of a married couple that I know. They're a bit younger in the video than they were when I met them, so I'm thinking they made it when they were in college. It looks like it was taken with a webcam. They're both teachers now. I sometimes wonder how many of their students have stumbled across it.
I went to college with my doppelgänger. It was rather eerie meeting him. The main way people could tell us apart was I had a goatee at the time (I guess I'm the evil twin?) and I have a much deeper voice.
I was once stopped by a couple of girls that insisted I knew them in high school. Couldn't convince them otherwise - wrong county, much less school. Even showing my driver's license didn't help, they thought it was a fake ID for buying booze. (Never mind that it said I was underage...)
When I moved to Atlanta in late 1993, I went into several different restaurants downtown, in Alpharetta, in Duluth, and in Sandy Springs (which are all over the map) and had staff tell me they didn't expect to see me, or ask how I had been doing since _____. As near as I could tell, I looked like three other people in Atlanta*.
* - and fortunately, they all seemed to be non-serial-killer types...
You're training an AI algorithm to recognise pr0n pics. You show it a load of pr0n pics.
END OF BLOODY STORY.
Where those pr0n pics came from is of no interest, and nor should it be. You must have a wide selection of realistic and genuine data for training. Somehow "cleaning" the data according to some holier-than-thou ruling merely serves to bias the dataset, with inevitable undesirable effects on the resulting algorithm.
You may not like the original source of the data, but it already exists to be used. If you want your AI algorithm to recognise that sort of picture, it has to be trained on it.
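For what it's worth, the "cleaning biases the dataset" claim is easy to demonstrate with a toy sketch (hypothetical numbers and labels, nothing to do with any real dataset): filtering a training set by a non-random criterion, such as dropping everything from one source, shifts the class balance the model ends up learning from.

```python
# Toy illustration: curating a dataset by source shifts its class balance.
# All numbers and source names here are made up for the example.
from collections import Counter

# (label, source) pairs -- "source" is the attribute a curation rule filters on
dataset = (
    [("explicit", "studio_a")] * 400
    + [("explicit", "studio_b")] * 600
    + [("benign", "stock")] * 1000
)

def class_balance(rows):
    """Return each label's share of the dataset."""
    counts = Counter(label for label, _ in rows)
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()}

print(class_balance(dataset))   # starts at an even 50/50 split

# "Clean" the data by dropping everything from one source
cleaned = [row for row in dataset if row[1] != "studio_b"]
print(class_balance(cleaned))   # the filtered class shrinks to ~29%
```

Whether that shift matters depends on the model and task, but the mechanism itself is uncontroversial: non-random removal changes the distribution a classifier is trained on.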
“You must have a wide selection of realistic and genuine data for training.” Even if that data has been obtained or created illegally, through rape, trafficking, etc.?
So you are saying it is OK to create a product and make money from rape, trafficking, etc., using data you have no right to.
Also, how are they obtaining this data? Just scraping it without the sites' permission? Stealing it.
Or buying it from the sites, funding rape and trafficking.
The problem with using data from dubious sources, especially when it has been obtained in bad faith & with questionable consent, is that it gives incentive to the continuation of said practices. It's far easier to harvest data if you have a complete disregard for people's rights.
The only way to guarantee that data harvesters follow good practice is to make data obtained through questionable practices untouchable.
If you use images of coerced and raped women to train your AI to make deepfake pr0n, then supply that as AI-generated pr0n saying no humans were harmed in its creation, you're not providing "meat-free" pr0n and arguably you're surreptitiously training people to get off on images of coerced and raped women.
Hmmm, I can even see a more disturbing future there. As we all know here, "some" are easily persuaded to believe that some net-propagated opinion/behaviour is the new normal/truth. Deep fakes are limited only by the imagination of the "creator", and can be applied to cater for any individual, highly personal preference. And (IMHO) that's fine; everybody is entitled to her/his own poison. But... the adult industry is there to make money, and the top-shelf days are over, as my granddaughter can show you in 3 mins. So what if this very personal view is rolled out widely because there is money to be made, and becomes regarded as the new normal? After all, deep fakes don't complain about rape/amputating a ball and letting you eat it/any "alternative fact" you can come up with yourself...
All of this will make it easier for Google to train its algorithms to sort through your albums when you're looking for photos from events like Christmas parties or Thanksgiving, or for particular people or objects... that is, to track your location and offline contacts in order to show you more-relevant ads for things you bought two months ago.
No thanks, I will pass on that.
Every website I visit is passed to Google, even when I use a different browser.
G Analytics is everywhere; it's like some sort of virus just slurping the data up.
I now have blocks on Google URLs, but they keep changing them (hardly a reputable action), and none of the GDPR cookie consents EVER turn GA off.
Frankly, Google can stick its attempt to make me work for them where the sun doesn't shine, and the same goes for the new spam: surveys.
If you want my time and opinion as a commercial outfit, you pay for it. Otherwise I think you should pay for the illegal use of my personal data and my time for (a) zapping your request, (b) setting up a filter that ensures that any further shit doesn't get far and (c) adding the outfit to my greylist, a list of companies that I have to do further benefit/risk analysis before I deal with them again. NONE, and by that I literally mean ZERO of the outfits that sent me a survey ever asked permission for that when they collected my information (and I know this because I read what I give permission for), yet they expect free use of my personal time and computing resources. I now tend to give out temporary addresses so I can track who has been leaking my data to others (as that also happens).
Oh, and no, I won't unsubscribe. That merely confirms someone uses that address, and thus ups its value for resale. Remember, you're spamming, and unsubscribing from spam is one of the surest ways to ensure the floodgates truly open. You should not have sent it in the first place.
Dammit, I blew my weekly rant quota on this. Grr.
Beer. I need beer.
Biting the hand that feeds IT © 1998–2022