Irony of ironies
... at least that's what my wife calls it.
From the department of closing the barn door after the horse has warped away at light speed, comes this latest news. Although the creators of DeepNude have torn down their slimy software that removes clothes from women in photos, the code continues to spread all over the internet. Some devs are even trying to reverse-engineer …
Some ideas that will definitely crop up if they have not been built already:
Chatbots that emulate children for the enjoyment of sexual predators. (These already exist in the hands of some law enforcement groups to identify and trap such predators.)
Deepnude+deepfake videos = Porn films with the subjects of your choice.
Text story to Screen Play / Screen Direction conversion using AI
Screen direction / Screen Play to video /animation with 3D imaging using AI
Add force feedback
Add them all together and you will have the full Star Trek Holodeck experience.
It is inevitable that this will be used for any and all sorts of perversions along with entertainment, training and every other useful and culturally appropriate use.
There will be grumbling and crying and gnashing of teeth but that is not going to stop the torrent of new ideas and technologies that will explode to satisfy any and all perversions.
That will all be "Empowering", "smashing the patriarchy", "striking a blow for the sisterhood", "showing these MEN for the ABUSERS they ALL are", etc.
Funny how sexism only works one way in the minds of most feminists. Ditto the attitude that white people can't be victims of racism and racist violence, or that working-class white men living below the poverty line and with no opportunities still get treated as having "privilege" due to the colour of their skin. Apply that argument to any other ethnicity and watch the "twatterati" explode in fury and go into full lynch-mob "ruin their life" mode.
Feminism is ipso facto sexist.
Just as the Association of Black Lawyers is racist.
Once you start discriminating on the basis of color or gender, you are a racist or a sexist.
Even if your discrimination is meant to reverse someone else's it still perpetuates the agenda.
The correct answer to any of these images is "I wish that <insert part> looked like that", "yeah, like that looks real", or "big whoop, nurses and doctors have all seen it, and half the population has those parts".
I get that some people would find it embarrassing, but trust me, folks: after urological surgery under local anaesthesia, and knowing that a LARGE group of people (males and females of all ages and appearances) have seen your genitals, and even more have seen them in a post-surgery and borderline mutilated state (bruised, stitched-up, oozing wounds, post-surgery infection - yeah, that was "fun"... not) during numerous checkups... the "oh my god, folk have seen me naked" loses its effect on you, and rapidly so, to the point I would just mentally add them to the lengthy list of folk who've seen my "private parts". And particularly when you're told they found pre-cancerous lesions, which they removed, it gets to be "meh".
Reminds me of when I had a retrograde IVP. An IVP is when you are given a dye orally so they can x-ray it as it goes through the urinary tract. A retrograde is when the dye goes in from the other side via a catheter. I was given Valium IV. I said to the doctor, "but that's not a painkiller". He said, "you are right, you just won't care any more". He was right. I could have walked out in the middle of the Super Bowl stark naked and I wouldn't have cared.
Of course once the Valium wore off I cared a lot. Tylenol Five didn't touch the pain.
LMAO! Just had colon surgery last week - had a cancer mass removed - found out a friend's mother saw my ass on the operating table... And I, for one, do not give a fuck, cancer levels all playing fields...
Edited to say - yeah, doc blessed me with 12 TicTacs (hydrocodone FIVES!) - understand the reasoning due to opioid addiction and effects on the bowels, but dayum! This thing HURT!
Anyone who wants to see ANY of the following politicians in the nude (simulated or otherwise) needs help or to be sectioned permanently.....
Donald Trump, Angela Merkel, Kim Jong Un, Scott Morrison, Anne Widdicombe, Theresa May, Viktor Orban, Xi Jinping (I wonder if he is in fact Winnie the Pooh with a human suit on), Mitch McConnell, Nicola Sturgeon, Ruth Davidson, Recep Tayyip Erdogan, Diane Abbott......and the rest....
(sorry for the nightmare images that thinking of that lot nude has probably caused, AC so I can't be pursued for the psychological therapy costs....)
Politicians - well, they do make a habit of having things to hide, so, anything that chips away at that, and reminds them that they are human like rest of us, and not a different class/species.
> retraining the thing to do the same to men of all ages, with the aim of being able to embarrass and undermine prominent politicians
Most of our politicians are already huge dicks....
Anyway, this idea is not new, artists have combined lampooning the establishment and porn before.
https://commons.wikimedia.org/wiki/File:18th_Century_pornographic_cartoon._Marie_Antoinette_and_the_great_French_General_and_politician_Lafayette.jpg
> retraining the thing to do the same to men of all ages, with the aim of being able to embarrass and undermine prominent politicians
Alas, there is no readily available training set of "middle aged men in suits, clothed/unclothed, same pose", whilst the "attractive 20-30s woman, clothed/unclothed, same pose" is almost limitless.
I've often thought technology can be used to prevent harm by giving people legal outlets for their desires, even if the act would be illegal with a real person.
It somewhat has been, but since it has also been used to cause harm, and generates the fear that it might be in ways yet unimagined, some groups want to ban its use as an outlet (or any outlet), pretend that bottling up is not harmful, and treat anyone who transgresses as an evil pervert who should be shunned.
The permissive society does not permit.
Red light on headset signifies disrobing processing underway.
To be fair, it's been advanced as a reliable method to reduce stagefright and performance nerves for decades at least (although maybe not if addressing Parliament - it might exacerbate fear in this case).
Might be argued to be a form of therapy.
> Deepnude+deepfake videos = Porn films with the subjects of your choice.
Content producers have been speculating/investigating this idea since deepfake emerged. Their aim has been to be able to provide custom videos for paying clients where the client is substituted into the video. This seems an obvious extension to their existing business model of producing custom content for clients willing to foot the bill.
With this hacked version I just downloaded, you can even make an image of an Astute class submarine with naked boobies, and not just a mere nipple or two. But actually it's just for a test of a new acoustic absorber technology, and isn't prurient at all. Honest.
Am I the only one who finds all the pearl clutching, tutting and harrumphing about these sorts of things a little tedious? Let's face it: an AI can now do automatically what used to take a skilled photoshoppe artisan hours to do by hand!
Sure, I can understand finding a photo of yourself neekid and on the interwebs more than a little disturbing. But surely at some stage we collectively need to say "Meh; I *wish* my boobs/dick (delete as approp) were that big".
Wake me up when deep fake/nude can be run in real time and the headline is...
Twitter thot making millions from pay piggies turns out to be a 120KG trucker from Birmingham called Dave. *THEN* you will have a headline :)
We as a species need to become a heluva lot more cynical about the things we see via photos & video
> We as a species need to become a heluva lot more cynical about the things we see via photos & video
You got a point, and I would even go further... We as a species are acting all dignified, PC, cultural, and sophisticated, but actually are still driven by very basic things. Drop online that a nude pic or sex tape of <fill in your favourite yummy person> has hit the interwebz, and you got a viral webtraffic spike on your hands. Let's make a bold statement here:
Most of the sweeping, fast developing advances in IT were driven by... sex and porn.
Digital images?
Digital video (streaming)?
Video on demand?
GPU developments?
Online/ digital pay methods?
(Picture posting) bulletin boards?
Video chatting?
...and I'm sure grey bearded commentards here can come up with many, many more...
Maybe I'm a bit too black and white, just for argument's sake, but I have no doubt that, if we look back over the years, we'd recognise that more developments have been dreamt up by Willy Wanka in his Pr0n Factory than we care to admit. So is that a bad thing? Maybe yes. Maybe not. Or maybe we should be more concerned about the fact that we are trying to represent ourselves [still/ more and more]*** as something we are clearly not.
*** Choose if appropriate
"Most of the sweeping, fast developing advances in IT were driven by... sex and porn."
Completely this.
Fundamentally we are all animals. Everyone's most basic needs, just like most animals, rank something like this:
- air
- drink
- food, shelter
- sex
- most everything else
Ranting inanely about the evils of sex ranks WAAAY down the list of human needs but somehow a few hundred years of weird cultural conditioning and we arrive at the puritans, who unfortunately are still among us.
No, you're not the only one. However, it's become fashionable for techies to demonstrate how liberal we all are, and to act shocked that stuff we develop could be used for teenage kicks.
Personally, apart from the fact that a lot of clothing is designed to stimulate the imagination of the viewer, I reckon I'd get pretty bored of a toy like this pretty quickly. A bigger problem is probably the unrealistic expectations that kids may have of themselves and others due to all the cosmetic surgery: gravity defying breasts, dicks that horses could be proud of but with the staying power of a fox, and the kind of athleticism that should only be expected from Olympic gymnasts.
Sex is an act but not a performance.
You're doing it wrong. Hey, a tragedy can still be a performance!
How can you be sure you are doing it wrong?
Or even right...
The most instruction we ever got was rumour and innuendo, followed up by blind fumbling about until we found something that kind of worked*.
TV exposure was always choreographed, porn scripted, the birds and the bees not an instructive guide, and state-education-wise we generally had something little better than the Goodies' gender education.
* personally, I've always had attempts that more closely resembled the clown troupe at a freak show circus...only a couple of nights, no repeat performance, no encore and basic accommodations.
The first good sex I had was with an American girl who said, "That was nice. All the other boys I've been with treat sex as a performance, while you just seemed to enjoy yourself."
I chose to take it as a compliment though I still worried about 'all the other boys'.
Thank god for the Pontifical Catholic University of Rio Grande do Sul. I've been in enough sweat lodges, saunas, and on enough nudist beaches that an app that can put clothes on people on the internet is needed.
> Finds all the pearl clutching, tutting and harrumphing about these sorts of things a little tedious?
Just sad to see sites like el reg that used to be tech sites turn into whatever they are now.
No, this isn't "involuntary porn" and no it doesn't "make naked pictures of people". It just photoshops random parts onto someone's head and frame in a realistic looking way. The "AI" has no way of knowing what size or color a woman's nipples are, or other parts.
The original creator used naked women because that's the easiest picture to find online to train the model for free. You can do the same with cats, dogs, men, trees, whatever you can find source pictures to train the model on.
If there's an app that can make clothed women appear naked, then you can't trust ANY pictures of naked women. All the women who wish they hadn't sent naked selfies to that ex boyfriend, or otherwise have real naked pics of themselves out on the internet they wish weren't out there have plausible deniability that those pics are fake.
Pretty much would end "revenge porn", and hackers getting hold of someone's pics and blackmailing them into sending more. They have no hold on the victims any longer if a naked pic of her can be trivially faked and everyone knows it.
Well, today's parents are worried their teens are sexting, but by the time today's 4-year-olds are 14, maybe 2029's parents of teens will figure there's no point worrying about whether they share real naked pictures, since everyone will be able to download a simple app on their phone to nudify them anyway.
Well OK "true porn" shots of a woman actually having sex are a different matter than being naked, but those are already faked with Photoshop (just ask any female celebrity) and they will get harder and harder to tell from the real thing.
The point is that as all the technologies to fake pictures improve to the point where they are indistinguishable from real pictures, eventually everyone will have no reason to believe any picture that includes someone they know is real whether they are engaging in BDSM or walking their dog topless in a park.
If a woman had a real porn/naked picture of her "in the wild" she would have much less reason to worry if her friends/family/co-workers/neighbors saw it, because there would be much less fear of consequences. You can't get fired from your teaching job when the school board knows the picture can't be proven to be real, etc.
Deny the birthmark, tattoo or mole that can identify you to an intimate partner? The app doesn't know what's under the clothing and imposes pre-determined imagery that, while it may look real on a stranger, would be pretty obviously faked if it were of someone you know intimately...
I have tattoos and moles that my wife would see and know it was me, and vice versa; if a deepnude of me or her appeared, we would instantly know due to the lack of known detail.
That said, I'd also know if it was real .......
something to generate boobs for creatures without boobs. And upright postures. Where was it that once had three million people and 158 million sheep?
I'm sure we'll be faced with deepfaked deepnudes of sheep and people in flagrante delicto
The boy stood on the burning deck
His fleece was white as snow ...
New Zealand, and it was only 70 million sheep. Now it's 4 million folks and only 30 million woolly ovines. This is due in part to a number of lowland sheep farms being converted to dairy, but mainly to breeding and genotyping such that most NZ ewes now have twins routinely. Hence you need half as many breeding animals to maintain production.
There's an old Kiwi joke that if aliens ever observed NZ from orbit they would see a large population of sheep being serviced by bipedal slaves kept in coastal pens, then land and march up to a sheep and say 'take me to your leader'.
The speed of reaction of the sub-culture is impressive. Just a few days and a complicated set of tangled 'problems' has been 'solved'. Why is it I'm worried that this energy, intelligence and enterprise will be used for better poison gases rather than better drugs? Making and spreading fake news will triumph in any war against detecting and quenching fake news. It would be rather nice, though, if a new strain of a virus could be nailed in weeks and the logistics of dealing with a natural disaster could be solved in hours.
> It would be rather nice though if a new strain of a virus could be nailed in weeks and the logistics of dealing with a natural disaster could be solved in hours.
Unfortunately, solving those problems takes more brain power and more time, than simply drawing tits on stuff. And the average or slow manifestly outnumber the smart or intelligent, so the available time balance is the wrong way around.
> The Register poked around a few of these Discord servers. “We are happy to announce that we have the complete and clean version of DeepNude V2 and cracked the software and are making adjustments to improve the program,” announced one we found to its thousands of members.
I love the way they are 'leet hackerz' but still use corporate PR speak "We are happy to announce..." when releasing the new version.
The problem here is that the one being solved has a known solution: what does a naked body look like?
The cure for a new virus is not a known problem, so it needs a different type of AI from one that can be trained to draw cats, nipples or beards.
I really can't see the motivation to use such software. I realise such motivations exist, but is it really anything sexual? I don't need to undress an attractive lady to find her attractive, and I have no possible motivation to do so unless it's my lucky day and she's with me in the flesh, so to speak.
But to write such software is an entirely different kettle of fish. Visualisation is an interesting problem no matter what the subject. I've written scientific visualisation software[1], and it was certainly one of the more inherently worthwhile things I've worked on. As soon as news broke of an app being withdrawn, the intellectual challenge was there, and someone was bound to take it up!
In my first term as a student, one of my courses was Group Theory. I picked up and solved a 'magic cube' that was a practical exercise in the subject - entertaining but in reality peripheral to the course. A year or so later those cubes had hit the shops under the brand name Rubik's Cube. Seems to me much the same kind of intellectual exercise as re-creating this software.
[1] The context being satellite images. The visualisation software helped pick out many things, from seismic activity to pollution incidents to the phase of the tide.
"I don't need to undress an attractive lady to find her attractive, and I have no possible motivation to do so unless it's my lucky day and she's with me in the flesh, so to speak."
Some guys have poor imaginations. Also, some find picture of skin a better turn-on than the imagination.
On one hand I reckon this technology has certainly reduced the amount of effort required to humiliate another person - and possibly even ruin their career. In lots of institutions pointing your superiors to this reg article by way of explanation may well not be enough to assuage their concerns - "no smoke without fire" and all that.
On the other hand, if this sort of thing really takes off (and becomes convincing - I'd be surprised if the output of the above software isn't fairly rubbish) I wonder if it might have the opposite of its intended effect and actually end up devaluing nudes in some general sense. This is an interesting outcome because it raises the possibility of actually making some human progress in terms of defusing some of our neuroses concerning nudity, sexuality etc.

Though I'd be distressed if someone in my office sent an allcompany@ email containing nude pictures of me (fake or otherwise), I can't help but think that ideally this is something I really should be able to shrug off - and maybe this is easier in some future where "nudes" are trivial to produce and omnipresent. I'm not saying that the current victims of this sort of thing don't have a genuine complaint - right now, given the state of human cultural neurosis, the harm is certainly real. Just that if usage of this technology explodes in an uncontrollable way there might be a counter-intuitive silver lining.
The "consent" part is interesting too. Personally I find the claim that it's unacceptable to depict a given person "in the nude" without prior consent, even if said depiction is "fake" and purely for private consumption, a bit strong. I don't think that we should expect to have a right to control all non-public depictions of our likenesses. That just seems crazy to me. I reckon the idea only gets as much play as it does because it supports a sort of stealth puritanism, and if there is something that we humans love as much as other naked humans it's chastising other humans for their interest in said other naked humans!
I see a lot of people agonising over whether or not the real \ fake distinction is meaningful even if the fakes are "as good as the real thing". I think this is a bit of a rabbit hole which the concept of "likeness" gets you out of. For the purposes of evaluating your stake in your likeness, the source of the image doesn't really matter - it's accepted that you have a genuine interest in your likeness even if your ability to exercise control over it is limited.
I think it's fair to require consent before sharing or distributing said materials in public, though (and I'd consider large enough groups to be public, by the way), and if we end up with stronger laws around this stuff I'd feel most comfortable with legislation that targets the sharing \ distributing \ revenge posting part of this. There is probably no technical way of regulating this, of course, but there is REALLY no way of regulating the non-public production and consumption of non-consensual porn, so this seems a sensible compromise to me.
Overall though I think that more puritanical law is probably not what we really need - and I do worry about the effect that new legislation might have in "locking-in" nudity neurosis (both in the sense of our anxiety around nudity and how extra thrilling these images will be if they are made extra-illegal). Some sort of cultural shift would be preferable I think and maybe AI generated imagery starts to make that a possibility. I'm not confident though!
Perhaps the next step will be a system to be able to take someone's photograph (edited or otherwise) and use that to construct a 3D model of the person for use in whatever. Go beyond deepnudes and deepfakes and create deepscenes where you don't have to use existing material apart from creating the model(s).
> I'd feel most comfortable with legislation that targets the sharing \ distributing \ revenge posting part of this
I'm not a lawyer, but I feel what we need are laws against harassment, blackmail, etc., not anything specific about porn (consensual or non-consensual, private or distributed). There are plenty of ways to intimidate or distress people with non-fake material and non-nude material (even a quote or a Twitter comment taken out of context may be enough). And catcalling or racial or sexual abuse can be much more distressing.
Let the laws concentrate on the hard questions: such as to what extent intent is required, and how much distress is illegal, rather than on mechanisms.
Of course, that would require politicians who are willing to put hard work into good laws.
Not even mythological politicians could solve the problem because it's all subjective. One man's insult is another man's praise, and the line between ribbing and harassment is both mobile and different from person to person. That's why defamation suits and the like are so touchy-feely. They're human in nature, so they can't help but be touchy-feely.
Before I sent in a correction, the article read as if it substitutes the clothes for the naughty bits instead of the other way around. But I'm thinking that could also be useful - for example, when you've accidentally wandered on to the part of the beach frequented by middle-aged nudists, a pair of AI holo-lenses that overlaid some bikinis before you copped an eyeful would be most welcome!
With all due respect to Andrew Ng, I don't think this is "one of the most disgusting applications of AI". It might be disgusting, OK, and also stupid, infantile, perverted and so on, but personally I find, by several orders of magnitude, more disgusting - and also dangerous - those criminal activities performed by some anti-social networks and Internet round-up engines which use AI to profile the entire earth's population and trade those data. They also use the data to control and manipulate elections and political issues, and they bring instabilities (political, economical, etc) throughout the world. Because of their round-ups, people also remain without medical insurance (because they or their family members are profiled as ill, or genetically predisposed to a serious illness, and therefore not profitable for insurance companies), without a house loan, without a job... People *DIE* because of those digital round-ups! And not because of fake nudity. And that's what we need to condemn and stop, in the first place! Right now!
Actually I beg to differ. Based on the divergence of head lice from body lice and other evidence it is generally accepted that the early ancestors of H.sapiens sapiens began wearing clothes about 83K-170K years ago. See https://www.quora.com/Why-did-humans-start-covering-their-private-parts-and-when
Also relevant: the effect of clothes on sperm counts is measurable.
Random: I give it about a month before someone adds the "Fundie Filter" (tm) to Firefox/Chrome/IE/Opera/etc to automagically add clothes to "inappropriate" images online.