Smut slinger dreams of AI software to create hardcore flicks with your face – plus other machine-learning news

It's a long weekend in England and Wales, with many Reg vultures taking time out and making the most of what's left of the quiet August month. We haven't forgotten you, though, so here's a roundup of AI-software-related tidbits. Oh dear, a porn company wants to monetize the deepfakes craze: Hey, remember …

  1. ThatOne Silver badge
    Coat

    Oh yes!

    > The service then spits out a realistic-looking saucy scene with you pasted into it.

    ...so you can blackmail your boss with that video of him and those women-who-are-not-his-spouse...

    .

    > Taking fewer 2D slices, and using an AI to fill in the gaps, will reduce the time taken in an MRI machine.

    Giving my MRI scans to Facebook? I'm not even going there, but taking fewer real slices will most likely allow replacing that real tumor with some extrapolated healthy tissue. Who wouldn't want that? Less definition is always cheaper - I mean better.

    .

    > What if El Reg articles could be effectively translated without having to resort to Google Translate, and thus millions of people from non-English-speaking countries showed up. Just imagine the comments.

    Like "The rubberized minuet is galloping around my humanist carburetor"? Yes, that would indeed be entertaining.

    1. Anonymous Coward
      Facepalm

      Re: Oh yes!

      Don't forget AI making up tumors that aren't even there...

      And no, uploading pictures of your tumor to Facebook to see how many 'likes' you can get is the last thing I'd ever want to do!

    2. Warm Braw Silver badge

      Replace that real tumor by some extrapolated healthy tissue

      I hope they don't get their technologies confused. The last thing* anyone wants is to take home a copy of their MRI to find it's a video of their diseased pancreas banging a pneumatic porn star.

      *There will be Rule 34 exceptions.

    3. Korev Silver badge
      Big Brother

      Re: Oh yes!

      >> Taking fewer 2D slices, and using an AI to fill in the gaps, will reduce the time taken in an MRI machine.

      >Giving my MRI scans to Facebook? I'm not even going there, but taking less real slices will most likely allow to replace that real tumor by some extrapolated healthy tissue. Who wouldn't want that. Less definition is always cheaper - I mean better.

      One problem that medical science has is that we know a lot about disease, but not about what causes disease or what it looks like before it's detectable (obviously there are exceptions like smoking causing cancer etc). FB etc have a huge amount of data on 10-15 years of people's lives and it'd be illuminating if they could somehow link it with medical records.

      That ignores the fact that no one in their right mind would want to give FB, Google etc. any more data to mine and sell. I wish there was some way that this kind of analysis could be done without destroying privacy.

    4. Anonymous Coward
      Anonymous Coward

      Oh No

      Police are getting trained in spotting DeepFakes. Some people have tried using the software to create fake video evidence of assaults, then providing the fake evidence to police in an attempt to frame innocent people.

      1. veti Silver badge

        Re: Oh No

        Any amount of training will only be as good as the current generation of processes. Within 18 months the next generation will be available, and it'll be able to fool whatever training you can get today. Within 5 years, there will be no reliable way even for an expert to distinguish between real and faked video.

        Welcome to the future.

    5. Ian Michael Gumby
      Boffin

      Oh No! not Re: Oh yes!

      ...so you can blackmail your boss with that video of him and those women-who-are-not-his-spouse...

      Not really.

      Wives aren't that dumb and even if they suspected the hubby of philandering around town... they are not going to believe their eyes. Can you say size discrepancy?

      At the same time, you'll be asked to prove a date/time and you can bet unless you know your boss' schedule, he'll have an alibi. Then you'll be carted off to jail and the least of your worries will be the loss of your job.

      At the same time... I can see the potential for psychological issues that this may create. Body dysphoria?

      (sp? or dysmorphia)

      But this reminds me of the Futurama episode w Lucy Liu and the sex robots. ;-) ( Both deal with you trying to live out your sexual fantasy...)

      1. Neil Barnes Silver badge
        Big Brother

        Re: Oh No! not Oh yes!

        Indeed.

        Partner has wandering eyes? Possibly.

        Partner does something about it? Possibly.

        Partner does it with a highly attractive person? Possibly.

        Partner changes physique? Unlikely.

        Partner does all of the above, in a professionally lit multi camera studio? Pull the other one...

  2. VikiAi
    Trollface

    Does the first part of this article count as "deep-fake news"?

    1. Teiwaz

      Does the first part of this article count as "deep-fake news"?

      Deep-Throat-Fake news.

  3. jmch Silver badge

    Porn

    Has always been at the forefront of Internet technology, being an early adopter / driver for credit card payment technology, image- and video-rich content etc. So no surprises there. I suppose the ultimate endpoint is passing through virtual or augmented reality headsets to feeding porn signals direct to your brain via electrodes, a la Demolition Man.

    Regarding the comment about blackmail possibility, this actually means less blackmail opportunity rather than more, since as knowledge of the tech becomes widespread it will be easier for potential blackmail victims to simply say it's fake (even if it's actually real).

    1. DCFusor

      Re: Porn

      That rates an upvote for "insightful", jmch. Most people don't think things through. Especially if that would hurt whatever other agenda they want to use spin to push.

    2. Neil Barnes Silver badge

      Re: Porn

      Porn has been at the front of an awful lot of technology... why do you think the dreadful VHS format was chosen over two technically superior formats?

      And... photography became practical around 1846. 3-d photography, same year. 3-d images of naked ladies... same year.

      For some reason, humans have a deep and abiding interest in sex. Can't think why, it's not as if it's a survival characteristic or something... oh, wait...

  4. Anonymous Coward
    Anonymous Coward

    Digital wallflower

    Just great, even my avatar gets to have a hedonistic rock star lifestyle. While I just get to sit here as a mopey commentard.

  5. Anonymous Coward
    Anonymous Coward

    PEO

    What a great name for the PEO (Porn Executive Officer), Andreas Hornopoulos :)

  6. Anonymous Coward
    Paris Hilton

    Porn with my face == instant droop.

    Paris. Just because.

  7. Teiwaz

    Well, now porn biz Naughty America wants to sell this sort of caper to horny netizens.

    They're packaging the wrong product.

    If the first adopters wanted fake celeb sex vids, that's what the rest will want. If there was a market for 'paste your own cum-face', that'd be the first thing it was abused for.

    When pornographic photography was invented, the first [ab]use of it was selling piccies of other people doing it, not setting up photo salons so you could have piccies of yourself 'doing it'.

    Before photographs, 'illicit' illustrations featured the country's lords, ladies and leaders 'doing it'.

    A better seller may well be paying celebs for their footage and offering to sell your face 'dubbed' onto the partner of the 'celeb of your choice'. Not that many celebs would go for it. But there's always look-a-likes.

    I'm not saying some won't jump for the current offer, but I don't think it's a big seller.

    1. Ian Michael Gumby
      Boffin

      @Teiwaz

      I think that with celebrity porn, you have a major legal liability if the vids got out into the public.

      But guys wanting to fsck their fav porn star? Sure. It makes for an interesting legal contract going forward.

      Imagine a new line for Stormy Daniels and her lawyer trying to negotiate a per-version payout instead of per-performance. (1 sex scene == hundreds of different customers, so she would now be paid 100 times for the scene rather than just once.)

      Note: I chose Stormy because she's an alleged porn star / dancer whose name everyone would recognize... could have gone with a crossover starlet like Sasha Grey? (Is that the correct name / spelling?)

  8. Crazy Operations Guy

    MRIs

    "slice by slice of 2D images". I know I am being pedantic, but the MRI machine does not make 2-D images. Rather, it produces 1-dimensional lines that are arrayed onto a radial plot by the attached computer. (I learned this when I reverse-engineered the protocol, since the attached computer was no longer usable and the hospital didn't have the $100k to buy a new one. Turned out it was generic serial, but over a proprietary port.)

    But, really, the best thing that AI could be used for would be to correct for organ movement throughout the scan to produce a better image. It's quite hard to get a good image of the heart when it is busy doing its thing, and trying to find tumors on a lung that is constantly moving is even harder.

    The second best would probably be to establish a program where you'd get a full-body scan during a yearly physical and then the AI compares the scan data over time to identify anything that appears to be growing or shrinking that shouldn't be doing so.

    The second worst idea would be to let an advertising company get anywhere near such data (I consider Facebook and Google ad-slingers since that is where their money comes from).

    The worst is trying to 'fill in' gaps in images. The point of taking MRI images is to detect those tiny anomalies, and now they want to use AI to throw in data that it 'thinks' should be there. If your training data is healthy bodies, congratulations, you are now going to see a suspiciously high number of clean scans.

  9. Eduard Coli

    Just say no

    "NYU has amassed about three million MRI images of knees, brains and livers to train convolutional neural networks built by Facebook."

    Facebook is the last service anyone would want handling sensitive health data.

    1. Crazy Operations Guy

      Re: Just say no

      I'd expand that to handling any personal data at all.

  10. JeffyPoooh
    Pint

    "That's obviously not me..."

    "...If it was me, then you'd be needing a much wider aspect ratio. Just sayin'..."

  11. Anonymous Coward
    Anonymous Coward

    sorry but i cant masturbate to this.

    I'll try, of course, but i'm warning you beforehand.
