Deepfakes being used in 'sextortion' scams, FBI warns

Miscreants are using AI to create faked images of a sexual nature, which they then employ in sextortion schemes. Scams of this sort used to see crims steal intimate images – or convince victims to share them – before demanding payments to prevent their wide release. But scammers are now accessing publicly available and …

  1. JimboSmith

    This reinforces the reasons I have no photos of myself anywhere that I know of on the net. I will have a job on my hands to convince my younger relatives (nephews & niece) that they should be extremely careful.

    1. Anonymous Coward

      FBI: Deepfakes are being used in 'sextortion' scams.

      Citizens: Used by whom?

      FBI: By us.

  2. Neil Barnes Silver badge
    Holmes

    Pictures or it didn't happen...

    is rapidly turning into 'if you didn't see me in person it didn't happen'.

    Anyone deepfaking playmobil scenes yet?

    1. Anonymous Coward

      Re: Pictures or it didn't happen...

      "It wasn't me."

      --Shaggy

  3. Anonymous Coward

    Does this not also work the other way?

    If someone is threatened with releasing pictures, they can just say "Go for it, I'll tell people they're deepfakes and not me". It also stands to reason that if someone can threaten you with a deepfake, then anyone who can get hold of your picture can make the same deepfake, so it instantly loses any value.

  4. Scott L. Burson

    Actually, this has an upside

    Now when your actual nude photos get out, you can claim they're fakes.

    1. KittenHuffer Silver badge

      Re: Actually, this has an upside

      Unfortunately it also means that every scummy politician* will be able to use the same excuse when videos of previous events emerge that would hurt their future prospects. I'm sure there are a number who wish this technology had been available much earlier in their careers.

      * I tried to think of other classes of people (celebrities, business people, sports people, etc.) that I might list. But politicians stood head and shoulders above anyone else I could think of.

      1. YetAnotherLocksmith

        Re: Actually, this has an upside

        They already are.

        And Elon has already tried to use "it could be a deepfake" as a defence in court – despite having said it on stage in front of thousands of people, more than once, in 2016 and 2017! (His claims that Tesla would be fully self-driving within the year, over which he's now being sued.)

    2. Nifty

      Re: Actually, this has an upside

      "when your actual nude photos get out, you can claim they're fakes"

      Nah I'll claim my body really is that good.

  5. Pascal Monett Silver badge
    Trollface

    Not a deepfake

    "Such concerns surfaced again this week when a deepfake video aired on Russian TV purportedly depicted Russian President Vladimir Putin declaring martial law against the backdrop of the country's ongoing illegal invasion of Ukraine."

    More of a forecast, actually.

  6. trevorde Silver badge

    This excuse is already old

    Musk tried to wriggle out of Autopilot grilling by claiming past boasts may be deepfakes

    https://www.theregister.com/2023/04/27/musk_autopilot_death_deepfake/

  7. LybsterRoy Silver badge

    If someone approached me with the offer of sharing deepfake naughty pics of myself, I'd ask if I could have a bigger dick!

    1. Jedit Silver badge
      Alert

      "I'd ask if I could have a bigger dick!"

      I imagine there has already been at least one situation where a modestly endowed man has been blackmailed with a deepfake of him as a well hung porn star. So the only way he can get out from under the shame of appearing in a porn movie is to admit to, shall we say, only having had a small part.

      1. parlei

        Re: "I'd ask if I could have a bigger dick!"

        Easier: have a known prior partner simply state: "Nope, not him, I would have remembered" (regardless of whether one sees a 99th-percentile phallus as a desirable characteristic or not).

    2. Anonymous Coward

      I'm the opposite: I don't want the added attention. My nickname at school was tripod.

  8. John69

    Is this any different from photoshop?

    There was a time when, if you saw a photo, you could assume what it depicted had actually happened. Then Photoshop came along and there was a short period when some people could be fooled by a photoshopped image. Then everyone learned, and Photoshop became an everyday tool.

    I see no reason to think deepfakes will be any different. Once we learn we cannot trust video, this will be no different to photoshopping someone's head into a sex scene.

    I do not believe there is much future for freely available AI image detection software. The nature of generative adversarial networks means people will just incorporate whatever detection tool into the learning algorithm.
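    The dynamic described here – folding the detector into the generator's training objective – can be sketched in miniature. This is a toy illustration only; the one-parameter "generator", the hill-climbing loop, and the distance-based "detector" are invented stand-ins, not any real deepfake or detection pipeline:

    ```python
    import random

    def detector(x):
        # Stand-in "fake detector": a higher score means "looks fake".
        # Here fakeness is simply distance from the real-data value 1.0.
        return abs(x - 1.0)

    def train_generator(steps=500, seed=0):
        # Toy generator: a single parameter nudged by random hill climbing
        # to minimise the detector's score. The detector itself is the loss
        # function -- the essence of training a GAN generator against its
        # critic, and of why a published detector becomes a training signal.
        rng = random.Random(seed)
        x = 5.0  # start far from anything the detector calls "real"
        for _ in range(steps):
            candidate = x + rng.uniform(-0.1, 0.1)
            if detector(candidate) < detector(x):
                x = candidate
        return x

    fake = train_generator()
    print(detector(fake))  # near zero: the detector no longer flags it
    ```

    The point of the sketch: any detector whose score the forger can query becomes, by construction, a loss function to optimise against.
    
    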

    1. Metro-Gnome

      Re: Is this any different from photoshop?

      It was a thing well before Photoshop. Newspapers have been running edited or 'untrue' photos since the Cottingley Fairies and before.

      Now the pictures may move and be in colour but not in any of our lifetimes have we been fully able to trust a picture or video we didn't know the provenance of.

      1. Anonymous Coward

        Re: Is this any different from photoshop?

        Wait a second. Do you mean to tell me they didn't find a London bus on the moon?

    2. bo111

      Re: Is this any different from photoshop?

      Yes, it is orders of magnitude different by scale. It is similar to the huge growth of digital photography after the first iPhone. Besides, making realistic fakes in Photoshop is very hard and slow.

  9. jim.warwick

    Detection is futile...

    The idea that we can rely on AI systems to detect deepfakes seems to ignore the fact that the very reason GANs work so well is that the generator is trained against an "adversarial" network until that network can't spot the difference. All we'll achieve is ever more perfect deepfakes that are practically impossible to detect.

    It also seems ironic that the go-to application of any new on-line technology - porn - is probably training our future overlords. At least they'll know how to have a good time.

    Hmmm...

    Jim

  10. Helcat Silver badge

    Unless it's verified, don't believe that it's true.

    There have been some very convincing photoshopped pictures over the years: that's been enough to question the validity of anything that isn't verified. And even then, verified doesn't mean it's real, either. Just that someone has stepped up to say it's real, so you've someone to blame if it's not.

  11. find users who cut cat tail

    Can't stop it

    So, what to do instead? Generate vast quantities of such images. For everyone. For non-existent AI-generated lookalikes. For whoever and whatever you can in all possible variations. Make them easily available. Let everyone know.

    Then they become banal and useless for extortion. Of course such pictures exist. Of course they are fake. Who cares?

    Sure, currently there would be legal problems, and with minors this is not likely to be an option, etc. But in general, the ease with which these images are generated is not the problem; it is the solution.
