This reinforces the reasons I have no photos of myself anywhere that I know of on the net. I will have a job on my hands to convince my younger relatives (nephews & niece) that they should be extremely careful.
Deepfakes being used in 'sextortion' scams, FBI warns
Miscreants are using AI to create faked images of a sexual nature, which they then employ in sextortion schemes. Scams of this sort used to see crims steal intimate images – or convince victims to share them – before demanding payments to prevent their wide release. But scammers are now accessing publicly available and …
COMMENTS
-
Thursday 8th June 2023 06:51 GMT Anonymous Coward
Does this not also work the other way?
If someone is threatened with the release of pictures, they can just say "Go for it, I'll just tell people they're deepfakes and not me". It also stands to reason that if someone is threatening you with a deepfake, then anyone who can get hold of the source picture can make the same deepfake, so it instantly loses any value.
-
Thursday 8th June 2023 07:26 GMT KittenHuffer
Re: Actually, this has an upside
Unfortunately it also means that every scummy politician* will be able to use the same excuse when videos of past events emerge that would hurt their future prospects. I'm sure there are a number who wish this technology had been available much earlier in their careers.
* I tried to think of other classes of people (celebrities, business people, sports people, etc.) that I might list. But politicians stood head and shoulders above anyone else I could think of.
-
Thursday 8th June 2023 10:05 GMT YetAnotherLocksmith
Re: Actually, this has an upside
They already are.
And Elon has tried to use "it could be a deepfake" as a defence in court - despite having said it on stage in front of thousands of people, more than once, in 2016 & 2017! (His claims that Teslas would be fully self-driving within the year, which he's now being sued over.)
-
Thursday 8th June 2023 13:35 GMT Jedit
"I'd ask if I could have a bigger dick!"
I imagine there has already been at least one situation where a modestly endowed man has been blackmailed with a deepfake of him as a well-hung porn star. So the only way he can get out from under the shame of appearing in a porn movie is to admit to, shall we say, only having had a small part.
-
Thursday 8th June 2023 11:32 GMT John69
Is this any different from photoshop?
There was a time when, if you saw a photo, what was depicted must have happened. Then photoshop came along and there was a short time when some people could be fooled by a photoshopped image. Then everyone learned and photoshop became an everyday tool.
I see no reason to think deepfakes will be any different. Once we learn we cannot trust video, this will be no different to photoshopping someone's head into a sex scene.
I do not believe there is much future for freely available AI image detection software. The nature of generative adversarial networks means people will just incorporate whatever detection tool into the learning algorithm.
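To make the point concrete, here's a rough, hypothetical sketch (toy PyTorch code, with tiny made-up networks standing in for a real generator and a published detector): once a detector's weights are out there, it just becomes another term in the generator's training loss.

# Hypothetical illustration only: a toy generator trained to fool a frozen,
# publicly released "deepfake detector". Both networks are tiny stand-ins;
# a real setup would also need a realism loss against real data. This only
# shows the "beat the detector" term.
import torch
import torch.nn as nn

latent_dim, img_dim, batch_size = 16, 64, 32
generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, img_dim))
detector = nn.Sequential(nn.Linear(img_dim, 128), nn.ReLU(), nn.Linear(128, 1))  # outputs a "fake" logit

detector.eval()
for p in detector.parameters():
    p.requires_grad_(False)  # the detector is frozen; only the generator learns

opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    z = torch.randn(batch_size, latent_dim)
    fake = generator(z)
    logit = detector(fake)                      # the detector's "this looks fake" score
    loss = bce(logit, torch.zeros_like(logit))  # push the output towards "real" (label 0)
    opt.zero_grad()
    loss.backward()
    opt.step()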
-
Thursday 8th June 2023 12:27 GMT Metro-Gnome
Re: Is this any different from photoshop?
It has been a thing since well before photoshop. Newspapers have been printing edited or 'untrue' photos since the Cottingley Fairies and before.
Now the pictures may move and be in colour, but at no point in any of our lifetimes have we been able to fully trust a picture or video whose provenance we didn't know.
-
Thursday 8th June 2023 12:03 GMT jim.warwick
Detection is futile...
The idea that we can rely on AI systems to detect deepfakes seems to ignore the fact that the very reason GANs work so well is that they are trained against an "adversarial" network until it can't spot the difference. All we'll achieve is ever more convincing deepfakes that are practically impossible to detect.
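For anyone who hasn't seen one, the core training loop looks roughly like this (an illustrative toy sketch in PyTorch, with random noise standing in for real images): the discriminator is a fake-detector that's part of training from day one, and the generator only stops improving when that detector can no longer tell the difference.

# Illustrative toy GAN loop only; real data and real architectures omitted.
import torch
import torch.nn as nn

latent_dim, data_dim, batch_size = 8, 32, 64
G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))  # real-vs-fake logit

opt_G = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_D = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(1000):
    real = torch.randn(batch_size, data_dim)           # placeholder for real images
    fake = G(torch.randn(batch_size, latent_dim))

    # 1) Train the discriminator (the built-in detector) to tell real from fake.
    d_loss = bce(D(real), torch.ones(batch_size, 1)) + \
             bce(D(fake.detach()), torch.zeros(batch_size, 1))
    opt_D.zero_grad()
    d_loss.backward()
    opt_D.step()

    # 2) Train the generator to fool that same discriminator.
    g_loss = bce(D(fake), torch.ones(batch_size, 1))
    opt_G.zero_grad()
    g_loss.backward()
    opt_G.step()

Any external detector trained along the same lines is, in effect, just another D for the next generator to train against.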
It also seems ironic that the go-to application of any new on-line technology - porn - is probably training our future overlords. At least they'll know how to have a good time.
Hmmm...
Jim
-
Thursday 8th June 2023 13:50 GMT Helcat
Unless it's verified, don't believe that it's true.
There have been some very convincing photoshopped pictures over the years: that's been enough to make people question the validity of anything that isn't verified. And even then, verified doesn't mean it's real, either. Just that someone has stepped up to say it's real, so you've someone to blame if it's not.
-
Thursday 8th June 2023 16:26 GMT find users who cut cat tail
Can't stop it
So, what to do instead? Generate vast quantities of such images. For everyone. For non-existent AI-generated lookalikes. For whoever and whatever you can in all possible variations. Make them easily available. Let everyone know.
Then they become banal and useless for extortion. Of course such pictures exist. Of course they are fake. Who cares?
Sure, currently there would be legal problems, and with minors this is not likely to be an option, etc. But generally, the ease with which they are generated is not the problem; it is the solution.