A plague on all their houses
As I just commented on the Copyright story, I can see no use case where these technologies are of public benefit.
India's Ministry of Electronics and IT (MeitY) this week issued an advisory saying social media companies need to remove deepfakes from their platforms within 36 hours after they're reported. "It is a legal obligation for online platforms to prevent the spread of misinformation by any user under the Information Technology (IT …
I completely agree. Unfortunately, the genie is out of the bottle now, so legal action is the only way forward.
There are some, but the harm certainly seems to outweigh any benefits. I've read a couple of reports where a woman had an obsessed admirer turn an innocent Instagram pic into a nude and then share that around. Or, more disturbingly, people making CP from kids' photos. The CP is probably legally easier given legislation already includes language to cover art and emulation, but if it's other kids doing it for the lolz, they may not realise they're committing a very serious offence. In the Instagram story, one problem seemed to be figuring out what law, if any, had been broken. Per wiki-
https://en.wikipedia.org/wiki/Revenge_porn#United_Kingdom
Section 33 of the Act makes it an offence in England and Wales to disclose private sexual photographs and films without the consent of the individual depicted and with the intent to cause distress.
But if the image is a 'deepfake', it isn't a private photograph that ever existed in the first place. Wiki does, however, mention-
(more properly referred to by the terms non-consensual intimate imagery, NCII, or image-based sexual abuse, IBSA)
So the question is whether the UK Act should be amended to include NCII where the sexual photographs are completely made up, and therefore obviously without consent.
And I've seen other oddities. My gf showed me what happens if you search for her name with safe search off. The first couple of pages are porn pics, including one woman who looks similar and has a similar name. It also shows a bunch of nudes of women with different names entirely, so it's unclear why those appear in the search at all. There doesn't seem to be anything that can be done about it, other than trying to convince the search engines to show exact matches first, not 'fuzzy' matches.
But I'm also wondering if this takedown system could be abused. A politician or a celeb gets caught doing something embarrassing, declares it a 'deepfake', and issues a takedown notice. Content hosts aren't going to be in a position to verify, so the story could be disappeared. Any legislation should include DMCA-like penalties for issuing false takedown notices. For other cases like NCII, it should be possible to use image searches to find where copies are hosted and order a takedown (a rough sketch of that kind of image matching is below). But that would probably require more legislation, a statutory body or an organisation like the IWF, and content hosts to comply. Plus, this being the Internet, once something is 'out there', it's very difficult to eliminate.
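To illustrate what "use image searches to find where they're hosted" might look like under the hood, here's a minimal sketch of perceptual ('difference') hashing in Python, the sort of near-duplicate matching a takedown service could build on. This is illustrative only, not any real service's API; the file names and distance threshold are assumptions.

from PIL import Image

def dhash(path, size=8):
    # Greyscale, shrink to (size+1) x size, then hash which way brightness changes
    # between horizontally adjacent pixels; fairly robust to resizing/recompression.
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col] > px[row * (size + 1) + col + 1])
    return bits

def hamming(a, b):
    # Number of differing bits; a small distance suggests the same underlying image.
    return bin(a ^ b).count("1")

# Hypothetical usage: compare a reported image against a candidate found by search/crawl.
# if hamming(dhash("reported.jpg"), dhash("found_copy.jpg")) <= 5:
#     print("likely a copy - queue for takedown review")

In practice something like this would need to be far more robust (crops, mirrors, re-encodes), which is why bodies like the IWF rely on purpose-built hash-matching systems rather than a few lines of Python.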
If it's built from individual other images, technically speaking it could be categorised as multiple counts of the offence?
I'm.. not sure it would need to be; just amend existing legislation to cover deepfakes. So, using wiki to illustrate-
https://en.wikipedia.org/wiki/Revenge_porn
Revenge porn is the distribution of sexually explicit images or videos of individuals without their consent
Which seems a reasonable definition covering 'real' images, or fakes. Then add-
the terms non-consensual intimate imagery, NCII, or image-based sexual abuse, IBSA
To the UK's Section 33, with those terms defined. Other countries seem to have taken this approach in their legislation, and it would seem a simple amendment to more clearly capture 'deepfaked' images. The debate around the name is also interesting, eg-
Some academics argue that the term "revenge porn" should not be used, and instead that it should be referred to as "image-based sexual abuse."
But legislation often ends up with catchy short names to help publicise it. Personally, I think NCII covers pretty much everything and avoids possible arguments about what counts as porn or abuse.

The much greater problem is how to deal with images once they're out there. It's a subject I've been interested in for years, given I've photographed nudes. It's something I've always discussed with models, and my consent forms included an option to withdraw consent, but with the caveat that the ability to do so was limited. I mostly sold prints, so it was easy to stop selling prints of a model if they wanted, but with digital images, it's virtually impossible to take them down once they're online.
I agree the technology to do this should not be available to the public.
..but this definitely has military/intelligence applications, so it's not going away. As you say, the genie is well and truly out of the bottle.
The other point is, what do you do when genuine video is labelled a deepfake by the politicians caught in the act? I mean.. we can tell them apart now, but I'd say it's only a matter of time before they become indistinguishable.