India gives social media platforms 36 hours to remove deepfakes

India's Ministry of Electronics and IT (MeitY) this week issued an advisory saying social media companies must remove deepfakes from their platforms within 36 hours of their being reported. "It is a legal obligation for online platforms to prevent the spread of misinformation by any user under the Information Technology (IT …

  1. Neil Barnes Silver badge

    A plague on all their houses

    As I just commented on the Copyright story, I can see no use case in which these technologies are of public benefit.

    1. Pascal Monett Silver badge

      Re: A plague on all their houses

      I completely agree. Unfortunately, the genie is out of the bottle now, so legal action is the only way forward.

      Nothing like scaring Senators (or the Indian equivalent) to get a problem solved, eh?

      1. Jellied Eel Silver badge

        Re: A plague on all their houses

        I completely agree. Unfortunately, the genie is out of the bottle now, so legal action is the only way forward.

        There are some, but the harm certainly seems to outweigh any benefits. I've read a couple of reports where an obsessed admirer turned a woman's innocent Instagram pic into a nude, then shared it around. Or, more disturbingly, people making CP from kids' photos. The CP is probably legally easier, given legislation already includes language to cover art and emulation, but if it's other kids doing it for the lolz, they may not realise they're committing a very serious offence. In the Instagram story, one problem seemed to be figuring out what law, if any, had been broken. Per wiki-

        https://en.wikipedia.org/wiki/Revenge_porn#United_Kingdom

        Section 33 of the Act makes it an offence in England and Wales to disclose private sexual photographs and films without the consent of the individual depicted and with the intent to cause distress.

        But if the image is a 'deepfake', it's not a private photograph that ever previously existed. But wiki does mention-

        (more properly referred to by the terms non-consensual intimate imagery, NCII, or image-based sexual abuse, IBSA)

        So the question is whether the UK Act should be amended to include NCII where the sexual photographs are completely made up, obviously without consent.

        And I've seen other oddities. My gf showed me what happens if you search for her name with safe search off. The first couple of pages are porn pics, including one woman who looks similar and has a similar name. It also shows a bunch of nudes of women with different names, so why those appear in the search at all is anyone's guess. There doesn't seem to be anything that can be done about it, other than trying to convince the search engines to show exact matches first, not 'fuzzy' matches.

        But I'm also wondering if this takedown system could be abused. A politician or a celeb gets caught doing something embarrassing, declares it a 'deepfake', and issues a takedown notice. Content hosts aren't going to be in a position to verify, so the story could be disappeared. Any legislation should include DMCA-like penalties for issuing false takedown notices. For other cases like NCII, it should be possible to use image searches to find where they're hosted and order a takedown (a rough sketch of how that matching might work is below). But that would probably require more legislation, a statutory body or organisation like the IWF, and content hosts to comply. Plus, being the Internet, once something is 'out there', it's very difficult to eliminate.
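
        To illustrate the matching side: perceptual hashing is one way such a service could find copies of a reported image across hosts. This is only a sketch using Python's imagehash library - the real services like StopNCII use more robust hashes (PDQ, PhotoDNA), and the filenames and threshold here are invented for illustration:

            from PIL import Image   # pip install pillow imagehash
            import imagehash

            # Hash the image the victim reported (hypothetical filename)
            reported = imagehash.phash(Image.open("reported_image.jpg"))

            # Compare against images a crawler or reverse-image search turned up
            for path in ["candidate_a.jpg", "candidate_b.jpg"]:
                candidate = imagehash.phash(Image.open(path))
                distance = reported - candidate  # Hamming distance between hashes
                if distance <= 8:  # small distance = near-duplicate; threshold is a guess
                    print(f"{path} looks like a copy (distance {distance}) - flag for takedown")

        The point being that near-duplicates survive resizing and recompression, so a match doesn't require a byte-identical file.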

        1. GioCiampa

          Re: A plague on all their houses

          "But if the image is a 'deepfake', it's not a private photograph that ever previously existed."

          If it's built from other individual images, could it technically be categorised as multiple counts of the offence?

          1. Jellied Eel Silver badge

            Re: A plague on all their houses

            If it's built from other individual images, could it technically be categorised as multiple counts of the offence?

            I'm... not sure it would need to be; just amend the existing legislation to cover deepfakes. So, using wiki to illustrate-

            https://en.wikipedia.org/wiki/Revenge_porn

            Revenge porn is the distribution of sexually explicit images or videos of individuals without their consent

            Which seems a reasonable definition covering 'real' images, or fakes. Then add-

            the terms non-consensual intimate imagery, NCII, or image-based sexual abuse, IBSA

            To the UK's Section 33, with those terms defined. Other countries seem to have taken this approach in their legislation, and it would seem a simple amendment to more clearly capture 'deepfaked' images. The debate around the name is also interesting, eg-

            Some academics argue that the term "revenge porn" should not be used, and instead that it should be referred to as "image-based sexual abuse."

            But legislation often ends up with catchy short names to help publicise it. Personally, I think NCII covers pretty much everything and avoids possible arguments about porn or abuse. The much greater problem is how to deal with images once they're out there. It's a subject I've been interested in for years, given I've photographed nudes. It's something I've always discussed with models, and my consent forms included an option to withdraw consent, but with the caveat that the ability to do so was limited. I mostly sold prints, so it was easy to stop selling prints of a model if they asked, but with digital images it's virtually impossible to take them down once they're online.

    2. Pinero50

      Re: A plague on all their houses

      I agree the technology to do this should not be available to the public.

      ...but it definitely has military/intelligence applications, so it's not going away. As you say, the genie is well and truly out of the bottle.

      The other point is: what do you do when a genuine video is labelled a deepfake by the politicians caught in the act? I mean... we can tell them apart now, but I'd say it's only a matter of time before they become indistinguishable.

      1. katrinab Silver badge
        Black Helicopters

        Re: A plague on all their houses

        You can generate deepfakes on a 3080 Ti in about a minute. This is not something you can prevent the public from having.

        Or, of course, a skilled person could do it in Photoshop or similar.

    3. Zippy´s Sausage Factory

      Re: A plague on all their houses

      Except maybe recreating the missing episodes of Doctor Who, but that's the only use case I can think of.

  2. katrinab Silver badge
    Meh

    In the case of WhatsApp, surely you can see the phone numbers of the people who shared it?

    Then the correct person to approach for further information would be the phone company, not Meta?

  3. Snowy Silver badge
    Coat

    If deepfakes

    Are so good, how do you sort out what is fake from what is real?

    Another use for it would be to claim a compromising image/video is a deepfake and have it deleted?
