Just say no to NO FAKES Act, EFF argues

The Electronic Frontier Foundation (EFF) says the revised version of the NO FAKES Act, reintroduced in April, would be a disaster for free speech and innovation if signed into law. The bill [PDF] aims to address valid concerns about AI-generated audio and visual replicas of people in the wrong way, the EFF argues, by …

  1. Jou (Mxyzptlk) Silver badge

But privacy and people's rights don't earn money!

For the USA, money is the only god. Not people, not infrastructure, not living conditions. Therefore, especially right now, people and privacy are not the concern. It's not just the movie "They Live": ask any US-American who has lived outside the US bubble for at least half a year how their view changed when they came back, and how they experienced the reverse culture shock.

BTW: You can ask soldiers too, though depending on where and why they were deployed, some of them are not lucky enough to get that outside-US view.

I hope the EFF can topple it, 'cause 'murica is still the "everything is possible" country and can still surprise.

    1. ecofeco Silver badge

Re: But privacy and people's rights don't earn money!

For the USA, money is the only god.

      NEVER forget this.

  2. chuckufarley
    Joke

    IF...big if...

...people in the US get the privacy protections that should be a human right for all people, it would upend the entire economy. So I don't think the EFF has much of a chance of changing minds here. Not that I think they are wrong in principle, but they are fighting the war in the wrong way. EFF, go buy some property rights, then abuse them because you can. Lead by example.

  3. doublelayer Silver badge

    Does privacy cover this?

The EFF's complaints about what's in this bill are quite correct. Vague statements about what is covered make this law dangerous, and virtually unlimited requirements to take down almost anything whenever almost anyone asks are a perfect recipe for bad consequences. Focusing on the existence of tools that could theoretically be abused is looking in the wrong direction when focusing on the abusive uses themselves is more important, especially when the tools have legitimate uses. They're not wrong about what this law does badly.

However, I have to wonder if they are wrong about what is needed to accomplish this goal. Privacy legislation would be very welcome, but I'm not convinced it would solve this problem. It might prevent a company from getting a bunch of pictures of someone without consent and then making a fake avatar of them, but there are lots of ways of doing that which wouldn't be covered. The problem is that AI companies have acted as if most laws don't apply to them, and unfortunately, they've just gotten a judge to say that they can use data for whatever they want, with the only proviso being that they had to obtain it legally. The fact that this counts as a partial win against AI companies speaks volumes; normally, obtaining something illegally would be an obvious problem, and nobody would need to question that part of it in the first place.

So let's consider how a company could legally obtain pictures of someone who doesn't consent to them being used to create a fake video of them. Several really easy routes come to mind. Maybe they're an extra or an actor who agreed to appear on camera, but not to having that footage used for literally anything later on. Or what if they posted video or photographs of themselves to a public social media site? In both cases, they voluntarily chose to put their likeness online, and that would remove most of the privacy protections around it. Do we really want that to be the only bar standing between someone and using that data without consent in such an invasive way? To prevent that, a privacy law would have to specify that any subsequent use of the data had to be agreed to beforehand, and that any data produced in violation of that had to be removed. In other words, the privacy legislation that would fix this problem looks quite a lot like this legislation does.

    If the EFF is so confident that some other law can fix this problem, they need to do a better job of explaining how. Until then, they only have half of an argument, which makes it a lot easier for anyone motivated to dismiss and discredit them.

    1. kmorwath

      Re: Does privacy cover this?

Good privacy laws do protect against that, because even if your image is published somewhere, you have to opt in for those kinds of uses. For example, in most jurisdictions you can't take a photo of someone in the street and sell it without a "model release". See how Facebook was stopped from using users' posts to train its AIs in Europe.

      1. doublelayer Silver badge

        Re: Does privacy cover this?

Not all of that is related to privacy legislation, and some of it is not correct. For example, the existing laws that forbid you from using such pictures are restrictions on what can be done with a "likeness", which is much more similar to the "property right" the EFF is complaining about than to privacy, since privacy would forbid the taking of the picture in the first place, rather than just some possible later uses of the picture's content.

And Facebook not using user posts in the EU... that ended, and the regulators are fine with it. There may be a chance that NOYB will get a judge to block it, but it's not looking great so far. Maybe a better privacy law could do what the GDPR is failing to do, but this is the US, which has nothing other than a few patchwork laws in a couple of states, laws with large carve-outs that are not well enforced in the first place. If they had privacy legislation that covered this, or that even required permission similar to a model release, then the EFF would have a point that existing laws covered it. Since those don't cover it, the EFF's argument has a hole in it, and I wonder if it might work better if they suggested removing the dangerous parts of the law rather than abandoning it entirely. The protection this law is written to provide is necessary and not available elsewhere.

That goes for other countries as well, where existing privacy legislation will probably not be sufficient to deal with AI training on personal data. The logic Facebook successfully used to get training on text allowed under the GDPR is the same logic that would be used to justify training on images. We can hope that a court decides that's a big enough problem to block it, or we could make it official and not gamble on the result.

  4. kmorwath

    "some people's rights are worth more than others"

Exactly what they want. Of course politicians, big company executives, and their jesters (actors, singers, etc.) need to have more rights than the "commons".

A privacy law would protect everyone, not just the patricians, and would of course make a dent in the profits of Google, Facebook, Microsoft, Apple, Amazon... A law that only a few can afford to be protected by won't.

  5. ecofeco Silver badge

Conservatism consists of exactly one proposition... There must be in-groups whom the law protects but does not bind, alongside out-groups whom the law binds but does not protect.
