Does privacy cover this?
The EFF's complaints about what's in this law are well founded. Vague statements about what is covered make it dangerous, and a virtually unlimited requirement to take down nearly anything whenever almost anyone asks is a perfect recipe for bad consequences. Focusing on the mere existence of tools that could theoretically be abused looks in the wrong direction when the abusive uses themselves are what matter, especially since those tools also have legitimate uses. They're not wrong about what this law does badly.
However, I have to wonder if they are wrong about what is needed to accomplish this goal. Privacy legislation would be very welcome, but I'm not convinced it would solve this problem. It might prevent a company from collecting a bunch of pictures of someone without consent and then making a fake avatar of them, but there are plenty of ways to do that which wouldn't be covered. The problem is that AI companies have acted as if most laws don't apply to them, and unfortunately, they've just gotten a judge to say that they can use data for whatever they want, with the only proviso being that they obtained it legally. That this counts as a partial win against AI companies speaks volumes: normally, obtaining something illegally would be an obvious problem, and nobody would need to litigate that part in the first place.
So let's consider how a company could legally obtain pictures of someone who doesn't consent to their being used to create a fake video of them. Several easy scenarios come to mind. Maybe they're an extra or an actor who agreed to appear on camera, but not to having that footage reused for literally anything later on. Or maybe they posted video or photographs of themselves to a public social media site. In both cases, they voluntarily chose to put their likeness online, and that removes most of the privacy protections around it. Do we really want that to be the only bar preventing someone from using that data, without their consent, in such an invasive way? To prevent that, a privacy law would have to specify that any subsequent use of the data must be agreed to beforehand, and that any data produced in violation must be removed. In other words, the privacy legislation that would fix this problem looks quite a lot like this legislation.
If the EFF is so confident that some other law can fix this problem, they need to do a better job of explaining how. Until then, they only have half an argument, which makes it much easier for anyone motivated to dismiss and discredit them.