Re: Am I the only one ...
This is a fair point, and whilst there is perfectly understandable outrage about the CSAM aspect, that shouldn't lessen the outrage about using this sort of system to remove the clothing from images of adults.
Quite frankly, if this software is being used to do things that humans can do, such as synthesising images of people without their clothes and without their consent, it should be subject to the same rules and regulations that a human would be.
It is a ridiculous situation where a person would (rightly) be hauled up in court for creating and publishing a fake nude of another person without their consent, but a machine can churn out tens of thousands of such images in the same time and nobody faces any consequences for it.
Put simply, if the purpose, by design, of Grok isn't to produce non-consensual explicit pictures of real people, why has it been trained on the data that allows it to do so?
The CSAM angle is a little different, because not only is it illegal to produce such images of real people, it is also illegal (in the UK at least) to produce such images of entirely fictional children. If safeguards cannot be put in place that are absolutely reliable and cannot be circumvented, and there are good reasons to believe this is actually impossible (analogous to the halting problem), then there is an argument that the entire technology is flawed, or at the very least that its operators should be held responsible for its use in a much more structured manner.
Oddly enough, though, I can't see a world where Elon Musk is held personally responsible for every illegal image his software produces, even though his grubby hands are all over the rigging of its algorithms.