At what point do artificial images become "wrong"?
Few would argue that possession of cartoon-level drawings of child porn (think Simpsons-style characters) deserves a long prison stay. All but a few would argue that real-life images of child porn most certainly do. Where in between is the line where you cross from one side to the other?
So what if AI is used to generate real-looking but completely fake images, not based on any real child? Assuming the perp is only looking at them privately to satisfy his own urges, and never showing them to minors or attempting to "groom" children, should that be illegal? I suppose some will argue that it will lead to abusing real children in some cases, and that's almost certain to be true, in some cases. But I'm a little uncomfortable with the idea of treating it as a crime on the same level as possession of real child porn. I mean, it's legal to make and sell porn with 18-year-old girls who look 13 - and that look is the primary attraction for those consuming it - and I'd argue that's worse than anything AI may generate.
It's certainly something we're going to have to figure out, as we can't be far from AI-generated images being indistinguishable from real ones, and not too many years from AI-generated video being indistinguishable either.