US AGs: We need law to purge the web of AI-drawn child sex abuse material

The National Association of Attorneys General, the body that all US states and territories use to collaboratively address legal issues, has urged Congress to pass legislation prohibiting the use of AI to generate child sex abuse images. In a letter [PDF] to the leaders of the Senate and the House of Representatives on Tuesday …

  1. thames

    Some wrong assumptions are being made

    Deepfakes and AI generated images are two completely different things. There should be no need to use live model child pornography to produce AI generated child pornography.

    All the creators should need are images of pornography models who are of legal age but have slender builds, and non-pornographic images of children. They could then use generative AI technology to blend the two together.

    I suspect that any law based on the premise that AI generated pornography depends on live model child pornography would fail in court if the defendant kept a good record of what data was used to train the model and could show that no live model child pornography was involved.

    I suspect that the US will have to address the issue either by having a law that says that if it looks like child pornography, then it's child pornography even if no children were involved (which is what some countries do), or else by dealing with what constitutes a derivative work when AI is involved, together with privacy laws that have some real teeth and can prevent images of children being used for unauthorized purposes. Then add on top of that a requirement for all image-related AI models to keep a record of the training material they used.

    In the end though, as AI oriented hardware acceleration becomes more mainstream I suspect that attempts to prevent illegal uses of it will become futile.

    1. Anonymous Coward

      Re: Some wrong assumptions are being made

      Even if privacy laws could be made strong enough, I doubt controlling the training materials can be sufficient. With enough advances it will certainly be possible to create NSFW content that just happens to look like specific targeted children, even if no picture of them were used at any point in the creation. I recall seeing at least one law that regulated NSFW drawings (probably in the context of the extensive Japanese industry), which specifically banned lifelike depiction of recognizable existing children, meaning that the end result is illegal no matter how it was created. That seems a more likely path.

      1. Anonymous Coward

        Re: Some wrong assumptions are being made

        Or you could be the Aussie government, and prevent porn stars from appearing in movies if they looked young, regardless of their actual age.

        1. CountCadaver

          Re: Some wrong assumptions are being made

          Ditto the UK, where a lot of production moved to more "liberal" jurisdictions after it effectively banned bondage material, as the model has to have at least one limb free at all times.

          So many companies just went to the Czech Republic.

    2. jmch Silver badge

      Re: Some wrong assumptions are being made

      "the US will have to address the issue by either having a law that says that if it looks like child pornography, then it's child pornography"

      I think that this is, de facto, already the case. Possession and distribution of child pornography is illegal, and since I'm pretty sure the law makes no distinction as to whether it was AI-generated, AI-generated child porn is already illegal (so I'm not sure why additional legislation is required). If any such case came to trial, it would not be enough for the defendant to prove that no actual children were used in the training material; they would have to make a tricky legal argument about the definition of child porn. I suspect that the law would come down on the side of 'if it walks like a duck and quacks like a duck, it's a duck'.

      1. Anonymous Coward

        Re: Some wrong assumptions are being made

        In the UK, if you were to simply hand-sketch something that looked like a naked child created entirely from your imagination, you'd probably be committing a crime, even if nobody ever saw it.* So I suspect that AI-generated images of that nature are already illegal here.

        *This does of course raise the possibility of tree-falls-in-forest arguments, and I suspect it verges on the concept of thoughtcrime. However, it's hard to see how else it could be done.

        1. jmch Silver badge

          Re: Some wrong assumptions are being made

          "tree-falls-in-forest arguments"

          I love the "does it make a sound if a tree falls in the forest" example because it cuts to the heart of how humans discern reality:

          Yes >> reality is basically objective

          No >> reality is basically subjective

          There is no 'correct' answer to the question, the point is in what your answer tells you about yourself

        2. katrinab Silver badge

          Re: Some wrong assumptions are being made

          I know for sure that if you attempt to import a child-sized sex doll to the UK, that is extremely illegal and you will be jailed for a very long time.

          1. garwhale

            Re: Some wrong assumptions are being made

            And if it was a depiction of a person of restricted growth of advanced age instead of a child?

        3. garwhale

          Re: Some wrong assumptions are being made

          Nakedness and abusive child porn are not the same, otherwise e.g. mirrors would be banned.

      2. Anonymous Coward

        In the USA

        From the top level, the system was geared toward actual pictures of actual people. Derivatives beyond distortion, cropping, or cover-ups aren't really handled or all that clear. Some local jurisdictions have stronger interpretations, and have also been slapped down by the higher courts on occasion over images not based on photo/video or other depictions. Traditionally drawn or painted material was considered distasteful but not inherently illegal. There are plenty of other truck-sized loopholes, including prior exceptions for film and television, much of it originally from international markets, including the Romeo and Juliet that was all over the news a few months ago when the child actors sued.

        All that may change if generative image tools make realistic enough images and they change the rules. While I don't think this should be just left to individual sites content moderation, overzealous application may be hard to avoid, and the consequences for pushing artistic or social boundaries may be dire when the new tools get intentionally misapplied to dissenting speech or art.

        One saving grace is that this is one case where the ML tools can help with content moderation, as "not-hotdog"-style classifiers are reasonably good at catching look-alikes. It will have a bigger impact on legal adult performers who look underage, who may need a legal content registry to prevent over-blocking and legal action, but that has already been a problem for them anyway, at least going by interviews over a couple of decades.

        It may be better if we rush to improve content moderation tools to keep the stuff from popping up everywhere, and walk a little more slowly in working out legislative changes.
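        The classifier-plus-registry idea above can be sketched in broad strokes. Everything here is hypothetical: the thresholds, the `triage` function, and the registry are illustrative inventions, assuming only a classifier that returns a score in [0, 1]; real moderation systems are far more involved.

```python
from dataclasses import dataclass

# Hypothetical triage thresholds; a real system would tune these against
# labelled data and route uncertain cases to human reviewers.
BLOCK_THRESHOLD = 0.9
REVIEW_THRESHOLD = 0.5

@dataclass
class Decision:
    action: str   # "allow", "review", or "block"
    reason: str

def triage(score: float, content_id: str, verified_registry: set[str]) -> Decision:
    """Decide what to do with an image, given a classifier score in [0, 1].

    A registry of verified-legal content (e.g. performers with confirmed
    age documentation) short-circuits the classifier, addressing the
    over-blocking problem mentioned above.
    """
    if content_id in verified_registry:
        return Decision("allow", "in verified legal-content registry")
    if score >= BLOCK_THRESHOLD:
        return Decision("block", f"score {score:.2f} above block threshold")
    if score >= REVIEW_THRESHOLD:
        return Decision("review", f"score {score:.2f} needs human review")
    return Decision("allow", f"score {score:.2f} below review threshold")

registry = {"studio-123/clip-9"}
print(triage(0.95, "studio-123/clip-9", registry).action)  # registry overrides score: allow
print(triage(0.95, "unknown-upload", registry).action)     # block
print(triage(0.60, "unknown-upload", registry).action)     # review
```

        The point of the sketch is the ordering: the registry check comes before any score comparison, so a false positive from the classifier can never block verified content.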

    3. lowwall

      Re: Some wrong assumptions are being made

      The US Supreme Court directly addressed this issue in 2002. They ruled that a federal law that made creation or possession of "virtual pornographic images" of children a crime was unconstitutional.

      They did allow one portion of the law to stand. This was the part that covered deepfake-type images, or as the ruling described it, "Rather than creating original images, pornographers can alter innocent pictures of real children so that the children appear to be engaged in sexual activity."

      You can read the decision, or the Wikipedia entry on it, for the details.

  2. johnrobyclayton

    There is a question to ask here

    Do we want to prevent any form of exploitation of children?


    Do we want to prevent any form of enjoyment of sexual pleasure from material that anyone else perceives as representing children?

    The first is clear and easy to concretely describe.

    Laws prohibiting the first are relatively easy to frame, with little risk of unintended consequences.

    Laws prohibiting the second are open to almost infinite feature creep, and framing them is almost certainly going to have unintended consequences.

    The technology used for digitally aging images of a missing child, so that they can be identified years after they disappeared, has also been used to digitally reverse the aging of adult entertainers to produce what appears to be child pornography.

    No children exploited here. This has been around for years.

    Drawing cartoons of children to produce what appears to be child pornography. There are lots of artists who can draw a picture of what looks like a child without needing a child as a model.

    No children exploited here. This has been around for millennia. I am sure there were a few statues in ancient Greece of individuals we would identify as underage, carved without the need for a model.

    Drawing a cartoon of a bandicoot, or a fox with a flying tail, of indeterminate age but resembling a character in popular children's entertainment, might produce something that can be experienced as child pornography.

    No children exploited here, though there might be some copyright infringement. There is a surprising amount of such content available.

    Creating a customizable doll.

    Lots of dolls and action figures with bendable and pose-able limbs.

    Lots of dolls/action figures whose configuration can be changed. Mr and Mrs Potato Head.

    Lots of dolls/action figures that can simulate a number of biological behaviours.

    Wetting themselves, sucking on a bottle, crawling, walking, speaking phrases, speaking context aware phrases, full on human language interface.

    Writing a story about such a customizable doll/action figure.

    Drawing a cartoon or creating animation about such a doll/action figure. Astro Boy.

    Creating pornography about such a doll/action figure that is brand new, made from whole cloth.

    There is no child exploitation here, but there is an almost infinite variety of pornography that can be imagined and produced.

    The doll/action figure/android/robot does not even have to look human for someone to perceive them as something that is recognizable as a child.

    There is simply no limit on what can be objectionable just as there is no limit on what people are capable of enjoying.

    There is an indefinite variety and an indefinite count of individuals who are capable of enjoying experiences that can be achieved through various forms of child exploitation. The only reason it is not infinite is that the population itself is not infinite.

    This does not mean that these individuals necessarily need to exploit any children directly to enjoy these experiences.

    Distracting ourselves by trying to detect, track, legislate against, prosecute, and incarcerate all of these various types of people and activities will dilute our focus on protecting the children in our care (that's all of them).

    Going after this infinite variety of people is low-hanging fruit for legislators and law enforcement bodies. It allows them to crow from the rooftops that they are working hard to protect everyone from everyone else who might cause outrage, and from an infinite variety of outrages. They can do this using easily applied technical tools to track, trace, and spy on everyone.

    Focus on the hard work of protecting children directly instead of sticky beaking on everyone in the hope of finding something that outrages someone.

    1. Dinanziame Silver badge

      Re: There is a question to ask here

      I noticed the words "evil desires" from that letter are really very close to "thoughtcrime".

    2. Anonymous Coward

      Re: There is a question to ask here

      TBH, as tech marches on, animated hentai will make pretty much all live-filmed youth porn irrelevant.

      Possibly CGI animation will make all porn irrelevant apart from live streaming - and who will know in 20 years, when it passes a porn Eliza Test?

      1. Anonymous Coward

        Re: There is a question to ask here

        "She's actually a 200-year-old vampire"

    3. Anonymous Coward

      Re: There is a question to ask here

      Define 'child.' Do you mean under 16, under 18, or under 21?

      1. Anonymous Coward

        That's how they get you

        And of course, add to the matrix "looks like" vs "actual age" vs "not based on an actual person".

        There are tons of huge problems in the edge cases all around these issues.

      2. katrinab Silver badge

        Re: There is a question to ask here

        The legal definition in the UK is under 18.

    4. Long John Silver

      Re: There is a question to ask here

      Yes, you speak sense.

      The bottom line in dealing with the making and distribution of pornographic images supposedly of children rests with preventing actual children from being sexually exploited. That is, success ought to be measured in terms of real children rescued from harm, and of perpetrators being caught before they can do more harm to other real children.

      The Internet is seemingly awash with still and moving images of alleged child sexual abuse. Undoubtedly, many of these date back a long time; some still images originated in the Victorian era. "Alleged" was used in the previous sentence because mores have changed over time and by place; for instance, photos taken by Charles Lutwidge Dodgson (aka Lewis Carroll) of his friend's naked pre-pubertal daughter would, if taken nowadays in Europe or the USA, cause outrage and likely imprisonment for Dodgson.

      Law enforcement and enthusiastic moralists deem catching people in possession of illegal images a success in its own right. It matters little whether the images are recent or whether there is an extant child to be 'saved'. Generally, these are easy catches. Here an analogy to illicit possession of drugs bears mention. Very few people import, manufacture, or sell illegal drugs. The easiest people to prosecute are those found in possession of drugs for personal use, and others just above them in the distribution chain. A fragile argument sometimes put forth is that catching end-users and minor distributors will smash the trade and force its kingpins to turn to other activities.

      That argument is carried over to possession of illegal images. It may be true that a tiny proportion of people possessing such images 'made them' in the common-sense usage of 'made', rather than the meaning enshrined in British law: that is, they took the photos or directly persuaded another person to take them. Inevitably, this approach will lead to capturing some direct offenders and to saving some currently abused children. A weaker argument, not backed by evidence, is that mere possession fires up enthusiasm to become an originator of new images. By strange logic, the seriousness of the offence of simple possession is deemed proportional to the number of images discovered.

      The above approach engenders 'opportunity cost'. Police officers and moral crusaders get a pat on the back for 'successes' involving mere possession unlinked to commercial transaction. That gives little incentive to delve more deeply into production and supply chains for new materials. Put differently, it's an approach which can offer a yield of children to protect, but it likely touches only the tip of an iceberg.

      Adding AI generated images to the pool of illegal making/possession crime serves only to further dilute effort which actually results in children being protected. Unfortunately, moral indignation can too easily lead to abandoning a hard-headed approach in favour of simpler means which produce seemingly impressive, yet often empty statistics.

    5. bombastic bob Silver badge

      Re: There is a question to ask here

      To summarize, exploiting actual children needs to be punished with extreme prejudice to the fullest extent of the law.

      Prosecution for anything else that does NOT exploit an actual child is essentially "THOUGHT POLICE" and must NEVER happen, regardless of any professed "moral outrage"

      That's my take on it, at any rate.

  3. chuckufarley Silver badge

    It's easier to regulate and mandate...

    ...teaching ethics and logic to school children than it will ever be to regulate and mandate morality. People will not stand for being told that they are immoral, because they have so many logical fallacies to fall back on.

    1. Anonymous Coward

      Re: It's easier to regulate and mandate...

      You just need to look at the rise of the MAP to see this very thing happening.

      1. veti Silver badge

        Re: It's easier to regulate and mandate...

        I googled "the MAP" but all I got was maps. What are you talking about?

        1. Anonymous Coward

          Re: It's easier to regulate and mandate...

          Minor Attracted Person. It is newspeak.

          1. bombastic bob Silver badge

            Re: It's easier to regulate and mandate...

            At some point the 'M' for 'MAP' will be added to the LGB alphabet soup (if things continue on the current path), much to the chagrin and outrage of the 'LGB' portion (who just want to be left alone to live their lives, consenting adults, etc.). The radicalized agenda drove things WAY past common sense long ago, and everyone who disagrees with this new added letter (or is outraged by it) will be labeled a 'bigot' or some kind of '-phobe'.

            (I'm sure pointing this out is potentially offensive to many, but so is having an agenda crammed at us, so SOMEONE had to say what many are thinking, and take the arrows).

            That being said, the "moral outrage" crowd that would play "THOUGHT POLICE" with art that does NOT exploit actual children may actually end up signing onto the radicalized agenda behind changing the word 'pedophile' to 'MAP'. Do NOT doubt me! (Rush Limbaugh used to say things about this topic a LOT, particularly regarding organizations like NAMBLA, who have been around for a LONG time)

            (sad face because nobody really wants the 'M' for MAP in there)

  4. Norman Nescio Silver badge

    Simple solution

    Presumably, all one needs to do is have a government mandated prompt to tell the generative AIs not to generate illegal things? [fx: Disingenuous smile]

    Possession of a non-government approved AI could become a felony (just like non-government approved encryption). Someone's next novel could be about AI bootleggers.

    It is a tragedy that the Venn diagram of rational adults and politicians does not intersect.

  5. DJO Silver badge


    Consistency?

    ...Ohio's Attorney General Dave Yost said in a statement ... "A society that fails to protect its children literally has no future," he said...

    Except where guns are concerned.

    Cue downvotes from leftpondians.

    1. doesnothingwell

      Re: Consistency?

      Being from Ohio, I can say that this is just grandstanding for the "permanently outraged" to get votes. Gun control would save more children; so would chasing the producers of CP (the FBI distributes it to trap users). The authorities are all outraged, but not enough to actually stop it - just enough to keep their jobs chasing it with more "control vs. privacy" rhetoric. Once the public is fitted with mind-control helmets, what will they be outraged over then?

  6. Torben Mogensen

    Violence in films and games?

    There has been a long debate on whether seeing violence in films and games (where no actual violence towards people or animals has happened) makes "impressionable" people more likely to commit acts of violence. So far, there has been no evidence that this is true -- except for questionable studies that look at people who have committed violence, note that they have seen such films and played such games, and conclude that this is why they did what they did, without considering other possible causes. The causation might very well be the inverse.

    So simply assuming that watching AI-generated child porn would make people more likely to commit real-life abuse is questionable, and making laws on such an assumption even more so. It could even be that "evil desires" could be sated by AI-generated images. After all, the number of voyeur cases dropped after porn became legal.

    1. Anonymous Coward

      Re: Violence in films and games?

      Much less questionable if you look at actual research based on CSAM, and not solely on low-quality studies from another discipline and research area.

      Fetishization rarely happens in a vacuum, and the massive increase in fake incest porn in the industry was already worsening social problems. The closest parallel to your example is that some consumers of CSAM do not directly assault children themselves, but in most cases they become more likely to the more material they are exposed to, and the more often. That also leaves out the harm to the children in the original material.

      While it might be possible to train an "ethically sourced" GAN to produce realistic fake CSAM, who would police that? Does anyone believe the bottom feeders will actually do that? Or will it be cheaper for them to build one on real CSAM and use the tech to cover their tracks and make it harder to block?

  7. jmch Silver badge

    Not sure I agree with the reasoning here...

    I'm not sure that the stated aim of reducing harm to children matches the policy request to make AI-generated child porn illegal.

    Starting by stating the obvious: kiddie porn, and any and every related child abuse, is despicable, awful, and to be fought with every proportionate* means necessary. I'm not sure there is any proper psychological research into whether paedophilia is intrinsic like sexual orientation, or learnt, or, if it's a bit of both, whether there is a dominant cause. If paedophilia is more of an intrinsic behaviour, no amount of making stuff illegal is going to reduce demand, in which case AI-generated kiddie porn created without using any real kiddie-porn source material** is preferable to actual kiddie porn with actual kids. If it's more of a learned behaviour, increased access to kiddie porn might lead to more potential child abusers. Without that psychological research data, it's difficult to decide which way to go.

    Secondly, in most jurisdictions it's already illegal to have or distribute child porn, whether it's AI generated or not**, so why is additional legislation needed??

    *universal government access to everyone's communications at a whim is *not* proportionate

    **and in this case, how does anyone know that it's AI-generated??

    1. David Hicklin Bronze badge

      Re: Not sure I agree with the reasoning here...

      >> so why is additional legislation needed??

      It's the same as in all cases where "additional laws" are produced even though existing laws already cover things - it shows that someone is doing something. Never mind about actually enforcing them!

      1. martinusher Silver badge

        Re: Not sure I agree with the reasoning here...

        Based on the way the need to take measures to suppress the tide of kiddie porn is being framed, I'd venture to suggest that it's not about kids or porn at all. It's all about control.

        Standard disclaimer here -- I'm not into kiddie porn, I don't agree with it, and so on. It is a bit of a minority taste, so much so that I didn't even realize it existed until various righteous groups started kicking up a fuss about it 20-30 years ago. At the time we were also mid "War on Drugs", so I figured it was just 'more of the same'. No real solutions offered, just a call for more control over other people's lives. (Because, as I've said over and over, restricting one class of information is a way to prototype the tools needed to restrict any class of information.)

      2. 43300 Silver badge

        Re: Not sure I agree with the reasoning here...

        Recent UK governments have been particularly keen on this - quite a few laws produced for very specific things which were well within the remit of existing laws.

    2. Dinanziame Silver badge

      Re: Not sure I agree with the reasoning here...

      Secondly, in most jurisdictions it's already illegal to have or distribute child porn

      Well, no — not if it's a cartoon. The reason child porn movies are illegal is that real children must be hurt to produce them. Producing a cartoon does not hurt children, so child porn cartoons are legal in most countries (with the notable exceptions of the UK and Australia). This has so far been an acceptable situation, given that cartoons are obviously different from reality. But with AI it will be possible to create photorealistic child porn without hurting children, and the goal of the new law is to make this content illegal all the same.

  8. Norman Nescio Silver badge

    Psychological studies

    There's a worthy goal here of wishing to decrease the number of abused minors, so even if we don't like the methods, the goal seems entirely reasonable.

    Arguing by analogy is fraught with difficulties, as sometimes the analogy can be invalid for subtle or unknown reasons, but even so, I'll point out some results in a related field.

    People can have strong feelings about legal pornography, with some regarding much of it as inherently misogynistic and reinforcing unequal gender stereotypes. This is not that argument.

    What has been investigated is a link between the incidence of male-on-female rape and the availability of pornography. The topic is controversial.

    Psychology Today: 2016-01-14: Evidence Mounts: More Porn, Less Sexual Assault - Those who claim that porn incites rape are mistaken.

    A criticism of the above article was made, citing research that supported the opposite conclusion, and the response was: Psychology Today: 2017-07-02: More Porn, Less Rape? The Controversy Revisited - Does porn cause rape? It depends on the study. But one type is more credible.

    There are other studies:

    Science Daily: 2010-11-30: Legalizing pornography: Lower sex crime rates? Study carried out in Czech Republic shows results similar to those in Japan and Denmark

    International Journal of Law and Psychiatry: 1991: Pornography and rape: Theory and practice?: Evidence from crime data in four countries where pornography is easily available

    So there is a reasonably plausible hypothesis that availability of realistic fictional depictions of abuse of minors is likely to reduce the incidence of actual abuse of minors. If the goal is harm reduction, and a lower incidence of abuse of minors, then such a policy could be evaluated against other methods for efficacy. Some people find such an approach to be morally repugnant - as somehow condoning abuse of minors, or possibly encouraging people to move from fiction to reality. In the latter case, a lot of work has been done looking at the effect of violent video games on people's propensity to commit violent acts. I believe the evidence is equivocal, at best.

    I'm not a psychologist, and by no means an expert. I think evidence-based policies for reducing abuse of minors would be a good thing. Research in this area is difficult. If another poster can cite good evidence that my belief/opinions are wrong, I'll happily be corrected.

    Abuse of minors is wrong. Evidence-based methods of reducing it ought to be evaluated. The politician's syllogism of "Something should be done. This is something, so it should be done." is an unreliable tool.

    1. NiceCuppaTea

      Re: Psychological studies

      "I think evidence-based policies for reducing abuse of minors would be a good thing."


      I think evidence-based policies for all law would be a good thing.

      Shame most governments think evidence and reality are optional...

    2. Anonymous Coward

      Re: Psychological studies

      There is a crack in these studies as you are framing them, which is that there are two cohorts of concern, not just one made up of the general public.

      One group is people prone to developing a fetish or paraphilia; the other is a group that has already developed one.

      The distinction is critical, as the two populations are vastly different sizes. So if mass exposure and free availability ups the conversion rate of possible pedos in the first cohort to 50%, that may mean millions of potential abusers. (Cutting the material off won't stop it either, even if that were possible - look at the blue dots on the federal sex offender registry.)

      The fact that the second cohort, people who have already gone to the dark side, may reduce their offending rate slightly won't counter the harm caused by the scale of the increased number of offenders, and the harm caused to victims in producing "real" CSAM to feed the profitable market created.

      There is no daylight in the math for "harm reduction" legalization of ethical CSAM. It's only a net harm to society; the only real question is where we balance the impact on civil rights vs total harm.

      Harm reduction means reducing the prevalence of the sexualization of minors, which is a problem that goes far beyond the porn industries.

      Sadly, and much worse, zero harm isn't realistic or possible: despite nearly universal cultural taboos, these problems existed before the internet, or even cameras.

  9. SonofRojBlake

    Politicians and evidence

    Politicians are interested in evidence with regard to their policies.

    Unfortunately, their interest is not : "Does the evidence show that the policy will work?"

    Their interest is : "Does the evidence show that bringing in this policy will mean people will vote for me?"

    And in fairness, that's a rational choice for them.

  10. Alan Mackenzie

    There's a bit of a contradiction here.

    If no actual children are involved in the production of AI child porn, then it isn't "child sexual abuse material". There is no rational ground to prohibit this stuff. The actual reason would appear to be a desire to make life difficult for people who are (supposedly) different from the norm.

    It must be very difficult, being a paedophile (by which I don't mean child abuser). Now that technology looks like being able to satisfy, to some extent, the sexual urges of these people without hurting others, lawmakers are looking to ban this use of technology.

    It seems clear that the motivation of these lawmakers is not to protect children, but to punish paedophiles for the "crime" of simply being. In so doing, they can only be harming real children, not protecting them.

    These lawmakers need to get a grip on reality.

  11. katrinab Silver badge

    "The National Association of Attorneys General argued that such material is not victimless, as tools capable of generating such images was likely trained on actual CSAM"

    I doubt it. More likely it was trained on adult nudes from the likes of OnlyFans, which does not publish CSAM, and photos of fully clothed children from the likes of Facebook, which also doesn't publish CSAM, and it has combined the two to produce it.

  12. CountCadaver

    Already illegal in the UK

    Heck, even those lewd Simpsons cartoons are already illegal to possess, as they come under "images of or appearing to be that of a child" (or words to that effect). Some Labour member of the House of Lords even wanted to start censoring any written mention in any context.

  13. faibistes

    "One day in the near future, a child molester will be able to use AI to generate a deepfake video of the child down the street performing a sex act of their choosing,"

    So we want a future where that child molester chooses to physically molest that child instead?

    It's been shown again and again that more porn = less sex crime.
