Child psychiatrist jailed after making pornographic AI deep-fakes of kids

A child psychiatrist was jailed Wednesday for the production, possession, and transportation of child sexual abuse material (CSAM), including the use of web-based artificial intelligence software to create pornographic images of minors. Prosecutors in North Carolina said David Tatum, 41, found guilty by a jury in May, has …

  1. DS999 Silver badge

    At what point do artificial images become "wrong"?

Few would argue that possession of cartoon-level drawings (like Simpsons characters) of child porn deserves a long prison stay. All but a few would argue that real-life images of child porn most certainly do deserve a long stint in prison. Where in between is the line where you go from one side to the other?

So what if AI is used to generate real-looking but completely fake images, not based on any real child? Assuming the perp is only looking at them privately to satisfy his own urges, and never showing them to minors or attempting to "groom" children, should that be illegal? I suppose some will argue that that will lead to abusing real children, and that's almost certain to be true in some cases. But I'm a little uncomfortable with the idea of treating it as a crime on the same level as possession of real child porn. I mean, it is legal to make and sell porn with 18-year-old girls who look 13 - and that's the primary attraction for those consuming it - and I'd argue that's worse than anything AI may generate.

It's certainly something we're going to have to figure out, as we can't be far from AI-generated images being indistinguishable from real ones, and not too many years from AI-generated video being indistinguishable.

    1. Bebu
      Big Brother

      Re: At what point do artificial images become "wrong"?

      "The trial evidence cited by the government includes a secretly-made recording of a minor (a cousin) undressing and showering, and other videos of children participating in sex acts."

      In this case the AI generated material was incidental and arguably could have been omitted with the court still bringing in the same verdict and applying the same sentence.

      The question of how far does a liberal society wish to go in censoring and criminalizing purely privately generated images is a reasonable one. What general principles and rationales should apply?

Move away from this obviously extremely sensitive context to apparently stupid questions: should privately generated animations of Max Headroom 'doing' Shrek's donkey, leaving aside copyright issues, be a criminal act? (AFAIK bestiality is generally illegal.)

Obviously ridiculous, and I imagine most would see this portrayal of Max's activities as a relatively harmless, if tasteless, waste of time. This poses the question of why this situation is apparently trivial while others would draw the ire of the full force of the state. Hypotheses non fingo.

As an example I would not use the Simpsons, as the internet has apparently already generated everything the depraved mind might conceive.

Just about everything that could potentially contribute so much to our world has been perverted and now befouls our lives. Someone has coined a very apt word for this process of degradation, viz. "enshittification".

      1. Anonymous Coward
        Anonymous Coward

        Re: At what point do artificial images become "wrong"?

        > Assuming the perp is only looking at them privately to satisfy his own urges, and never showing them to minors or attempting to "groom" children, should that be illegal?

        Posting anon, because I have been on the sharp end of this debate.

        When I was about 14, I stumbled upon illegal images on 'Kazaa' - my friends were using it to pirate games/music/etc, and I used it to look for porn. I was in an all-boys school, I was looking at girls my own age, and I didn't appreciate how wrong it was.

        It formed a habit, and although I eventually realised that I needed to stop it, I would find myself going back to it in periods of depression, like a gambling addiction.

        I have never ever paid for, shared, discussed, produced or ever tried to get close to children, and to my autistic brain, that seemed like it was OK, I was not harming anyone, except perhaps myself.

        But of course when I was caught, it caused massive harm to myself, my family, my friends, my employer, and anyone who has ever been a child victim of such abuse sees yet another news story and will wonder if they were among the images I viewed.

        It is not so much the act, it is the thought (which the act proves the existence of) which disgusts and offends people. Even where no actual harm was caused e.g. with an AI-generated image, you will have indirectly caused harm (mostly to yourself and those associated with you), and you will have amplified the fear and hate that was already there in society.

        The Chinese call overuse of computer games etc. "Spiritual Opium", but this is more like Spiritual Arsenic, or Spiritual Fentanyl. It is also a societal allergen, that will label you as a threat to the organism, and the organism may overreact so violently that it damages itself. It is similar to the battles between certain religious and ethnic groups and those who fear terrorism.

        Is that right and just? Not really, but the only advice I can give is to stay well away. At least (unlike your ethnicity..) you can choose to steer clear of it.

        1. Anonymous Coward
          Anonymous Coward

          Re: At what point do artificial images become "wrong"?

"Looking at girls my own age" seems perfectly normal to me......up to a point, of course. Boys are hardwired to lust after girls, and part of their upbringing should be to channel this into an acceptable social context. Part of that process is recognizing that a 'girl' is also a person - someone with thoughts, desires, ambitions and relationships, and not just some kind of organic sex doll. (They also are just as sexual as boys.) This should be straightforward enough, but it's complicated by the way that sex, guilt and shame have long been used as tools of social control. It also doesn't help that women have been regarded as 'property' throughout history, a necessary resource that has to be controlled lest bloodlines are corrupted and distribution of property becomes entangled. (If you go back a few hundred years you get the impression that most of us weren't regarded as people, we were livestock!) Anyway, regardless of how 'enlightened' we might think we are today, messing with relationships in the wrong context can get you into big trouble (as you unfortunately found out).

Incidentally, I've never understood why people get off on porn. What little I've seen of it is intensely boring. The notion of "Spiritual Opium" is a good one, because both opium and porn use the same basic brain chemistry reward mechanism and both can initially seem harmless. I think it's best avoided, especially as the best expressions of sexuality in culture are not explicit ones; they embody intense feelings without corrupting them with the everyday mechanics of mating.

          1. Anonymous Coward
            Anonymous Coward

            Re: At what point do artificial images become "wrong"?

            "sex, guilt and shame have long been used as tools of social control"*

            *by religious leaders

            "women have historically been regarded as 'property' throughout history"

            Only by the wealthy minority in society and lunatics.

            None of us were around in the past, but just thinking about it logically makes that seem like a daft sweeping statement. Sure, women from higher ranking families in the past have been treated like property...but I honestly can't see Dave A. Pleb, Bog Raker from Arse End, Middlesex in the 1600's ever finding value in treating his daughters as "possessions"...it makes no sense...firstly, he likely has quite a few daughters, and so does Alan T. Peasant next door and Percy O'Pauper over the road...women have likely never been commonly treated as property amongst the "common masses"...it may have happened at times, if you had an exceptionally attractive daughter for example and a rich bloke came sniffing around...but amongst commoners...seriously doubt it.

            If you believe that shit, you're fucking outside of your mind.

            Dave: Ayup Al, I'm puttin' daughter on't market end't week.

            Alan: Ooh eck, I'll make sure I put word round then. Get top dollar an that.

            Dave: That'd be fuckin' beltin' would that.

            Alan: Ey, 'appen I find a buyer? Will thee throw us a few sheckles an that?

            Dave: Oh aye, I'll sort thee out.

            Percy: Ayup cockers! Young Percy Jr turned 18 the'day, I'm looking to buy him summat to knob...any word?

            Dave: Ey, our Davette'll be on't market Friday, for 2 bob, I'll bring her round after rakin't bog and Percy can have a fuck.

            Percy: Fuckin' Bobby Dazzler!

            Absurd.

            I seriously doubt any reasonable, bog standard man, throughout history, has held his baby daughter in his arms looked at her and thought "Fucking hell, I'll get a few quid for this one!".

            Screw your head on.

          2. Anonymous Coward
            Anonymous Coward

            Re: At what point do artificial images become "wrong"?

            Boys are hardwired to lust after girls

            Not all of us.

            1. Anonymous Coward
              Anonymous Coward

              Re: At what point do artificial images become "wrong"?

              Hello Sailor!

      2. ShortLegs

        Re: At what point do artificial images become "wrong"?

It's been wrong for about a decade and a half, or does no one remember this little gem from the last Labour administration, which wanted to put 'thought crime' on the statute books:

        https://www.theregister.com/2009/03/18/thought_crime/

        Only 14 years ago....

    2. Filippo Silver badge

      Re: At what point do artificial images become "wrong"?

      >Its certainly something we're going to have to figure out, as we can't be far from AI generated images being indistinguishable from real, and not too many years from AI generated video being indistinguishable.

      And, a year or two after that, you'll be able to run those models locally. Good luck enforcing anything then, short of going full Big Brother and installing spyware everywhere.

      1. Mockup1974

        Re: At what point do artificial images become "wrong"?

        > going full Big Brother and installing spyware everywhere.

        Maybe that's the plan? Think of the children!

      2. katrinab Silver badge

        Re: At what point do artificial images become "wrong"?

You can run them locally now. It is doable on an RTX 3060 Ti, and probably on something slightly lower-specced than that.

    3. Charlie Clark Silver badge

      Re: At what point do artificial images become "wrong"?

      I think the AI issue is a side issue. The real crime is the illicit recording of the patients and what happens with that material.

Videos on YouTube of typical "Japanese" soft-porn images of sexy (if somewhat physically implausible) young women are already quite common.

    4. katrinab Silver badge

      Re: At what point do artificial images become "wrong"?

      If you take a photo of a real child, and digitally undress them, that is certainly on the wrong side of the line.

      1. DS999 Silver badge

        Re: At what point do artificial images become "wrong"?

        I would agree anything that uses the image of a real child, even just a real child's face, should definitely be against the law.

I was thinking more of what will soon become possible: an AI generating CSAM using pictures of children it creates that aren't based on any specific child (though it would presumably be trained on pictures of a million children found around the internet to learn what a "child" looks like).

        1. Anonymous Coward
          Anonymous Coward

          Are we suggesting that manipulating a clean, non-abusive image of a child is equivalent to abusing said child? If we are saying that, then are we also saying celebrity deepfakes are also equivalent to illegal sexual abuse? If not, why not? If so, then how does this logic differ from people who equate video games with violence? Finally, what happens when teenagers want to deepfake their crushes? Should they be locked up too? If not, why not and how does that differ in terms of real world harm from a non-teenager doing the exact same thing?

          Without evidence of any real, living human children being sexually abused in the making of deepfakes, it makes no sense to ban them. There’s an easier argument to be had for boycotting Nirvana than there is restricting AI for this purpose.

          1. katrinab Silver badge

> Are we also saying celebrity deepfakes are also equivalent to illegal sexual abuse?

            Yes, if they are of a sexual nature.

    5. ecofeco Silver badge

      Re: At what point do artificial images become "wrong"?

      "...we can't be far from AI generated images being indistinguishable from real..."

      About...oh... yesterday. And moving past it, quickly.

Once again, people think current laws are somehow magically suspended because "digital". Slander, libel, incitement to violence, fraud, discrimination, and underage porn are pretty obvious "points of wrong."

And AI, hell, just plain old sophisticated algorithms, have been used for those for a few years now.

    6. Anonymous Coward
      Anonymous Coward

      Re: At what point do artificial images become "wrong"?

      I'll add this. Years before AI generated imagery was possible, at least in the UK, possession of "pseudo" imagery of children of a sexual nature was made an offence.

At the time, mostly Photoshop etc. existed. I assume one reason for this was that it was not always possible to confirm that images were not, fully or partly, derived from actual real-life imagery. The way the law is framed, this (rightly or wrongly) simplifies the prosecution of any offences in terms of evidence.

      Any distinction between real-life and invented imagery will only get harder to prove, as AI develops.

      1. Anonymous Coward
        Anonymous Coward

        Re: At what point do artificial images become "wrong"?

        "Any distinction between real-life and invented imagery will only get harder to prove, as AI develops."

        Unless you generated it yourself and you still have the seed, prompt, model etc...in which case you can easily reproduce an AI generated image over and over and over....
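The reproducibility point is just determinism: pin the model, sampler, prompt and seed, and the output is a pure function of those inputs. A toy Python sketch of that idea, where `generate_image_bytes` is purely hypothetical, a seeded-PRNG stand-in for a real image generator:

```python
import hashlib
import random

def generate_image_bytes(prompt: str, seed: int, size: int = 64) -> bytes:
    # Toy stand-in for a real generator: the output is a pure function of
    # (prompt, seed), which is why keeping the seed, prompt and model lets
    # you reproduce the "same" image over and over.
    rng = random.Random(f"{prompt}:{seed}")
    return bytes(rng.randrange(256) for _ in range(size * size))

a = generate_image_bytes("a red bicycle", seed=42)
b = generate_image_bytes("a red bicycle", seed=42)
c = generate_image_bytes("a red bicycle", seed=43)

assert hashlib.sha256(a).digest() == hashlib.sha256(b).digest()  # same inputs, identical bytes
assert a != c  # a different seed gives a different image
```

Real generators add the caveat that the model weights, sampler and even hardware/driver versions must also match for bit-identical output.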

      2. DS999 Silver badge

        "Any distinction between real-life and invented imagery will only get harder to prove"

Should actually be easy to prove, if the software that generates it is programmed to embed something in the image proving it - and that's likely to become a law eventually.
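A minimal sketch of what such embedding could look like, assuming a naive least-significant-bit scheme over raw pixel bytes. The `MARK`, `embed_mark` and `read_mark` names are invented for illustration; real proposals use robust invisible watermarks or signed provenance metadata rather than anything this fragile:

```python
MARK = b"AIGEN"  # hypothetical provenance tag

def embed_mark(pixels: bytearray, mark: bytes = MARK) -> bytearray:
    # Write the tag's bits (MSB first) into the least-significant bit
    # of the first len(mark)*8 pixel bytes. Toy scheme only: trivially
    # destroyed by re-encoding, resizing or cropping.
    out = bytearray(pixels)
    for i, byte in enumerate(mark):
        for bit in range(8):
            idx = i * 8 + bit
            out[idx] = (out[idx] & 0xFE) | ((byte >> (7 - bit)) & 1)
    return out

def read_mark(pixels: bytes, length: int = len(MARK)) -> bytes:
    # Reassemble the tag from the LSBs, MSB first.
    out = bytearray()
    for i in range(length):
        byte = 0
        for bit in range(8):
            byte = (byte << 1) | (pixels[i * 8 + bit] & 1)
        out.append(byte)
    return bytes(out)

img = bytearray(range(256)) * 4   # stand-in for raw pixel data
marked = embed_mark(img)
assert read_mark(marked) == b"AIGEN"
```

The hard part in practice isn't embedding the mark but making it survive recompression and deliberate removal, which is why the robust-watermarking literature exists.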

    7. david1024

      Re: At what point do artificial images become "wrong"?

I think one place to look for answers is the gun industry in the US, and how it struggles to control what are essentially blocks of metal and CAM files as weapons. Well, maybe non-answers: they are still struggling with it and will continue to do so.

      Goes like this:

At what point is a kit that creates these treated the same as the material it creates? Is the AI, with its training material and the additional inputs needed, the CSAM? Or do you have to actually make the CSAM?

  2. cornetman Silver badge

    > I suppose some will argue that that will lead to abusing real children in some cases, and that's almost certain to be true, in some cases.

    I would be interested to see some evidence of the truth of that. Not saying I believe it or not. It just doesn't seem obviously true on its face though.

    I agree with the rest of your comment though.

    1. Anonymous Coward
      Anonymous Coward

I've seen a plea from psychiatrists arguing for the acceptance of artificial material to satisfy their patients' urges, because there's a lot of evidence that the vast majority of people with these urges never act upon them outside their fantasies*, and they argue there is evidence that for some of them the material would be enough. On the other hand, there's a growing body of evidence that the increased availability of porn over the last two decades has made some people look for ever more extreme versions of porn for the same level of arousal. I suspect that most people who end up watching snuff movies started out watching quite innocent porn.

      This is likely one of those subjects that we should leave to the experts (psychiatrists mostly) but that society will never accept to be dealt with by experts because it's too emotive.

* I personally suspect that paedophilia and ephebophilia are far more widespread than is commonly believed. I find porn categories where 27-year-old porn actresses have to pretend to have turned 18 last week, the fascination for school uniforms, the (step)daughter fascination, or Britney Spears being a child at 17 years and 11 months but a sex bomb the day after her birthday highly suspicious. That would suggest there is a large proportion of (mainly) men who are attracted to people well under 18 but don't go out raping children.

      1. Anonymous Coward
        Anonymous Coward

        Anon: Leave to the experts (psychiatrists mostly)

        > ..I suspect that most people who end up watching snuff movies started out watching quite innocent porn ..

You suspect correctly. And then they go on to act out what they see on-screen. Something to do with Pavlovian conditioning.

        Fred and Rosemary West

        Graham Dwyer guilty: Sadist architect stabbed Dublin woman Elaine O'Hara to death during sex

        > This is likely one of those subjects that we should leave to the experts (psychiatrists mostly) ..

I suspect most psychiatrists are screwier than their patients.

        1. mpi

          Re: Anon: Leave to the experts (psychiatrists mostly)

          > And then go on to act-out what they see on-screen.

Many years ago, it was claimed that FPS games can cause people to go on a shooting rampage. And of course there were cases of assholes who shot people who also happened to have been playing FPS games.

          Well, there also have been many cases of murderers who inhaled air prior to, during, and shortly after committing murder. That doesn't show that breathing causes murderous behaviour, it shows that post hoc ergo propter hoc still doesn't work.

          1. katrinab Silver badge
            Meh

            Re: Anon: Leave to the experts (psychiatrists mostly)

            But if I remember correctly, studies showed that FPS gamers were actually less likely to go on shooting sprees than the general population. Possibly because they spend all their time indoors playing computer games.

          2. Anonymous Coward
            Anonymous Coward

            Re: Anon: Leave to the experts (psychiatrists mostly)

            Indeed...and the rate of global warming increased proportionally with the decline of pirates in the Caribbean.

        2. Anonymous Coward
          Anonymous Coward

          Re: Anon: Leave to the experts (psychiatrists mostly)

          I don't know about "most" psychiatrists being screwier than their patients, but I had read a short newspaper article about my former girlfriend's psychiatrist having blown his brains out in his office.

      2. Charlie Clark Silver badge

        I've yet to see any credible evidence of this. What pornography does appear to affect is people's expectations of sex and their own and other people's bodies.

If feeding fantasy really did lead to escalation, we'd need to ban not just all erotica but also violent crime fiction (technically also pornography), which is predominantly written and consumed by women.

      3. imanidiot Silver badge

I think it's mostly ephebophilia (attraction to minors in the late stages of puberty, 15 to 18, i.e. after external sex indicators like widened hips and breasts have developed) that is quite widespread. Especially given the many, many stories of girls starting to get unwanted creepy attention from men as soon as they start exhibiting external signs of going through puberty. Hebephilia (attraction to the early stages of puberty, roughly 10 to 14) seems far less common, and paedophilia (prepubescents) less still.

        1. Jellied Eel Silver badge

          I think it's mostly ephebophilia (attraction to minors in the late stages of puberty, 15 to 18), ie external sex indicators like widened hips and formation of breasts have developed) that is quite widespread.

          Probably right. Something I've noticed online a lot is men talking about girls. To me, boys and girls are the juvenile versions of men & women, so it just seems wrong.

        2. Charlie Clark Silver badge

The porn industry does seem to think women are either in this group or 20 years older. But generally all signs of maturity, especially pubic hair, are rigorously removed. I'm not sure whether this is because the nubile look is the most popular; I guess it is, if producers insist on it. Though I personally find a bald pubis a turn-off precisely because it's so "youthful". But I'd consider this all part of the post-Chanel cult of youth that has been dominant for the last hundred years or so, rather than something the porn industry dreamed up.

          1. Jellied Eel Silver badge

            But I'd consider this all part of the post-chanel cult of youth that has been dominant for the last hundred years or so rather than something the porn industry dreamed up.

            As usual, I blame marketing. One reason I've heard is 'It's more hygienic'. I'm unconvinced given the invention of showers and other hygiene products. Or maybe people shave because they keep getting crabs. So I suspect it's competing marketing from companies that flog waxing and grooming products to either keep them trimmed, or removed entirely. In which case there may be additional hygiene concerns from things like razor burns and ingrowing hairs. I haven't done much study on this subject, but base it in part on the number of sponsorships for manscaped that I see. Personally, the pre-pubescent look isn't one I really like, but then neither is hair long enough to dread or plait.

            1. Ideasource

              Pubic hair is extremely important to the hygienics of people who walk around nude all the time.

              It keeps the bugs out of the genitals.

No, I'm not talking about crabs; I'm talking about mosquitoes and other flying insects, as well as creepy-crawlies.

With clothes it's just a sweat mop that has nowhere to go, except to feed odor-causing bacteria.

              So with the invention of clothing pubic hair became a disgusting pain in the ass.

Before clothing it was generally more hygienic, because no one wants bugs in their vagina, and a big hairy bush on a guy definitely helps to keep your balls from being bitten.

So showers help, but as soon as you put your clothes back on, within about 10 minutes you're sweating again.

Also, if your pubes are thick enough, you're never actually naked.

If all you can see is hair then you can't actually see any genitals, and any complaint is pure psychological dysfunction stemming from an overactive imagination on the part of the objecting observer.

              1. Jellied Eel Silver badge

                So while showers are there as soon as you put your clothes back on but then about 10 minutes you're sweating again.

                Err.. I'm not, but then I've never really been a fan of pvc, leather, latex or rubber undies unless I'm going diving. Most of those are not really intended for everyday wear anyway. Most problems could be avoided by choosing appropriate fabrics, but then we also live in a society where people have been conditioned to think work clothing like jeans need to be washed after every wear.

        3. cornetman Silver badge

          > ...ie external sex indicators like widened hips and formation of breasts have developed) that is quite widespread.

          Well I would suggest that probably two factors are at play here.

- Clearly, a woman who has developed sexual features starts to display the attributes that make her attractive to men.

- There *is* a general sexual preference among men for younger women, which seems fairly sensible considering that women are most fertile in their younger years.

          The dividing line between girls and women has both a legal and biological perspective. The legal one is fairly arbitrary although we clearly need one. It will fit some girls/women well, others less so.

We are seeing in modern times a general trend for women to wait much later in life to get shacked up, but it wasn't always so, and there were sound social and biological reasons to get it done as early as practical. Society has moved on, but I dare say our biology has not.

          1. Snowy Silver badge
            Holmes

            With all the hormones in the environment puberty is getting earlier.

      4. Anonymous Coward
        Anonymous Coward

Funnily enough, no REAL snuff movie has ever been proven to exist, outside the imagination of media and political types. Why kill your actors when there are plenty who can do a convincing acting job, and with wigs and make-up can quite possibly play lots of "victims"? In which case, how is it any different to Martyrs, The Human Centipede, Hostel and many, many other horror flicks?

    2. mpi

Reminds me of the "killer games" discussions of many moons past, when it was claimed, without any solid evidence of course, that FPS games and similar video game genres (the more radical simply claimed this applies to all video games) led to an increased likelihood of physical violence.

  3. Paul Crawford Silver badge

    David Tatum, 41, found guilty by a jury in May, has been sentenced to 40 years in prison and 30 years of supervised release

Just how long do they expect this guy to live and still be a risk to children?

    1. stiine Silver badge
      Meh

They did that so that they never have to stop monitoring him, as they would have to if he were a murderer who served 30 years followed by no supervised release. This sentence will also survive the termination, if that ever happens, of the sex offender registry.

      1. Dinanziame Silver badge
        Devil

        Just wait until they stopped supervising him at 111, and he immediately commits a new crime!!1!

    2. Jedit Silver badge
      Meh

      "Just how long do they expect this guy to live"

      He was in a position of authority and now he's a nonce in prison. You could have stopped with this portion of the question.

      But yeah, as someone said they're dotting the Is and crossing the Ts. In the US a prisoner is eligible for parole after serving a third of their sentence with a "maximum minimum" of ten years. Unless the judge made a specific ruling that parole should be denied for a longer period Tatum could thus be released aged 51, which is hardly too old to pose a risk of recidivism. So the second half of the sentence is them saying that while he may not be confined for the rest of his life, they're never going to stop watching him.

  4. This post has been deleted by its author

    1. imanidiot Silver badge

Problem is how to train an AI on what content to refuse to generate. Either you do it on the input side, which is very hard (people are far too inventive at creating workaround prompts), or you do it on the output side, but that would require possession of training material, i.e. CSAM. There's some argument in the text-to-image generation field that such a block, by detecting undesired output, would basically mean training an AI to generate CSAM and thus break the law. And it would make it easier for bad actors to get the desired output.

      1. stiine Silver badge

        Microsoft and the Library of Congress (in the US) have all of that stuff, along with the FBI, NSA, and most NGOs in the CSAM space...

    2. mpi

      If someone hits another person over the head with a hammer, the tool company isn't complicit.

  5. Plest Silver badge
    Facepalm

    Hands up if anyone reading this is the least bit surprised.

    AI is just a tool, and a camera or even pencil and paper can still be used for disgusting purposes in the wrong hands.

    1. Anonymous Coward
      Anonymous Coward

      Re: Hands up if anyone reading this is the least bit surprised.

      Not to mention, an imagination...

      1. Dinanziame Silver badge
        Big Brother

        Re: Hands up if anyone reading this is the least bit surprised.

        We shall have to do something about it (see icon).

      2. Charlie Clark Silver badge
        Coat

        Re: Hands up if anyone reading this is the least bit surprised.

        Arrest that man!

  6. General Turdgeson

Not saying what he did was right (it wasn't; in fact it was disgusting), but in many parts of the US he wouldn't get nearly the same amount of time if he had physically molested a child in the real world. And one can still serve as little as 7 years for 2nd-degree murder.

  7. Snowy Silver badge
    Megaphone

The AI

Was the least of his crimes; he actively created it by exploiting family and his patients. Lock him up and throw away the key!!

    1. Ideasource

      Re: The Ai

      It's more humane to execute a swift death penalty than to lock them up forever.

      Gaining comfort through the knowledge that others are suffering is an evil trait that is not to be encouraged.

If you have to solve a physical problem through hard action, so be it.

      But capitalizing on human suffering for emotional benefit is the mark of supervillains.

      We ought not to be training general society to such ill methodologies of soothing.

  8. Anonymous Coward
    Anonymous Coward

    Was the crime actually that this was found out, and not what is being called the crime?

If nobody had found out about this (other than himself), would anybody really have been hurt by it (except himself)?

Anyway, how did they find out about this?

What are the 'victims' supposed to do for 'therapy' now? Go to another 'child psychiatrist'?
