Users prompt Elon Musk's Grok AI chatbot to remove clothes in photos then 'apologize' for it

Grok, the AI chatbot owned and operated by Elon Musk's xAI, is facing a firestorm of outrage after users prompted it to create images of naked and scantily clad people from real photographs, some of whom are underage. Grok is a chatbot originally created by xAI, Elon Musk's artificial intelligence startup. Earlier this year, …

  1. DrewPH
    Mushroom

    "I saw a couple of fratty looking dudes demonstrating something on their phones with big grins on their faces. I asked them what they were doing, and one of them took a photo of me, then used AI to generate images that appeared to put me in compromising positions – one had me kissing an imaginary woman, another had me flanked by a couple of scantily clad strippers."

    Did you sue the arse off them? Because if you didn't, you're part of the problem.

    1. IGotOut Silver badge

      Did you even think about that post for a nanosecond?

      1. Spend thousands suing them for......

      2. What are you ACTUALLY going to sue them over?

    2. Dan 55 Silver badge
      Facepalm

      Great solution you've come up with, absolutely perfect for school bullying and people being harassed online. Let's give it a name, how does "victim blaming" sound?

  2. refitman

    Payment processors

    I assume all the major payment processors (Visa, MasterCard etc) will be suspending their services ASAP? The same as they did with legal adult sites?

    I'm also sure all the authorities that constantly bang on about "think of the children" will be lining up prosecutions for the executives that let this happen?

  3. Anonymous Coward
    Anonymous Coward

    Stay away from dark, dingy and rubbish-strewn back alleyways

    Leave them to the rats.

  4. demon driver

    Consequences

    "Whatever happens, society will have to adapt to the consequences."

    In a system called democracy, society should not just "have to adapt" to any minority-made (i.e. billionaire-made) life- and world-changing developments...

    1. Yorick Hunt Silver badge
      Holmes

      Re: Consequences

      "a system called democracy" != "democracy"

      1. Anonymous Coward
        Anonymous Coward

        Re: Consequences ... apparently ONLY happen to OTHER people !!!

        ""a system called democracy" != "democracy""

        This fact has been the source of some confusion in the US of A for many many years !!!

        Even the POTUS has problems with it !!!

        :)

        1. Not Yb Silver badge

          Re: Consequences ... apparently ONLY happen to OTHER people !!!

          Surprised no one has piped up with "it's a republic, not a democracy" yet.

          1. Jou (Mxyzptlk) Silver badge

            Re: Consequences ... apparently ONLY happen to OTHER people !!!

            'cause it is an Empire, not a republic.

            1. Anonymous Coward
              Anonymous Coward

              Re: Consequences ... apparently ONLY happen to OTHER people !!!

              Anyone thinking that's just a reference to Star Wars, is probably not paying enough attention.

              1. The Indomitable Gall

                Re: Consequences ... apparently ONLY happen to OTHER people !!!

                Unfortunately this real life Emperor doesn't have the lightning coming out of his fingers to give him away as the baddie....

                1. Rich 11

                  Re: Consequences ... apparently ONLY happen to OTHER people !!!

                  Are we sure that that wasn't one of his NFT scam images? Or maybe he was talked out of publishing it by Stephen Miller. "Forgive me, My Leader, but although the lightning bolts truly signify your magnificence and innate potency they might be misconstrued by the little people. The glorious day has not yet come in which we can fully announce your natural suzerainty over humanity to the world."

              2. Jou (Mxyzptlk) Silver badge

                Re: Consequences ... apparently ONLY happen to OTHER people !!!

                Well, I was thinking about adding "which means returning to its English roots from 15th and 16th century and colonize the unknown world outside its borders"... Canada already got a taste of that idea last year.

            2. T. F. M. Reader

              Re: Consequences ... apparently ONLY happen to OTHER people !!!

              'cause it is an Empire

              Any mention of "Empire" in the context should be followed by the "no clothes" bit. It just fits.

              1. Jou (Mxyzptlk) Silver badge

                Re: Consequences ... apparently ONLY happen to OTHER people !!!

                Oh, his dementia is already this far? **checking** Oh, not yet, but it wouldn't surprise me...

                1. Anonymous Coward
                  Anonymous Coward

                  Re: Consequences ... apparently ONLY happen to OTHER people !!!

                  With his recent revelations of how much aspirin he's knocking back each day, and the well-known risks associated with bleeds - it may not be dementia but simply self-inflicted brain damage.

                  1. ICL1900-G3 Silver badge

                    Re: Consequences ... apparently ONLY happen to OTHER people !!!

                    As Fats Domino once opined:

                    Ain't that a shame...

          2. Mostly Irrelevant

            Re: Consequences ... apparently ONLY happen to OTHER people !!!

            Constitutional republic, that's what they always say. And of course the first line of the description of a constitutional republic says that it's a form of democracy. Even Wikipedia still says this after years of vandalism by people who blindly repeat dogma.

    2. Scotech

      Re: Consequences

      The big players aren't the root problem. Some are already trying to self-regulate, per the article. Others (e.g. Grok) will be forcibly regulated, if they don't bring themselves in line first. But for every commercial LLM out there, there are several hundred open source ones, and the same goes for image-gen models, and soon, for multi-frame image-gen models (video models) too. Anyone with the right consumer-grade hardware can easily use these models. And that's before we look at how trivial it's becoming to tweak the outputs, or even train whole new ones. The core problem is the random guy (or state agency) somewhere who very carefully uses a number of models running locally to create highly believable deepfakes across multiple media that are strewn carefully around the web to feed some conspiracy narrative serving some narrow interest or antidemocratic movement. AI amplifies the ability to produce and distribute misinformation. The genie is out of the bottle on this now, and there's no way to legislate the problem away.

      This is the reality that society needs to adapt to in order for democracy to survive - we as a society need to learn to always be sceptical of everything we consume second-hand, take better care of what sources we choose to believe, and to strike the right balance between trust in domain authority and distrust in power. Paradoxically, the LLM boom has actually made good journalism more relevant than ever before, right as traditional media is collapsing before our eyes, a process now accelerated by that same boom. It's probably orders of magnitude more important that we legislate to protect good journalism from predatory practices by tech businesses than legislating to ban naughty uses of AI tech at this stage, but as usual, legislators are about a decade or more behind the curve when it comes to all things tech.

      1. Neil Barnes Silver badge
        Flame

        Re: Consequences

        We have arrived at a point whereby unless we are in the physical presence of a second party, we cannot assume that they performed any action attributed to them. Period.

        We can speak to 'someone' on the phone, but without external mechanisms cannot prove that it is the person with whom we expected to speak.

        We can 'see' them perform actions, but don't know whether or not they were performed, or whether the actors were those portrayed.

        We can read prose without any immediate clue as to whether it was written by the claimed author, or represents his views, or is a distortion thereof.

        It's all very depressing.

        1. Persona Silver badge

          Re: Consequences

          It's an old problem. There was a time centuries ago when the king's orders were written on parchment and delivered to far-flung places to be implemented. Enterprising types realised they could just write orders themselves, so this was countered with the King's Seal, a metal stamp used by the king to imprint sealing wax to "prove" it was genuinely from him. What could possibly go wrong....

          1. Doctor Syntax Silver badge

            Re: Consequences

            One means of stopping things going wrong was to record the decisions in the Patent Rolls or Close Rolls. Archive.org will provide you with digitised copies of some of the publications of these made in the late C19th/early C20th (search for calendars and the name and regnal number of the king).

            If you produced a forged charter with a forged seal on it something very nasty would happen to you. That's what could go wrong.

            1. Rich 11

              Re: Consequences

              Some of the medieval punishments for forgery were quite imaginative. Even a function as basic as the authorised minting of coins was so strewn with legal consequences for error, let alone malice, that it's a surprise anyone dared take on the job.

      2. brainwrong
        Stop

        Re: Consequences

        "The big players aren't the root problem."

        People are the root problem. Ordinary people. They use these things for their own selfish reasons without any thought as to the consequences to the greater good. The big players are simply catering to the market that they see.

        I don't expect that to be a popular opinion.

        1. thosrtanner

          Re: Consequences

          People have been the problem since for ever. The Roman emperors found out that bread and circuses kept the masses happy and they (or at least the ruling classes) could do more or less what they wanted. That hasn't changed much in the last 2000 years. The bread is now fast food and cheap booze and the circuses are provided by the media/social media.

    3. Anonymous Coward
      Anonymous Coward

      Re: Consequences

      Would you like me to make a list of the things you have to stop using?

    4. jdiebdhidbsusbvwbsidnsoskebid Silver badge

      Re: Consequences

      "Whatever happens, society will have to adapt to the consequences."

      "adapt" in these cases seems to mean "put up with". If this is illegal as suggested in the article, why aren't the police knocking on the door of X and arresting them? Let's face it, the evidence is out there and irrefutable.

      Seems that these days, corporations are above the law if their computer systems (that they made and operate) do things that are illegal. Accountability is disappearing.

      1. aks

        Re: Consequences

        It's a tool. Are you going to ban paintbrushes as they enable painters to create erotic pictures? Same applies to other tools, such as Photoshop.

        AI is just another tool.

        1. retiredFool

          Re: Consequences

          You really can't be comparing the two. I am pretty sure I could give you all the paint brushes, paint and canvases you want, and you would not come up with a credible image. I know I couldn't. But I suspect if either of us had access to Grok and took a photo of a woman, we - and probably half of humanity or more - could come up with very credible images.

          1. Persona Silver badge

            Re: Consequences

            They are comparable as you are just arguing about the relative quality of the tools. Both need the intent of the person using them, one just needs more skill.

        2. JamesTGrant Silver badge

          Re: Consequences

          Oh - a gun argument! This is why we arrest shooters and not gun owners.

          1. Jou (Mxyzptlk) Silver badge

            Re: Consequences

            Oh, it wasn't until you made it one. But while we're there: why not adopt the Swiss model of gun and ammo control in the USA? Every US gun lover personally knows at least ONE person of whom he clearly thinks "should not own a gun" - but he/she has one. Or the Chris Rock variant: every bullet costs at least $5,000.

          2. Naselus

            Re: Consequences

            "This is why we arrest shooters and not gun owners."

            Which, in turn, is why the US experiences more mass shootings than days per year.

        3. jdiebdhidbsusbvwbsidnsoskebid Silver badge

          Re: Consequences

          Yes it's a tool, but a very different type of tool with lots of autonomy that far exceeds that of a paintbrush or even photoshop.

          The paintbrush analogy would work only if it was a self-painting brush that responded to the operator's vague directions. Same for the Photoshop analogy - if Photoshop had a "make this person nude and post the image on the internet" single click function.

    5. Anonymous Coward
      Anonymous Coward

      Re: Consequences

      Right, we should never have had to adapt to anything, and just stay exactly the same forever unless the majority wants the change. *sigh*

  5. Baird34

    Morals

    The guardrails are put in place by the creators. So Grok having no morals says a lot about the ultimate boss. After all, it is the marketplace of ideas so long as they're the 'right' ideas.

    1. Doctor Syntax Silver badge

      Re: Morals

      I think if people started posting nudified images of Musk the guard rails would be applied PDQ. Or maybe they're already there but only apply to images of him.

      1. X5-332960073452
        Stop

        Re: Morals

        I need some REALLY strong mind bleach now!

      2. Anonymous Coward
        Anonymous Coward

        Re: Morals

        They already exist

  6. Anonymous Coward
    Anonymous Coward

    Golden opportunity

    Not of the shower variety.

    People could use this as an opportunity to "normalize" having nude pictures of _everyone_ available. Someone posted a nude picture of you? So what, someone posts nude pictures of everyone. Oh, that was a *real* nude picture of you, with your second boyfriend, that you don't want your first boyfriend to see? Really? It's not AI?

    Instead everyone is taking the radical opposite approach - Their virtual tits are THEIRS to hide, and no one else can see the generated data that the virtual owner doesn't know about and which doesn't represent the virtual owner!! Oh that picture with your second boyfriend, given we know that it's so-much-harder to do AI nudes? Well, I, I, uh, ....

    Still waiting for the day that a human body is not an offensive, unacceptable thing. Compare the reactions to these events to the reactions of "the populace" to the Janet Jackson superbowl "event" with Justin Timberlake.

    1. Anonymous Coward
      Anonymous Coward

      Re: Golden opportunity

      Too many people think that naked photos of even consenting adults are somehow inherently perverse, unless there's an urn or a plinth, or maybe a cherub in the corner..

      1. Jou (Mxyzptlk) Silver badge

        Re: Golden opportunity

        Hold it, that is the US of A. Rest of America, Europe, Russia and so on are more relaxed. Even India is getting more relaxed, less "accidentally wearing white singing in the rain" and more actually showing.

        Keep in mind, they all have less or no problems with nakedness, but there is a difference when it comes to sexual activity - even in France.

      2. Bebu sa Ware Silver badge
        Happy

        Re: Golden opportunity

        "naked photos of even consenting adults are somehow inherently perverse, unless there's an urn or a plinth, or maybe a cherub in the corner.."

        The noted arbiter of Ankh·Morpork taste (sgt Fred Colon) put it thus:

        "Nude women are only Art if there's an urn in it", said Fred Colon. This sounded a bit weak even to him, so he added, "or a plinth. Both is best, o'course. It's a secret sign, see, that they put in to say that it's Art and okay to look at." §

        † Terry Pratchett, Thud! … § The rest of the dialogue is too good to omit.

        • [Nobby] "What about a potted plant?"
        • [Colon] "That's okay if it's in an urn."
        • "What about if it's not got an urn or a plinth or a potted plant?" said Nobby.
        • "Have you one in mind, Nobby?" said Colon suspiciously.

        1. Anonymous Coward
          Anonymous Coward

          Re: Golden opportunity

          Well spotted, it's almost like I knew about that dialog when posting earlier... (thanks though, not everyone can spot a Discworld reference via semaphore)

    2. Bebu sa Ware Silver badge
      Coat

      Re: Golden opportunity

      "the day that a human body is not an offensive, unacceptable thing." — little chance once the AI overlords have the upper hand.

      In any case it's utterly impossible in the US, as that excuse for a nation ceaselessly strives for a global monopoly of hypocrisy. [Hypocracy - rule by hypocrisy ? ]

      While anything even vaguely touching on Musk's interests ought to be condemned out of hand, I am a bit puzzled about the details. I don't have any actual experience of these AI "disrobing apps", but I would be surprised if these apps produce an accurate representation of the subject's body; at least from a single image.

      Thanks to the "miracle" of the internet† the availability of unclad images illustrates the enormous variety of human anatomy. An interesting experiment might be to take the initially clothed images from the likes of Prontube; have Grok et al. disrobe it; compare the result with an unclad and presumably later frame of the subject.

      I was thinking that an unclothed child would be of little interest to all except the paedophile minority until I realised that "child" in the US means anyone under 18 years without distinction between the youngest and eldest.

      The production, possession or dissemination of any depiction of child exploitation whether computer generated, hand crafted or photographic is a serious offence in most jurisdictions. Posting to social media of even inaccurate images of unclothed children would likely constitute such an offence.

      Ultimately taking an image of another person and without their consent modifying it, certainly demonstrates a lamentable lack of respect; sharing or publishing such a modified image is fairly clearly an invasion of the subject's privacy. The question might be posed: "How would you feel if your 14 year old daughter's image were so treated ?"

      † taking the internet's "killer app" generically.

      1. Long John Silver Silver badge
        Pirate

        Re: Golden opportunity

        Your general concern is legitimate. However, one of your points requires broader understanding by people called upon to enact law or to administer it.

        A 'nudified' photograph is nothing more than an artist's impression; the artist being an 'AI' embodying information about anatomy. A pencil sketch by a human artist would fall into the same category. Thus, a simple construct of a naked real person is nothing to fulminate over. It's not a candid camera picture.

        When the naked construct is shown with lascivious connotations (e.g. its constructed pose altered from the original with lewd intent) it could be deemed bad taste and, maybe, as impugning its subject. Portrayal of sexual activity, especially with another identifiable person, ups the ante considerably.

        One imagines people with oversight of prosecutorial services spending happy hours making sets of hypothetical guidance images to assist the judiciary in determining the gravity of an offence.

        1. Doctor Syntax Silver badge

          Re: Golden opportunity

          The objection arises when the image purports to be of someone who has not consented. Purports in that the face is recognisably that person.

          Bebu rightfully reminds us of the Golden Rule.

      2. Doctor Syntax Silver badge

        Re: Golden opportunity

        "Hypocracy - rule by hypocrisy ?"

        Excellent suggestion. It just about describes the US at present.

    3. captain veg Silver badge

      utter tit

      Consent.

      -A.

  7. Richard 12 Silver badge

    Society will have to adapt to the consequences

    Society will "adapt" by imposing consequences on those making the software available. Initially fines, later prison time for those CEOs.

    There are already civil and criminal laws against the creation, distribution, and/or possession of some of these images in some jurisdictions, and in some cases the only defences are "as part of a criminal investigation" or "it didn't happen".

    The UK will be making offering this kind of software an explicit criminal offence in the next few months. It'll be interesting to see the text and whether prosecutions ensue.

    1. Burgha2

      Your last sentence is the important one, whether there end up being any prosecutions. You can pass all the laws you like, but if you don't enforce them, well...

      1. Long John Silver Silver badge
        Pirate

        Your remark reminds me of what happened in the UK when elaborate legislation to curb online child pornography was introduced. The legislators' good intent turned into a nightmare for the people (police and prosecutors) designated to enforce it.

        Resources for preventing/detecting lawbreaking are finite and stretched. The concept of 'opportunity cost' eludes legislators, whose attention concentrates on one thing at a time, without careful thought over how that fits the bigger picture. Enforcement authorities were given little guidance over how to prioritise attempts to curb child abuse. Images long in circulation on the Internet may be nasty, but are an offshoot of the deeper, and much more serious matter, of abuse taking place now. Perhaps, police forces were tempted by easy pickings from Internet surveillance instead of the more costly and skilled investigation of continuing abuse within UK jurisdiction.

        Anyway, when the legislation was enacted, we soon heard of police officers turning up in museums and galleries in response to complaints about images which hitherto had been acceptable and deemed artistic. The same level of inanity as when, in response to legislation concerning 'hate', a pub was raided by multiple officers because the landlord had golliwogs on display. Another amusing incident was when the TV presenter/personality Anne Robinson was placed under investigation - apparently for months - by North Wales Police because somebody had complained about Robinson having quipped "What use are the Welsh?".

      2. Doctor Syntax Silver badge

        Quite often existing laws, if enforced, would be applicable. I suppose nobody in law enforcement wants the task of running it up to the Supreme Court which might be necessary. Thus we end up with a welter of new legislation, dealing with narrower and narrower offences.

    2. John Brown (no body) Silver badge

      There are already civil and criminal laws against the creation, distribution, and/or possession of some of these images in some jurisdictions, and in some cases the only defences are "as part of a criminal investigation" or "it didn't happen".

      An interesting aside: If Grok is prompted to create the image, are both the person instructing it and Grok co-conspirators[*]? Grok originated the image and is both creator and possessor. The person instructing Grok has it delivered to their phone/laptop and so is also a possessor. But because the data is sent in raw form and only assembled into an image when it arrives on their device, did they also create the image? ISTR in UK law the person converting the data into the image can be deemed to be "creating" the image, although I'm not aware this has been tested in court yet.

      * Conspiracy is a good catch-all with excellent jury prospects of conviction because "conspiracy" sounds more scary and only requires proof that two or more parties made plans to commit a crime. Asking Grok to commit an illegal act and then Grok carrying it out and delivering the goods sounds like conspiracy and the act to me ;-)

      1. Dan 55 Silver badge

        Grok doesn't have agency so can't be part of a conspiracy. The software is a service hosted online by xAI. Such software is not currently illegal in the UK although the government want to make it illegal. Looking forward to Musk spending five years in prison.

        1. retiredFool

          Jail?

          Musk's software has already killed people. FSD. He is not going to prison. He has very good lawyers. Very very good lawyers.

      2. Anonymous Coward
        Anonymous Coward

        xAI's Grok is (I can't believe I'm making this point after this article) NOT a person. It is incapable of conspiring with anyone.

        1. John Brown (no body) Silver badge

          True, but the person(s) or entit(y|ies) in control of grok could be culpable. Guns don't (generally) kill people, the people firing them do ;-)

      3. Anonymous Coward
        Anonymous Coward

        "ISTR in UK law the person converting the data into the image can be deemed "creating" the image although I'm not aware this has been tested in court yet."

        In UK law you don't have to actually take an indecent image of a minor to commit an offence; the act of saving an indecent image of a minor to a hard drive is classed under the offence of "making an indecent image", and that includes if you view an online image in a browser and your computer automatically caches it to your hard drive - you have still committed the offence. Indecent images are not just actual photographs but also 'pseudo' images, including AI-generated ones of a person who is or appears to be under the age of 18.

  8. Bebu sa Ware Silver badge

    Lateral Thinking…

    A putative application for the fashion industry (as in haute couture) that allowed a designer to enrobe anatomically realistic models with their latest creations before scissor meets textile would, when run backwards in time, arguably be exactly the kind of application that might be banned.

    From what I understand that industry is fairly comfortable with the naked (even blasé about it) - probably a necessity given the need to rapidly change between creations during showings.

  9. Bebu sa Ware Silver badge
    Coat

    After twelve months of …

    weekly sessions of dissection in human Anatomy you do come to the conclusion that the only truly naughty part is above the neck.

    I still gag at the merest reek of formaldehyde (formalin). The (lab) coat is obvious.

    1. Anonymous Coward
      Anonymous Coward

      Re: After twelve months of …

      15. The naughty bits of Reginald Maudling

  10. Long John Silver Silver badge
    Pirate

    Get used to reality

    From adolescence onwards, some males, and some females, choose to display their 'assets' to onlookers by wearing clothing, and assuming postures, intended to provoke imagination of what lies beneath: a human variant of animal pre-mating displays. This behaviour may provoke accusations of lewdness from people afflicted either by good taste or by prudery. It occasions congratulation when the person of interest makes a good showing as a fashion model or in a beauty contest.

    Hypocrisy is, and always was, a feature of modern societies. Legislators enjoy opportunities to display their moral gravitas to the section of their electorate of prudish tendency; they will prate about degeneracy and nebulous harms to immature or vulnerable people and strive to be seen dotting i's or crossing t's to construct complicated meshes of legislation; all too often, their sentiments don't gel with their 'off duty' behaviour. That's called 'politics'.

    Technical aids to a feverish imagination, e.g. 'nudifying', are here to stay. No amount of legislation and 'opportunity cost'-inducing policing will put genii back in their bottles. As long as the Internet persists, open source 'AI' models, community-created variants, and modifiers such as LoRAs will abound and flow freely. Although most people may rely on 'apps' for satisfying their forbidden desires, the underlying image manipulation is so simple that home-brew recipes will flourish.

    Thus, what imaginative people do in private is merely an extension of their inbuilt visual processing. Outlawing it would be as useless as a moral injunction against picking one's nose. The only sensible response is the use of criminal and civil remedies when 'AI' technology is used to harass individuals. Even so, some people appear to beg for being lampooned through deep-fakery: arrogant politicians, 'celebrities', 'influencers', and others among the self-proclaimed 'great and good'.

    1. LionelB Silver badge

      Re: Get used to reality

      > From adolescence onwards, some males, and some females, choose to display their 'assets' to onlookers by wearing clothing, and assuming postures, intended to provoke imagination of what lies beneath...

      With relevance to the article, I've highlighted the crucial word for you.

      > The only sensible response is the use of criminal and civil remedies when 'AI' technology is used to harass individuals.

      I'm sure we can all agree there, but is there any case for nudifying an individual without their consent which does not qualify as harassment (in the colloquial if not legal sense)?

      1. Long John Silver Silver badge
        Pirate

        Re: Get used to reality

        It's harassment only if the image is published, e.g. posted in a forum.

        1. Dan 55 Silver badge
          WTF?

          Re: Get used to reality

          Because if you're found with a terabyte of child porn, you're usually let off free, right?

        2. Sp1z

          Re: Get used to reality

          "Recently, some X users noticed if they took a photograph that had been posted on the service and prompted Grok to remove the clothing from that photo, it would do so and POST THE RESULTS PUBLICALLY ON X."

          Try again.

    2. refitman

      Re: Get used to reality

      Sounds like some top-tier nonce-apologia to me. These 'tools' have been used to sexualise children and your response is "best get used to it *shrug*".

    3. nobody who matters Silver badge

      Re: Get used to reality

      "As long as the Internet persists......."

      And how long do you think it will persist?

  11. that one in the corner Silver badge

    Asking the impossible

    > Grok's human creators appear to have failed to prevent it from creating posts that remove the clothing from real people in real photos when asked to do so.

    More to the point:

    Grok's human creators have absolutely no idea how to ... prevent it from creating posts that remove the clothing from real people in real photos when asked to do so. Or any other unwanted outputs.

    The underlying models are completely opaque. Nobody, but nobody, has any knowledge of what fiddling with any given nadan of the bejillions of so-called parameters will do (without just trying it and hoping it'll get involved in processing a test prompt). Let alone inverting that and calculating the *complete* set of changes that'll *reliably* ensure a *desired* change in the results[1]. And if they *do* luck upon an observable result[1, again] the next bit of training will jumble it all up again.

    So they can't honestly claim to be modifying the contents of the LLM to remove the ability to do the Bad Thing[2].

    Leaving them with trying to bolt on external filters - but made of what? What do they have that could possibly do that job? Something that can parse and comprehend the natural language prompt? But if they had that, why prat about with LLMs in the first place?

    The whole "we are adding guardrails" spiel just sounds like wishful thinking (if not outright delusional thinking or, say it softly, simple fraud).

    [1] there was an article on El Reg a while back (sorry, ref not to hand) where one research group claimed to have found a neuron where "the concept of Paris" (IIRC) was stored and changed it so that the model inserted "London" instead (at least, for their test prompts). But that was no more than finding where one string token's id number had been stored and swapping it for another, like getting the id wrong in a case statement that prints a readable value for an enum. AND there wasn't anything presented there to definitively prove that their model didn't contain another activation path leading to another instance of that id, which hadn't been changed, and so still got converted back to the string "Paris".

    [2] even if they tried an approach like an ice-pick lobotomy (e.g. a less nonsensical version of "feed in a prompt and, if it generates a Naughty Result, look for the 'parameters' that were involved and set them all to zero") they'd need to send in every single possible variation of that prompt; good luck with that.

  12. Tron Silver badge

    I would be more impressed...

    ...if it could send me a couple of strippers and bill them to Musk.

    All AI is doing is automating photoshop for lazy people.

  13. John Brown (no body) Silver badge

    CAUTION: Mind bleach required if you continue reading.

    I wonder how long it would take to fix this issue if many, many people asked Grok to create "bad" images of Musk and his family/friends? He's a "free speech absolutist", of course, so this may not bother him. Like publishing the location and flight plans of his aircraft. Oh, wait....

    1. Fruit and Nutcase Silver badge

      Aversion Therapy

      What if, when requested to "uncloth" an image of any person, Grok complies, but the human form and face in the resultant image is that of Musk. An image of a female adult "unclothed" results in an "unclothed" image with the "naughty bits" of Musk and his face. For an image of a younger person, it will be the same - the body and face of adult Musk. That should hopefully cause the requester to require mind bleach and stop seeking to uncloth images of whatever type of people he was looking for

  14. Anonymous Coward
    Anonymous Coward

    What?

    Recently, some X users noticed if they took a photograph that had been posted on the service and prompted Grok to remove the clothing from that photo, it would do so and post the results publicly on X. This may have violated various laws, such as the TAKE IT DOWN Act passed by the US Congress in April and signed into law in May, which "criminalizes the nonconsensual publication of intimate images."

    Yeah, except these were not real photos, so how could it violate said laws? If I drew a picture of a naked celebrity would that be considered illegal as well?

    This insane faux outrage, with an obvious political bias, is getting extremely tiresome. It's also sending us down roads once mocked in satire.

    Any of you lot out there that take this BS even remotely seriously should watch the following from 16:54 onwards: https://www.youtube.com/watch?v=eC7gH91Aaoo

    1. Dinanziame Silver badge
      Alert

      Re: What?

      If I drew a picture of a naked celebrity would that be considered illegal as well?

      As far as I understand, pictures of naked people generally become illegal if they look realistic. This is as true for famous people as for children, which is why so many Japanese comics depicting underage sex are legal (the UK excepted), just like South Park episodes showing Trump.

      1. Anonymous Coward
        Anonymous Coward

        Re: What?

        As far as I understand, pictures of naked people generally become illegal if they look realistic.

        Right..... So most of the galleries on the planet are hosting illegal material then?

        And to think, this post was upvoted 4 times. I guess that's kinda the level we're dealing with round these parts nowadays.

        1. Dinanziame Silver badge
          Boffin

          Re: What?

          Please try to understand posts in their context. It will help you in your social life as well.

    2. Anonymous Coward
      Anonymous Coward

      Re: What?

      Thing is though, others, like Meta, have sought to prevent nudification abuse and the related revenge-pornification of their platforms, in accordance with the TAKE IT DOWN Act, acceptable behaviour, proper taste, and social citizenship ... xAI, on the other hand, is like Karoline Leavitt to chipmunk's anthem to wanton male hormones here: "boys will be boys", I guess ... Lame!

      And, frankly, that Brass Eye episode just plain sucked, bad.

    3. Doctor Syntax Silver badge

      Re: What?

      "If I drew a picture of a naked celebrity would that be considered illegal as well?"

      If you have the artistic talent you could try your luck. To do it properly, of course, you wouldn't do it as an A/C. You might learn something about the difference between criminal and civil law.

  15. Snittycat

    Darned stochastic parrots!

    Just like animal parrots who are taught expletives and bawdy song lyrics, LLMs like Grok never forget.

    Unfortunately Grok posts are available to everyone, and what seemed funny in 2026 might not be so in 2028, 2038, etc.

    Besides, revenge in kind is always a possibility...

  16. ConstantCustard

    Guns don’t kill people

    Just as in the truthful title, the problem is not the software, the problem is the users and their lack of morals. "It's just fun" is not, in my mind, an acceptable excuse for placing people in fake compromising situations. The law needs to be enforced, and social media should do better at detecting and reporting criminal imagery.

    1. Dan 55 Silver badge
      Devil

      Re: Guns don’t kill people

      Flooding the country with guns and educating users not to use them obviously works (see the US for an example), so let's flood the Internet with kiddie pr0n-as-a-Service and trust the users not to use it; it's surely the best way of tackling this.

      Making Grok illegal until this functionality is removed is crazy talk. We just can't deprive a morally bankrupt billionaire of his toys.

    2. Doctor Syntax Silver badge

      Re: Guns don’t kill people

      US commentards have no idea how stupid that sort of argument sounds to those from civilised countries.

  17. david 12 Silver badge

    remove the clothing from that photo, it would do so

    A chatbot can invent a naked body image and stitch the hallucination into a photo.

    In an article which notes that chatbots don't have agency, let's also have at least a nod to the fact that chatbots can't remove clothing, even in a photo.

  18. Anonymous Coward
    Anonymous Coward

    Don’t feed the trolls

    Grok!

  19. Doctor Syntax Silver badge

    "a tweet or whatever you call it now"

    Can we agree on Xcretion?

  20. Anonymous Coward
    Anonymous Coward

    I'm slightly surprised at this.

    Picture this. A bunch of people in a pub, already a few beers in, when one takes a photo and gets Grok to turn it into a video where an attractive woman in a bikini approaches one of the blokes and sits on his knee. Everyone wants in as it all gets a bit Benny Hill. Then one of the women present asks if she could have a video of an attractive man in Speedos approach her and give her a kiss. Grok outright refuses. So Grok does have some guard rails.

  21. Anonymous Coward
    Anonymous Coward

    Convergent agency

    The agency of an LLM is approaching that of humans: a hallucinatory perception that straddles the line between convincingly believable and convincingly fake. Eventually, we'll have little choice but to agree that the agency some LLMs express appears to be somehow comparable to that of humans but still somehow different.

  22. Anonymous Coward
    Anonymous Coward

    Grok

    Your fascism may vary.

  23. ferdinanx737

    It’s clear that Grok has no consciousness or free will. Everything it does is a result of human programming and the data it receives. The issue lies with humans, not AI.

  24. glennsills@gmail.com

    Strong identification for Chat Bot users.

    Chat bot users should be strongly identified so that when the user instructs the bot to do something it shouldn't, everyone knows who that person is. Images especially should come with a username watermark.

  25. Ken G Silver badge
    Facepalm

    "I don't know where any of this is going."

    Neither do I, but then I'm not writing a feature on it for a technology website.

    Maybe try and get a handle on it and then write a new article telling us?

  26. Omnipresent Silver badge

    well

    Grog (xtwitter) is a Russian-ruled Chinese "influencer" hotspot with a dash of North Korean seasoning and a side of bro cult, so I don't know what else anyone would expect?
