It took Taylor Swift deepfake nudes to focus Uncle Sam, Microsoft on AI safety

Fake sexually explicit AI-generated viral images of pop royalty Taylor Swift have struck a nerve, leading fans, Microsoft's boss, and even the White House to call for immediate action to tackle deepfakes. The X-rated images, to which The Register won't link, circulated online over the weekend and were published on Twitter, …

  1. Catkin Silver badge

    Just AI?

    Shouldn't that be all involuntary porn, regardless of how it's made?

    1. Zibob Silver badge

      Re: Just AI?

      Only when it becomes easy, apparently.

    2. lglethal Silver badge
      Facepalm

      Re: Just AI?

      Only when it affects the rich and famous, of course...

    3. mostly average

      Re: Just AI?

      I recall a similar moral panic a few years ago involving celebrities and manually photoshopped pornography. That and there's plenty of humans capable of creating very realistic images on canvas and practically every other medium. But when a human paints Taylor Swift in compromising poses with Ronald McDonald, it's art and a commentary on capitalism or something. Not that I condone either practice. I'm just pointing out there's no stopping the dirty mind, and there never was. I also think it's pretty hypocritical they only care now that the rapid princess is involved.

      1. Youngone

        Re: Just AI?

        The rapid princess. I love it so much I'm pirating it.

        That's mine now.

      2. Anonymous Coward
        Anonymous Coward

        Re: Just AI?

        There's also no stopping launching Duck Duck Go and a VPN and searching for "Taylor Swift deepfake nudes" in image mode, a friend informed me, but thankfully I ignored her and I remain uncorrupted.

      3. jake Silver badge

        Re: Just AI?

        "they only care now that the vapid princess is involved"

        FTFY

        1. FIA Silver badge

          Re: Just AI?

          There are many in the entertainment industry who could be called 'vapid'; not Taylor Swift, though.

          If you're not a fan of her work (I'm not) that doesn't detract from her obvious creative and business talents. For example, she's a Grammy award winner, including one for directing her own music video. (Becoming the first artist to win one in this category as a sole director). She has the business sense to pretty much own all of her work (she owns the production company that makes all her videos for example).

          Even just having the gumption to go 'fuck you' to the recording industry and re-record her early work to regain control of it deserves some respect.

          There are many vapid people out there, but she's not one of them.

          1. Patrician

            Re: Just AI?

            The definition of "vapid" is "offering nothing that is stimulating or challenging; bland." I would suggest that is a very good description of her music.

          2. Youngone

            Re: Just AI?

            I'm sorry, but her "obvious creative and business talents" are completely unobvious to me.

            She's the child of multi-generational wealth who gained her first recording contract because her father bought a large stake in the company.

            The Grammys have no integrity and never did. They're an award for whoever sold the most records last year.

            I have heard her described as "the McDonalds of pop music" which sounds about right to me.

      4. Yet Another Anonymous coward Silver badge

        Re: Just AI?

        So what's the limit?

        If (.)(.) is Ms Swift it's illegal fake porn but if it's Ms Clinton it's 1st amendment protected free speech ?

        If the bill passed (it won't) the first target would be any cartoon of $POLITICO claiming it was obscene

      5. Anonymous Coward
        Anonymous Coward

        Re: Just AI?

        It's certainly true that controversies surrounding manipulated images and celebrity personas aren't new. The debate often delves into the realm of artistic expression versus exploitation. Human creativity knows few bounds, whether it's digitally manipulated imagery or provocative paintings.

        The recent focus on the involvement of a prominent figure like the rapid princess does seem to heighten scrutiny and criticism. It raises questions about societal perceptions of privacy and consent in the digital age.

        Ultimately, navigating the complexities of art, morality, and technology requires nuanced discussion and consideration.

  2. b0llchit Silver badge
    Facepalm

    Evolution of the stick

    Funny how all new things always get used for war and porn (not necessarily in that order).

    Must be our highly evolved and extremely civilized mind that is to blame.

    1. Anonymous Coward
      Anonymous Coward

      Re: Evolution of the stick

      Some use war as porn. You only have to look at the History Channel's output ...

    2. DS999 Silver badge

      My high school English teacher

      Said all of literature could be boiled down to stories about sex and death. He may have been overgeneralizing a bit, but only a bit. If it is (mostly) true for literature it would be true for other forms of media as well...

      1. Yet Another Anonymous coward Silver badge

        Re: My high school English teacher

        What about butterfly larvae with eating disorders?

  3. A Non e-mouse Silver badge
    Facepalm

    "Your Scientists Were So Preoccupied With Whether Or Not They Could, They Didn’t Stop To Think If They Should "

    1. Anonymous Coward
      Anonymous Coward

      I suspect that any thinking was more along the lines of "Pooorrn! Moneyyy! Fuck Yeah, we're rich!!!"

  4. Flightmode

    Now that megastar Taylor Swift has been pulled into this quagmire...

    Giggity.

  5. Anonymous Coward
    Anonymous Coward

    Misdirection....Of Course!

    It's an election year in the UK and the USA.

    If you think the AI abuse of Taylor Swift is a problem.........just wait...........

    1. Anonymous Coward
      Anonymous Coward

      Re: Misdirection....Of Course!

      Any comment from El Reg's Australian correspondent on this story?

      https://www.bbc.co.uk/news/world-australia-68137013

      1. John Brown (no body) Silver badge

        Re: Misdirection....Of Course!

        LOL, "Photoshop automation". My wife would like a copy of that version, once she's stopped laughing at the excuse :-)

        I wonder if it can "automate" away my beer gut?

        1. VicMortimer Silver badge
          Childcatcher

          Re: Misdirection....Of Course!

          Yes.

          https://www.colorexpertsbd.com/blog/photoshop-liquify-for-body-slimming-and-reshaping/

  6. jake Silver badge

    We used to call them ...

    ... "Frankensteins", and telling the newbies that they weren't real every September was part of the job in the early days of alt.* ... The letdown when it was pointed out that yes, that was definitely Lynda Carter's face, but unfortunately the body is last year's Miss October was sad to see.

    What, you kids think YOUR generation invented them? The Venus of Willendorf probably has a recognizable (to the artist) hairdo on a fantasy body.

    I should probably point out that IMO the sad losers making this kind of fake should probably step away from the keyboard and go feel some grass under their toes.

    1. LionelB Silver badge

      Re: We used to call them ...

      > I should probably point out that IMO the sad losers making this kind of fake should probably step away from the keyboard and go feel some grass under their toes.

      I suspect that those making serious $$$ out of this probably don't see themselves as "sad losers" (then again, those may well not be the same people living in the Cheetos wrapper-strewn basement of their mum's house).

  7. Khaptain Silver badge

    All porn will soon be AI

    I would be surprised if the majority of porn isn't soon all AI-generated; it's not like there is no data available on which the models can be trained.

    It will probably spell the end of OnlyFans, as there won't be anything to offer there that other AI-generated porn sites won't already offer.

    I can probably go so far as to imagine that you simply type in your porn fetish desire of the day, no matter how absurd, and it will be dynamically created just for you.

    1. blackcat Silver badge

      Re: All porn will soon be AI

      All I can think of is the fake 'computer' Randy Marsh uses in Over Logging.

    2. Yet Another Anonymous coward Silver badge

      Re: All porn will soon be AI

      Strangely that wasn't something the SAG strike mentioned

      Is there a separate union for 'artistic performers'? The Brotherhood of Strippers, Flaunters, Fluffers and Amalgamated Trades?

  8. The Dogs Meevonks Silver badge
    Facepalm

    What a surprise

    The people who created the face-eating leopard bot, shocked when it eats faces.

  9. lglethal Silver badge
    Go

    " The bill aims to criminalize the creation and sharing of sexually explicit non-consensual AI pictures, with penalties of up to ten years in prison."

    So, just curious: why not put some requirements on the makers of the AI programs to prevent this?

    The companies who make chemicals that can be used as precursors for illegal drugs have to monitor their supply chains and follow Know-Your-Customer style regulations.

    Put a similar requirement on the AI firms. The public version has guardrails that prevent any form of porn being created. And those who want to create porn need to supply full details - address, bank account, passport, etc. - and if caught producing deepfakes without explicit permission, those details get handed over to the authorities.

    AI has plenty of good uses (supposedly), but it also has a lot of bad uses (to which it's already being put). Treat it as you do all other dual-use technologies...

    1. Charlie Clark Silver badge

      How is this supposed to work? The only case I can think of where anything like this has ever worked is the restrictions on how copiers and printers deal with bank notes. Not that this stopped counterfeiting.

      The models and the knowledge required to make them are out there; there's no sending the tide back out. But you could use existing legislation about people's right to their own image to enforce takedown notices and fines. But only in countries that have such legislation: tough luck, America.

      1. Yet Another Anonymous coward Silver badge

        The makers of typewriters and pens have to put technology in place to prevent the creation of fan written "slash fiction" - so this is just the same

    2. Ken Hagan Gold badge

      Perhaps because AI isn't good enough to actually understand its own output?

      It needs a human being. Happily, there is one handy, in the form of the end user. Sadly, some end users are horrible. Happily, we have a (legal) system for that. Sadly, we actually have several, which provides an escape route even in the comparatively few cases where the end user can be identified.

  10. navarac Silver badge

    We knew it would happen!

    Well, we knew it would happen. Any new technology is utilised by the porn industry, and we should not have been surprised that this would be the case with AI. Time to rein in the Big Tech industry with legislation. They cannot be allowed to be autonomous, above any government. AI itself needs to be halted until it can be controlled, and NOT by Microsoft, X, Meta, etc.

  11. Anonymous Coward
    Anonymous Coward

    The real shit

    Hits the virtual fan

  12. Kurgan

    There is no undoing the AI

    Too late. As with the atomic bomb, there is no way of undoing what has been done. AI image generation is something people can do at home now. So even if the big players try (and fail) to block porn, home users will make it. A LOT of it.

    There will be laws, rules, the whole lot of useless regulation, and this will not stop deepfakes - in porn, scams, voter manipulation, fake news, etc. But regulation will surely hinder legitimate users, as always happens.

    Speaking of something much more mundane: some 30 years ago I raced model cars with two-stroke engines. They ran on alcohol, oil and nitromethane. We bought the three components and made our fuel at home for a low cost. Then laws came that made it impossible to buy this "industrial use" alcohol, because someone abused it to make alcoholic drinks. The abusers still have a way to get alcohol, and we could no longer make our fuel at home. We had to buy it ready-made at 5 times the cost.

    1. Zolko Silver badge

      Re: There is no undoing the AI

      Even worse: you can't buy nitro anymore because some "terrorists" made bombs with it. You can't even buy the gasoline for those RC cars with more than 16% nitro where I live. Used to be 25% some time ago.

      But aaaahhhhh .... the noise ! And the smell !!!! Electric cars might be faster - dunno - but hell are they boring

      1. jake Silver badge

        Re: There is no undoing the AI

        Quarts of nitromethane from Hyperfuels cost about 35 bucks. Gallons, about $85. They ship. Mix your own.

        (If you're a full-sized drag racer, they also sell nitro by the 55 gallon drum ... I've used them for years. They will even ship to the track, overnight, in an emergency. Recommended.)

        https://hyperfuels.com/policies/shipping-policy

        "Electric cars might be faster"

        Not by much, if at all. And they tend to burn out quickly. My 50+ year old toy nitro cars still work. For small values of work for some of them ... low compression. The oil in the fuel otherwise preserved the moving parts quite nicely. Compression can be fixed with an easy and cheap rebuild if I ever feel the need. If you have access to a mill, machining your own top-end is a good learning tool for your sprog.

  13. Anonymous Coward
    Anonymous Coward

    I'm sure that Taylor Swift pissing off some groups by encouraging her fans to register to vote, and also annoying US football fans by going to watch some games, is in no way related to the circulation of these fakes.

    1. Anonymous Coward
      Anonymous Coward

      I wouldn't discount "horny people exist"

    2. Cruachan Bronze badge

      Nor is the fact that Congress wants to take action, when deepfakes have been around for at least 5 years and photoshopping celeb heads onto nude bodies a lot longer than that.

      Something does indeed need to be done about deepfakes and AI, but my cynic alarm is going off when all of a sudden it's headline news in an election year and the person who's been deepfaked is known to be a supporter of one political party and to encourage her massive and dedicated fan base to vote.

      It would be exactly the same in this country, though; I'm sure people of my age will remember Tony Blair and New Labour accepting any and all endorsements from Britpop musicians at the time, for example. Politicians are not going to turn down celebrity endorsements (in most cases; there are of course celebs whose brand is tarnished who would not be welcomed).

  14. Dan 55 Silver badge

    "We have to act"

    ... Nadella told NBC News, referring to guardrails that need to be put in place to prevent Designer from creating this kind of material.

    The time to act was at the requirements stage, Satnad.

    Big Tech surprises exactly nobody yet again.

  15. Bebu
    Windows

    Apart from any legislation likely being ineffectual...

    Non consensual sharing of private or intimate material probably should attract sanction even in the absence of image or audio manipulation whether by AI or more manual techniques.

    Where an individual has published or made material generally available there ought to be an overriding enforceable requirement that such material can only be further disseminated substantially intact, unaltered and without any misrepresentation (of context say.)

    So grafting Ms Swift's head onto the torso of another athletic miss would not be acceptable, nor grafting the head of Suella onto the torso of Ms Swift, which, even ignoring the incredibly bad taste, should also be unacceptable.

    Oddly enough, very recently one of our state MPs was subjected to a slightly more benign altered image; even if there is an element of "me too" about the complaint, it is still an unacceptable practice.

    In the end, the ability of modern computing power and AI/LLMs to generate images or voices that to humans look and sound practically indistinguishable, but are in fact demonstrably different, will impede the efficacy of any legislation.

    A prurient imagination makes me wonder what a fully computer-generated porn industry would look like - fluffers would be redundant for a start, as would the "talent", I imagine, except possibly a few as artists' models or inspiration. Like the Six Million Dollar Man, "we have the technology", more or less. With total automation the "victims" would be the consumers and those affected by the possibly altered perceptions and behaviour of those consumers.

    I am surprised Musk hasn't invested in this as he has the branding. :)

    (If the Tesla self driving software is any indication it would be a very brave man who purchased an X branded cybernetic sex doll.:)

    1. Jellied Eel Silver badge

      Re: Apart from any legislation likely being ineffectual...

      > A prurient imagination makes me wonder what a fully computer-generated porn industry would look like - fluffers would be redundant for a start, as would the "talent", I imagine, except possibly a few as artists' models or inspiration. Like the Six Million Dollar Man, "we have the technology", more or less. With total automation the "victims" would be the consumers and those affected by the possibly altered perceptions and behaviour of those consumers.

      It would probably look like hentai, only weirder. So I did some research and found some sites that have fake nudes. They range from fairly realistic to extremely improbable. Like a woman with 4 nipples, or assorted anatomically impossible poses. They do seem to be improving, but still suffer from the uncanny valley effect and a lot just look wrong. Then again, so do some shots of human models when the photographer overdoes it with smoothing and other filters. Then again, the 'beauty' industry has also convinced a lot of people that they should look like AI people with fillers, skin peels and other expensive procedures.

      I guess for the porn industry, the challenge would be cost. So I don't know how long it takes to render an AI nude, but rendering and animating would take a lot more grunt. And it's not like there's a shortage of real people in the industry that will provide content for a lot less money.

      1. Scott 26

        Re: Apart from any legislation likely being ineffectual...

        > So I did some research

        Not all heroes wear capes

  16. Jimmy2Cows Silver badge

    Criminalize the creation and sharing

    I get criminalising sharing these fakes. The distress and damage potentially caused by such sharing is abhorrent.

    But criminalising just the creation of them? Privately, where no one else can see? Apart from being completely ineffective, how's it any different to sticking a photo of her head on a pornstar's body while she's going at it? As mentioned above ("Frankensteins"). Been going on for decades. Not a new thing. Just easier with AI.

    How people get their kicks behind closed doors is no one's business but their own, if it's not actually harming anyone. All this puritanical hysteria around the existence of these images, and others like them, needs to stop.

    1. Anonymous Coward
      Anonymous Coward

      Re: Criminalize the creation and sharing

      Don't underestimate the opportunity this presents to manufacture consent for some very intrusive measures. If the "response" to "public pressure" is swift, it's worth wondering how organic the pressure ever was

      https://www.theguardian.com/media/2007/jun/13/politicsandthemedia.pressandpublishing

      I don't think this phenomenon is Labour-exclusive, I just think that self destructing, secure messages mean we're unlikely to see documentary evidence on quite the same scale in the present or future

      1. Anonymous Coward
        Anonymous Coward

        Re: Criminalize the creation and sharing

        correction, this is the story I referred to, the other link was a later overview:

        https://www.theguardian.com/politics/2004/may/24/uk.pressandpublishing

    2. Jellied Eel Silver badge

      Re: Criminalize the creation and sharing

      > How people get their kicks behind closed doors is no one's business but their own, if it's not actually harming anyone. All this puritanical hysteria around the existence of these images, and others like them, needs to stop.

      But it can and does harm people who're victims of 'deep-fakes'. There was a story recently of someone who took an Instagram pic of a clothed woman, "unclothed" her and circulated it. There have been cases where people have turned pics of school kids into CP. And for celebs, much of their value can be in their image and likeness, so someone 'photographed' riding a donkey may not get a Disney contract or sponsorship deals. There's also a lot of IPR around likeness: people can't just slap Swift into an ad without a licence deal, which may not happen if the product isn't on brand.

      There's also sometimes just strange behaviour by search engines. I had someone saying some odd things about my current partner that ended up with them telling me to search for her with safe search off. One of the first results was for some porn sets featuring a woman with the same first name and last initial. The sets were from around 10 years ago and bore a passing resemblance to her - enough to make me look closer. If she'd been applying for a job, maybe an HR person would do the same thing and bin the application, or maybe people would just get the wrong idea. She already knew about the pics, but there's no way to get the search engines to break the connection.

  17. James O'Shea Silver badge

    Ah, yes

    The latest episode (as of the time of writing) of the venerable cop/lawyer show Law & Order, available last Thursday and viewed by me on Saturday thanks to my DVR, wasn't very good. The new L&O episodes are simply not up to the days of Briscoe and Greene. There's a reason why I didn't bother watching it live.

    However, a significant part of the plot was an apparent deepfake generated by a 'tech billionaire' to stitch up one guy for murder. Now, the guy was guilty; he'd confessed, but the confession was thrown out, and McCoy & Co. used the deepfake despite Really Serious Concerns about it. As in, they were pretty sure that it was a fake. And they used it anyway. (Hint: the fake showed the murder in real time, complete with the murder weapon in the guilty guy's right hand. The guilty guy is left-handed, and McCoy & Co. knew it.)

    Deepfake porn is just the beginning. Hang on, laddies and lasses, it's going to be a bumpy ride.

  18. martinusher Silver badge

    Haven't seen any yet.

    A storm in a tea cup. After all, we're all nude under our clothing.

    In the US we're obsessed with public nudity..... it's all part of the need to keep our womenfolk under control, since they are someone's property, after all. If we could just get over this, then we'd realize that deepfake nudes of anyone are intrinsically boring. Even real nudes are. Once we realize this, the images have no value; they're a pointless waste of pixels.

  19. Ideasource

    More deep fakes for eventual peace.

    If deepfake nudes become prevalent enough, they will discredit any genuine nudes.

    This is helpful to society because it'll break the social mechanism by which leaked nudes cause social harm.

    If everybody casually discredits nudes on suspicion of fakery, they lose their power, and we no longer have to pay the overhead of moderating them to protect living people's reputations.

  20. Grogan Silver badge

    I've seen photochopped (whatever program they used) images of celebrities that looked real enough. Real enough to make me feel guilty looking at them.

  21. talk_is_cheap

    Not sure why MS wants to take the credit

    MS seems to want to claim that their systems and tools are responsible for these images. I guess the alternative is for them to have to admit that there are rather a lot of open-source 'AI' tools out there that can be run locally and do what is known as text-to-image processing.

    There has been a sub-culture using these tools for some time, and sadly the tools are getting rather good, and the people involved have been getting rather good at combining different tools to generate better and better output.

    I guess that MS is worried that while their share price may take a hit if their AI is linked to such images, the resulting hit will be a lot larger if it becomes well-known that their AI is not linked to such images.

  22. Alf Garnett

    Good luck

    I don't know what anyone will be able to do about this that will work. C.p. has been illegal in the U.S. for 50 years or so, but that hasn't stopped people from making it, distributing it or finding it. Calling attention to it in the media only serves to tell more people it's out there. I hope some way can be found to stop this garbage, but I'm not holding my breath.

  23. johnrobyclayton

    Only non-consensual?

    "...the Preventing Deepfakes of Intimate Images Act. The bill aims to criminalize the creation and sharing of sexually explicit non-consensual AI pictures, with penalties of up to ten years in prison."

    I am sure it would be quite easy to ask an AI image generator to ensure that the produced image is not recognizably similar to any images of anyone whose consent would be required.

    Easy enough to get around.

    1. John Brown (no body) Silver badge

      Re: Only non-consensual?

      Whose consent would not be required? Remember, "deep fakes", by definition, are intended to portray real people. If it's a non-recognisable person, specifically generated to not resemble anyone specific, then it's "art", not "deep fake".

  24. ComputerSays_noAbsolutelyNo Silver badge
    Holmes

    Prayer for the poor

    ... loosely based on Michael Moore.

    If you want an injustice in the world to be tackled,

    pray that it befalls some of the rich and powerful.

  25. 0laf Silver badge
    Big Brother

    Why now?

    Just wondering why the trigger now?

    There has been celebrity pr0n fakery going on since before the internet including other AI stuff more recently.

    What's so different about the current ones to cause outrage where there was none (or little) before?

    Is it just because Taylor Swift is a near deity in the US?

    1. Tim99 Silver badge
      Trollface

      Re: Why now?

      Well obviously, it's because Taylor Swift is a "Pentagon asset": The Guardian.

    2. John Brown (no body) Silver badge

      Re: Why now?

      Yes. But also because in past generations the creator had to have some actual skill as an artist to pull it off. Or skills with physical cut'n'paste photo editing. Or the ability with Photoshop or similar to blend lighting effects to get the realism required. Now, it's possible for a kid living in mommy's basement to just type into a computer what he wants, refining the text a few times and using the produced images as source for said refinements. Very little skill required, just time and maybe some patience.

      Oh, and it's an election year so "Something Must Be Done" about the public outrage du jour. (and if there isn't one, create one!)

  26. Anonymous Coward
    Anonymous Coward

    Benefits of unregulated AI porn

    1. Once AI porn is indistinguishable from reality, any porn will stop being blackmail material. Anyone could use the "fake news" argument with plausible deniability. Real human suffering will be reduced. Exploited persons may suffer less. Exploitation will become unnecessary, as one could "undress" anyone. Also, the huge volume of newly generated content will make any specific person less of a target, and finding them online less likely.

    2. Everyone will realize the dangers of disinformation and become much more selective, if not analytical.

    3. Real journalism and personal reputation will get proper weight.

    4. Politicians' results, not looks or youth mistakes, will become a stronger argument.

    5. Technology will accelerate. Apparently previous porn cycles improved tech by faster test iterations.

    6. Banning AI porn is a waste of resources. Serious AI dangers deserve attention instead.

    After a few attempts with irrelevant (censored?) Google results, I found those fake Taylor Swift photos. Personally, I find most of them vomit-inducing, because I find her nice and professional. But she is smart enough to understand that those photos may increase her popularity, while also protecting her from potential real photo leaks. It is hard to discredit someone with a great reputation.

    Besides, "Cod knows what makes YOU horny"

  27. Anonymous Coward
    Anonymous Coward

    It is singularity already

    Just look at those CEOs flying in panic like a flock of birds.

    Or at Musk with his "freedom of speech", while banning Taylor's pictures from Twitter.

  28. TheMaskedMan Silver badge

    "...and even the White House to call for immediate action to tackle deepfakes"

    One should never waste a good crisis, particularly with an election looming. There are bound to be deepfakes galore doing the rounds, featuring politicians of all parties doing everything you can imagine. Best for them to raise awareness now, so nobody will be surprised. And maybe the odd genuine image can be passed off as fake, too. Handy.

    As others have pointed out, there's nothing new (or interesting, to my mind) about slapping a celebrity face on a naked body, so I don't see why it's so surprising or shocking just because it's done with AI.

    Of course, it probably shouldn't be done at all, but it is and always has been, and no amount of legislation will prevent it from continuing to be done. Best for the rich and famous to just get on with being rich and famous - it's a struggle, I know, but I'm sure they'll cope.

    AI porn is an inevitable thing, if it isn't a thing already, and it will be interesting to see what that does to the adult industry. Social media influencers are already squealing that AI influencers are stealing their lunch (fake, uninteresting people replaced by fake fake uninteresting people, how sad) and I expect porn stars will be going the same way soon. Will they go on strike? March naked through the streets? Or adapt by having a model trained exclusively on their images and licensing it to the producers? Popcorn time.

  29. Zack Mollusc

    I don't get it

    We know not to trust plain text such as "Nobody needs to pay their taxes this year - Official Statement From IRS", because any random idiot can type it (and did, just now).

    We know not to trust audio because that also can be faked.

    We know not to trust photographs because photoshop.

    CGI and deepfakes are just another reason to distrust images/video.

    What is the problem?

  30. PapaPepe
    Facepalm

    Nothing much to see here...

    A pop entertainer rakes in millions selling her sex appeal. Meanwhile, one of her lonely customers acts on his wet dream and shares a digital file with his compadres. I harbour no sympathy or antipathy for either, and I specifically see no particular reason for the Chief Magistrate to stick her nose into the case.

    For those finding the graphic artist's behaviour disgusting, please consider the following beneficial consequence of the forthcoming deluge of such imagery and footage: simple and convincing deniability for the models featured in such material, whether real or fake. The ability to blackmail a model is from now on inversely proportional to the volume of such product on the public 'net.

  31. VicMortimer Silver badge
    Stop

    Waste of time and money

    What a waste of time and money.

    There's no way this could be even remotely constitutional.

    I mean, fake nudes of Trumpty-Dumpty with its tiny mushroom were all over the place a few years back; no way could banning those pass constitutional muster, but this would purport to ban them.

    The courts would shoot it down immediately.

  32. Innique

    The issue needs a legislative intervention. Who do you sue for slander, if not the platform? We have made the internet so anonymous that it is a bigger problem. "Oh, I will just verify everyone that posts" - really? The people getting hacked on FB show how easy that is to defeat. https://www.wired.com/story/indias-government-wants-total-control-of-the-internet/ - foreign governments are affecting the US, obvious or not. TS is only bringing light to the situation, but have no doubt someone will make a fake Biden-falling-down video for the election.
