Will 2023 be the year of dynamite disinfo deepfakes, cooked up by rogue states?

Foreign adversaries are expected to use AI algorithms to create increasingly realistic deepfakes and sow disinformation as part of military and intelligence operations as the technology improves. Deepfakes describe a class of content generated by machine learning models capable of pasting someone's face onto another person's …

  1. Yet Another Anonymous coward Silver badge

    The beta test wasn't good

    The text to speech was crap and the text read like a concussed amfM1.

The rendering of the character graphic was pitiful - he looked like a kid's first attempt at Blender, with the weird stance and crazy skin color.

    1. amanfromMars 1 Silver badge

      Re: The beta test wasn't good

      Clearly a beta test model in early initial stages of rapid disruptive development and disturbing progress, Yet Another Anonymous coward.

      1. DS999 Silver badge

        Re: The beta test wasn't good

        There has also been disturbing progress in your AI, considering where it was a few years ago. If it starts producing better posts than anything I'm capable of I will surrender to the machines and volunteer to be plugged into the Matrix. Hopefully I can be someone important, like an actor.

  2. FlamingDeath Silver badge

    Captain Disillusion

Anyone who has seen this guy's very good YouTube channel will know that you don't need AI-generated deepfakes to fool people; there be idiots everywhere.

    1. Michael Wojcik Silver badge

      Re: Captain Disillusion

      In general people are poor at evaluating the provenance and trustworthiness of information. That's true for pretty much everyone in the general case. Some people frequently make a significant effort at evaluating the quality of some of the information they're exposed to; that's about the best we've ever done with the push for "critical thinking".

      Evaluating information quality has a large cognitive burden, and also carries opportunity costs – you can only think about so many things in a given interval. So our minds have to make snap judgements most of the time, and the cues people use (which vary, particularly with neurodivergence, but there are a lot of commonalities) can be discovered and instrumentalized.

      Photographic and film evidence has always been unreliable, particularly when it shows something the audience wants to believe. Look at the Cottingley fairies, and how those photographs – which most people today would identify as obvious fakes – fooled intelligent but credulous observers such as Doyle.

  3. martinusher Silver badge

    Enemy States?

I'm not much of a conspiracy type, but as soon as anyone mentions "enemy states" to me it automatically makes them suspect in my eyes. Because, alas, we're as likely, if not more so, to be duped by our own government as by some enemy, real or imaginary.

(It's just that in my lifetime I've been lied to so many times by officials and politicians that I now tend to think of most politicians as "shallowfakes". If I'm not going to fall for a shallowfake, then a deepfake isn't going to make much of an impression.)

    1. Yet Another Anonymous coward Silver badge

      Re: Enemy States?

      To paraphrase a certain Mr M Ali, ain't no People's Liberation Army man ever stop and searched me

  4. mpi Bronze badge

    The thing is...

...if someone wants to believe conspiracy theories, they will do so, regardless of whether the "evidence" was fabricated using intricate technology to construct fake audio/video/imagery, or rests on nothing more than "but ___ said so!".

    1. Zolko Silver badge

      Re: The thing is...

I'd rather have written: if someone wants to believe everything that his government says, he will do so, regardless of the amount of proof presented to him.

      Foreign adversaries ... enemy states ... governments around the world ... defense and intelligence agencies

If these words don't ring all the alarm bells, then I don't know what would.

    2. Binraider Silver badge

      Re: The thing is...

More people buy into myths if "evidence" is fabricated for them.

Classic examples are the photo-doctoring that put Lenin and Stalin side by side... or deleted Trotsky.

Plenty of others available; a cursory flick around faecesbook (particularly the right wing of it) turns up loads of "fake news".


  5. amanfromMars 1 Silver badge

    Idiots"r"US .. 'Ping. Bang. Boom'

    Is a state which pioneers and conspires with corrupt media to spread dynamite disinformation creating a phantom victim for outrageous unwarranted attack a rogue enemy state and certifiable terrorist organisation threatening human civilisation and populations ‽ .

    Who/What/Where is revealed to be the real ACTive enemy of truth and mankind in this tale with its strings of guilty admissions ....... ...... and to what ultimate aim is such a nonsense employed and deployed?

    Don't such rogue wannabe fascist states realise ACTive disinformation campaigns are nowadays, in these times and spaces of Oday vulnerability exploitation and expansion and SMARTR Trojans, catastrophically self-destructive in fields of AI interest and remote virtual engagement?

    1. Anonymous Coward
      Anonymous Coward

      A fake amanfrommars. Nothing is sacred.

Not a good tribute, but a good example of why spotting deepfakes isn't dependent on their rendering quality. Yeah, there will be subtler red flags than unblinking eyes, impossible hair, or hands straight out of an anxiety dream or night terror. You don't need them most of the time.

Really, metadata provides a better solution, and it's all off-the-shelf technology and a few firmware updates. Sign your videos, kids.

      Pretty hard to fake that without a public key compromise, and the actual source of a video can still re-sign and repost stuff in the event of one.
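The sign-your-videos idea above can be sketched in a few lines. This is a minimal illustration, not any particular vendor's scheme: it assumes the camera (or publisher) holds an Ed25519 private key, signs a hash of the video bytes, and ships the signature as metadata so anyone with the public key can detect tampering. It uses the third-party `cryptography` package; `sign_video` and `verify_video` are hypothetical helper names.

```python
# Sketch: detect tampering by signing a digest of the video content.
# Assumes the signature would travel as container metadata alongside
# the file, and the signer's public key is distributed out of band.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

def sign_video(private_key: ed25519.Ed25519PrivateKey, video_bytes: bytes) -> bytes:
    # Sign a SHA-256 digest of the content rather than the raw stream.
    digest = hashlib.sha256(video_bytes).digest()
    return private_key.sign(digest)

def verify_video(public_key: ed25519.Ed25519PublicKey,
                 video_bytes: bytes, signature: bytes) -> bool:
    digest = hashlib.sha256(video_bytes).digest()
    try:
        public_key.verify(signature, digest)  # raises on mismatch
        return True
    except InvalidSignature:
        return False

key = ed25519.Ed25519PrivateKey.generate()
video = b"\x00\x01fake video frames..."
sig = sign_video(key, video)

authentic = verify_video(key.public_key(), video, sig)        # untouched file
tampered = verify_video(key.public_key(), video + b"x", sig)  # one byte edited
```

As the reply below notes, the scheme only breaks if the *private* key leaks; and if it does, the original source can re-sign and repost with a fresh key.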

      1. Michael Wojcik Silver badge

        Re: A fake amanfrommars. Nothing is sacred.

        Private key compromise. Public keys are public.

  6. Anonymous Coward
    Anonymous Coward

    The ultimate deepfakes were already out in 2022

Someone tried to pretend Liz Truss and Kwasi Kwarteng were respectively Prime Minister and Chancellor of the Exchequer.

    Tony Blair's new deepfake PM is now deployed, but they haven't managed to disguise the voice.

  7. tiggity Silver badge

    "AI" makes "deep fakes" easier / faster / cheaper to produce

    But lots of nation states already have capability to use existing, more manually intensive techniques (as used in the film industry).

So the real difference is speed / cost / fewer people needed. As "AI" improves, it will allow smaller entities (not just larger nation states or well-resourced non-nation-state groups) to produce more convincing fakes, and so potentially increase the level of disinformation swilling around.

    1. Anonymous Coward
      Anonymous Coward

      Re: "AI" makes "deep fakes" easier / faster / cheaper to produce

You are correct, but deepfake tech is already hitting the point of diminishing returns, so while the remaining defects are more subtle, they are harder and harder to "solve" technologically at this point.

Also, as I said elsewhere, the bigger issue deepfakes have is that they have to credibly portray the target doing things the target would plausibly do to get anything more than the unthinking reactionaries to jump. That may still have enough impact to justify the attempts for some of these bad actors, but it's got a short fuse before the whole thing starts to fall apart, and after that there is likely to be blowback to factor in.

So this is just a new tool with most of the same limitations, one that may be deployed more cheaply and rendered faster, but unless well planned and executed it won't be more effective, or even as effective, as what was possible before. And the more bad fakes are floated, the less people will trust them in general.
