OpenAI latest to add 'Made by AI' metadata to model work

Images emitted by OpenAI's generative models will include metadata disclosing their origin, which in turn can be used by applications to alert people to the machine-made nature of that content. Specifically, the Microsoft-championed super lab is, as expected, adopting the Content Credentials specification, which was devised …

  1. Adam Azarchs

    This could be useful...

    If, and only if, non-AI-generated content all starts using this as well. Cameras, for example, would have to sign the images they produce (it would have to be done with a hardware-protected key, which certainly complicates things). Photo editing tools would then need to sign their output as well (including whether or not whatever they started with was signed). That gets even trickier unless it's a cloud-based tool, because keys can always be extracted from local software. It's not going to be easy.

    Basically, it's not very helpful until things get to the point where _not_ having that provenance information starts to become suspicious.
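    For illustration, here's a minimal sketch of that camera-signing idea in Python, using the cryptography package. In a real camera the private key would sit in tamper-resistant hardware rather than software, and the filenames here are made up:

```python
# Minimal sketch: signing an image with an Ed25519 key pair and verifying it.
# Requires: pip install cryptography
# In a real camera the private key would live in tamper-resistant hardware;
# here it is generated in software purely for illustration.
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # would be burned into the camera
public_key = private_key.public_key()       # published by the manufacturer

image_bytes = open("photo.jpg", "rb").read()  # illustrative filename
signature = private_key.sign(image_bytes)     # shipped alongside the image

# Anyone with the manufacturer's public key can later check the image:
try:
    public_key.verify(signature, image_bytes)
    print("signature valid: bytes unmodified since capture")
except InvalidSignature:
    print("signature invalid: altered, or not from this camera")
```

    The maths is the easy part; the hard part is exactly what's described above: keeping that private key unreadable while the device still uses it.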

    1. TheMaskedMan Silver badge

      Re: This could be useful...

      That sounds like a privacy nightmare. If cameras sign images, surely it would be possible to determine which camera - and by implication which person - took the picture, or edited / created it. I routinely strip metadata from images I upload, and have been doing so since long before generative AI was a thing. That won't be changing to satisfy the AI-phobic brigade.

      I'm not thrilled with software spontaneously plastering watermarks on an image I'm viewing / editing, either. If they're going to do this, there'd better be a way to turn it off, or I'll be stripping metadata out of downloaded images, too. Sure, include the logo in images viewed on social media if you must, but I want control over whether I see it or not.
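      Stripping is trivial, for what it's worth. A sketch, assuming Pillow, that rebuilds an image from raw pixel data so nothing but the pixels survives (filenames are illustrative):

```python
# Sketch: strip all metadata by rebuilding an image from its raw pixels.
# Requires: pip install Pillow. Filenames are illustrative.
from PIL import Image

src = Image.open("upload.jpg")
clean = Image.new(src.mode, src.size)
clean.putdata(list(src.getdata()))  # copies pixels only; EXIF/XMP/C2PA all gone
clean.save("upload_clean.jpg")
```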

  2. Richard 12 Silver badge
    WTF?

    90KB!?

    What's in it?

    The prompt (max. 400 bytes), model name, version and marker are all text. So 1KB maximum payload, surely?

    An X.509 certificate is under 2KB. If it includes an entire certificate chain, then let's say 10KB of certificates.

    Assuming the whole lot (certs and all) doubles in size when cryptographically signed (not true), and is incompressible, that'd be 22KB.

    It seems that the reason it's large is that it includes a thumbnail so you can manually compare it to the actual image. Weirdly none of the marketing blurb seems to mention this.

    Amusingly, it seems to use both JSON and CBOR. Which is a bit odd; did they just want more ETLAs?
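    To make the JSON-vs-CBOR point concrete, a toy comparison in Python, assuming the cbor2 package; the claim structure below is made up and is not the actual Content Credentials manifest schema:

```python
# Toy comparison of JSON vs CBOR encoding sizes for a made-up claim payload.
# This is NOT the real Content Credentials schema, just a size illustration.
# Requires: pip install cbor2
import json

import cbor2

claim = {
    "prompt": "a cat wearing a bowler hat",  # capped at 400 bytes, per the comment above
    "model": "dall-e-3",
    "version": "1.0",
    "generator": "OpenAI",
}

as_json = json.dumps(claim).encode("utf-8")
as_cbor = cbor2.dumps(claim)
print(len(as_json), "bytes as JSON")
print(len(as_cbor), "bytes as CBOR")  # typically somewhat smaller than JSON
```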

  3. Filippo Silver badge

    Better than nothing?

    Are we sure it's better than nothing? In the field of security and trust, a solution that doesn't really work (such as this one) could lull users into a false sense of security or, at the opposite extreme, erode trust in trustworthy sources. Both outcomes are arguably worse than nothing.

    1. Duncan10101

      Re: Better than nothing?

      Yep. It's security theatre.

      1. Richard 12 Silver badge

        Re: Better than nothing?

        Essentially, it can only prove that an image is AI-generated by its presence.

        It does not and cannot prove the image is not AI generated.

        So it's basically useless.
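        That asymmetry is easy to see in code. A crude Python sketch that only greps for the C2PA marker bytes; real verification would use a proper C2PA library and validate the signature chain, and the filename is illustrative:

```python
# Crude sketch of the asymmetry: finding a C2PA marker suggests the image
# carries a provenance claim; finding nothing proves nothing, because the
# metadata is trivially stripped. Real verification would use a proper
# C2PA library and validate the signature chain; this only greps bytes.
def provenance_verdict(path: str) -> str:
    with open(path, "rb") as f:
        data = f.read()
    if b"c2pa" in data:  # JUMBF label used by Content Credentials manifests
        return "claims provenance (signature still needs validating)"
    return "unknown: absence of metadata proves nothing either way"

print(provenance_verdict("some_image.jpg"))  # illustrative filename
```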

  4. Simple Rick

    Screen grab

    Would this metadata be magically transferred to a screen grab of an AI generated image?

  5. Dan 55 Silver badge

    OpenAI has all generated output anyway

    So they should just keep a copy of every generated image and offer a TinEye-like search to anyone who wants to check whether an image they have was generated by OpenAI, or is close to something that was. The same goes for the other ML image generators. It'll be the cost of getting on to the ML image-generation bandwagon.
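    A hedged sketch of how such a lookup could work, using perceptual hashing via the imagehash package; the filenames, in-memory index and threshold are all made up for illustration:

```python
# Toy sketch of a TinEye-style lookup built on perceptual hashes.
# Requires: pip install Pillow imagehash. Filenames and the in-memory
# "index" stand in for whatever store a generator would actually run.
from PIL import Image
import imagehash

# The generator records a perceptual hash of everything it emits...
index = {imagehash.phash(Image.open("generated.png")): "gen-2024-0001"}

# ...and anyone can later ask whether a suspect image is a near match,
# even after re-encoding or screenshotting has stripped the metadata.
suspect = imagehash.phash(Image.open("suspect.png"))
for known_hash, origin in index.items():
    if known_hash - suspect <= 8:  # Hamming distance threshold, tunable
        print(f"close match to generated image {origin}")
```

    Unlike embedded metadata, this survives screen grabs and stripping, since it only needs the pixels.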
