Adobe will use your work to train its AI algorithms unless you opt out

Adobe automatically analyzes user content stored on its Creative Cloud services to train AI algorithms unless people opt out of the service. The Krita Foundation, an independent non-profit group building open-source graphics software for artists, raised concerns about the policy this week. In the privacy and personal data …

  1. Anonymous Coward

    Sounds like M$'s Github Copilot all over again then?

    I was considering the following problem the other day.

    If you put your open source project on GitHub, then under the terms of their service, they can use your code for any purpose they see fit, circumventing any copyright protections you may have as the author of the code.

    So you then decide not to host your open source project on GitHub, but somewhere else.

    Then someone forks your project and puts it on GitHub, and GitHub filches your code for its own purposes.

    It appears the only way out of this situation is to add an express term to the copyright wording in your work forbidding it from being hosted on any such platform.

    Thoughts?

    1. Ken Hagan Gold badge

      Re: Sounds like M$'s Github Copilot all over again then?

      The only solution is to keep your code private. As far as I can see, things like YouTube proved long ago that a sufficiently rich entity can shout "fair use" and ignore copyright.

      If society still believes that copyright is a desirable concept, it needs to start enforcing it against rich people as well as normies like us.

    2. Max Pyat

      Re: Sounds like M$'s Github Copilot all over again then?

      In principle various open source licenses would cover this, e.g. AI-generated code based on GPL code must also be GPL, and if you'd published under the GPL it would be a "good thing" for GPL code to spread in that way...

      However, as Ken Hagan points out, wealthy entities will ignore that as much as they can, and will be almost impossible to hold to account. Layers of AI tech will further obfuscate what's actually happening.

    3. Mike 137 Silver badge

      Re: Sounds like M$'s Github Copilot all over again then?

      "If you put you open source project on GitHub, then under the terms of their service, they can use your code for any purposes they feel fit, circumventing any copyright protections you may have as author of the code."

      Contractual terms cannot trump legislation, so they cannot 'circumvent' your copyright. Making code open source does not eliminate copyright in said code -- the license merely grants others certain specific rights in respect of that code. Consequently, use for "any purpose they see fit" is not permitted unless use for those purposes is specifically granted by the license.

      However, the rule of law seems dead. Large corporations (and indeed governments) increasingly ignore the law when it's to their advantage, and the problems we have are [1] that it's too expensive and complicated to argue in court and [2] that in many cases the perpetrator is the arbiter of its own actions.

      Until we have internationally coordinated legislation to curb such abuses, they'll continue unchecked, and I'm quite certain that the impetus is lacking for such legislation to be enacted. Money doesn't shout, it screams.

      In the meantime, keep your important files on your own servers -- the cloud primarily serves the provider, not you.

      1. Filippo Silver badge

        Re: Sounds like M$'s Github Copilot all over again then?

        >Until we have internationally coordinated legislation to curb such abuses, they'll continue unchecked

        Various political movements all over the world have explicitly targeted international cooperation efforts (UN, EU, individual treaties, whatever), by painting them as attacks on respective nations' sovereignty.

        While such political movements have always existed, it is only in recent years that they have experienced widespread success. International cooperation is the only possible check on the power of multinational corporations. I suspect this is not a coincidence.

      2. Anonymous Coward

        Re: Sounds like M$'s Github Copilot all over again then?

        That is completely false - contract law can override open source licenses. I've actually co-authored an open source license and spent years advising companies on various commercial strategies around open source, including how to leverage contract law for hybrid business models.

        There are a few exceptions to this in places like Germany, where copyright is a "natural right", but in general the law doesn't work the way you assume. A person or organization that holds a copyright can give away those rights either broadly, as with an open source license, or narrowly, as with a commercial contract. They can even do both at the same time.

        Now, I have not read the GitHub TOS, but it would not surprise me if they take that into account...

        Posted anon for obvious reasons.

  2. sgp

    I'm sure there's a language in which Adobe means pure evil.

  3. NATTtrash

    Opt in, opt out?

    So, not a legal expert here, but didn't I read somewhere that people have to be asked for permission explicitly? That assuming all is OK if nobody objects is not what the law stipulates, and that you have to ask for specific permission before you can do what Adobe is doing here? Or is this different because Adobe does this under US law? And if it does, would that then automatically include users and content outside the US?

    (Like I said, no legal eagle, so please be gentle with me)

    1. Anonymous Coward

      Re: Opt in, opt out?

      As far as I can tell, Adobe plays the multi-headed hydra game here. When you install the mandatory Creative Cloud virus (let's call it what it is, as what it does is by no means 100% acceptable), it brings along other friends to infest your system, and the question is whether you have given permission for all of that or just approved the installer without fully knowing what you were letting yourself in for - that's what your tame lawyer will have to work out. It's also a swine to uninstall, btw.

      Our security people check what software does before purchase is approved. Adobe instantly ended up on the ban list so our people use Serif's Affinity products which, as of this year, are also far easier to license.

      However, when you start investigating further creative resources you will discover Adobe has quietly done a Microsoft and built a monopoly in fonts as well as webfonts (and they also show up more and more as image site owners). A web design with Adobe, for instance, requires you to use their webfonts (nothing downloadable) which, like Google's, provide what in the business is known as "atmospheric" intelligence (think of it as trends, a bit like Twitter hashtags, pre-Musk). Try to buy a regular font and you'll find that most of the font foundries are again owned by Adobe, so you can't buy a font: you have to subscribe and install their infernal virus-alike software to access it.

      However, because it's in the creative world, nobody has really picked up on that monopoly building yet, so the plan is to tip off the EU Commission. Sure, it'll take another decade before they take action, but then there is at least a bit of daylight.

    2. Anonymous Coward

      Re: Opt in, opt out?

      These images are not personally identifiable information (PII) by themselves, so they don't automatically fall under laws like the GDPR. It is still true that those images may also contain PII - of the author, or of other people in the images - so Adobe would need explicit permission to process them, especially if the processing is based on that very PII - face recognition, for example, since biometric data are PII.

      1. Anonymous Coward

        Re: Opt in, opt out?

        I think the permission question resides in the copyright aspect of whatever you create. Adobe has no legal entitlement to it (including when you use licensed images, because you or your company holds the license, not Adobe) unless it has buried that somewhere deep in its agreements, as Google has done in its Terms.

        I think in the UK this entitlement could well be challenged under the banner of unfair contract terms, but I guess in the US it would take an expensive lawyer (read: inaccessible to any small creator) to challenge it. But that's assuming it's in there - I don't know, as our security team banned the company and its products outright.

  4. karlkarl Silver badge

    How will that work?

    Most Adobe users have blacklisted all the Adobe servers in the hosts file so that the cracks work effectively, right?

    1. Anonymous Coward

      As usual, this damages honest people who pay for their software and work hard to create something, rather than the freeloaders who just take advantage of someone else's work. That includes Adobe - if I pay for software and storage, I expect them not to take advantage of my work without explicit permission.

      At least they are so gracious that they don't - yet - attempt to analyze your local storage, something that Microsoft does attempt - for example, using that "mysterious" Photo Media Engine that gets installed over and over even if you remove it each time.

      1. Anonymous Coward

        In that context I'm also dead set against any facial analysis being performed on the pictures I take; I don't care about the reasons why or any alleged usability benefit.

        It's the only thing I *really* do not like about Apple's devices, because there's no way to disable it and, frankly, I feel it ought to be fully optional because (a) I don't like it, (b) it's my machine/phone/tablet and (c) I don't know for certain that that data stays local. Facial analysis is *way* too hairy to release, certainly for third parties whose permission you don't have or whom you are trying to protect, because it also works on kids.

        No, no, no.

        1. Anonymous Coward

          As far as I am aware (taken from Apple's privacy statements), all facial recognition processing is performed and stored locally; the automatically generated albums of people that are shared via iCloud only include the data that provides the album links, not the recognition data behind them.

          It would be good if such AI could be disabled by the user but, as long as it stays local, it doesn't give me any personal concerns (and I find it useful when I'm searching for particular photos). I suppose the answer, if you don't like what Apple devices do, is to use a different make (although there are limited choices, and I would never recommend anything Android for increased privacy).

          It's a shame no company has come into the market with a viable mass-market alternative - but then, most people buy Apple devices because they like what they do (and will accept the compromises). The people on this forum are a very small minority of users and are able to hack settings. I like Linux on the desktop but rarely suggest it to relatives - I try to persuade them to get Apple kit instead, as it gives me far fewer support headaches.

  5. Howard Sway Silver badge

    Creative Cloud

    So once again, obviously inspired by the dismal example Microsoft has set with GitHub, the cloud turns out to be a place where you pay a big company to take your work, run it through a mass data extraction tool, and sell it on to others in the guise of a "generative AI".

    As a business model it's creative all right: just repeat the word "cloud" thousands of times over enough years and people will shovel masses of money and data at you, which you can then do what you like with before selling it on to other people to make even more money from.

  6. mark l 2 Silver badge

    It takes some cheek for Adobe to not only charge you for using their software but then also try to monetize you further by using your work to train their AI. To my mind they should offer users a discount for opting in to that service, since you are providing THEM with something, rather than making you jump through the hoops of having to opt out.

    Glad I don't have to use Adobe software anymore; GIMP is sufficient for my needs. And it probably would be for a lot of the people who pay for Creative Cloud, if design schools taught their students how to use software other than Adobe's.

  7. Sceptic Tank Silver badge

    This could work...

    All that Adobe et al need to do now is to add an option where I can opt out of receiving payment for my work and any derivatives of it. Should make for an interesting payday once in a while.

    I must admit that I don't quite comprehend what "aggregated content" is supposed to look like. If my picture consists of a close-up of a bee on a flower and yours is of a machine part, what does the aggregate look like? It sounds more plausible that they aggregate everybody's work into one directory and then sift through the files looking for stuff that can be monetised. Unless, of course, they put my bee on your gear wheel and sell it off as their own work.

    1. Anonymous Coward

      Re: This could work...

      "Unless, of course, they put my bee on your gear wheel and sell it off as their own work"

      In which case you could both become rather well off, as that would be a double copyright violation. That's why you really ought to read all of the Terms you agree to - the last time I looked, Google essentially pulled the same trick, but it's buried very deep in their Terms in the hope you get bored before you get to that point. And no, there's no exception made for email, AFAIK (I may be wrong; it was a while back when I looked, and I no longer use any Google products as a result of that review).

  8. Mike 137 Silver badge

    "For anyone who prefers their content be excluded from the analysis, we offer that option here"

    Translation: "we breach your copyright for our own advantage unless you opt out"

    That's breaking the law almost everywhere worldwide. Contractual terms cannot trump legislation, so ideally there should be a growing mountain of legal actions (but sadly there probably won't be, as fighting Adobe in the courts would be prohibitively expensive).

    1. Anonymous Coward

      Re: That's breaking the law almost everywhere worldwide.

      ... so I'm taking Adobe to court! :(

  9. bertkaye

    theft and more

    Raw theft, to use private files for training an AI. And what delta in content is enough to make an AI-generated image different from a copyrighted private image? This gets Adobe into shaky territory.

  10. Anonymous Coward

    MS is also guilty of this: SharePoint/O365 docs are scanned for things like acronyms and catalogued.

    I understand the utility of having a look-up table of company-specific terms to pull from, but what commercial secrets are they slurping up to play the stock market with, based on scanning the contents of customers' files?

  11. John Brown (no body) Silver badge

    it doesn't look at content

    "The company said it doesn't look at content processed or stored locally on user's devices."

    When it comes to "computer vision", we may need to expand the definition of "look". They have worded their statement to imply that "look" means humans looking. But computers "look" at things too; it's just that the source isn't always a pattern of reflected or emitted light. The end result is pretty much the same though: patterns of data in a storage system. This means the "company" DOES "look" at the data.

    1. Anonymous Coward

      Re: it doesn't look at content

      I agree. A better word would be something like "evaluate" - native English speakers could probably come up with something even more precise.

  12. Anonymous Coward

    Don't use their stupid cloud services

    It really is that simple.

    I use Photoshop and Lightroom, but with a few rules you can keep them legal (in Adobe's eyes) and not send them any usable data.

    Don't use their so-called 'free' Phone/Tablet apps. AFAIK, they work off images stored in their cloud services.
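
    One of those rules is the hosts-file trick mentioned further up the thread: point the Adobe servers at a dead address so the desktop apps can't phone home. A rough sketch of the idea in Python is below; the hostnames are placeholders rather than Adobe's real servers (look those up yourself), the path assumes a Unix-like box (Windows keeps its hosts file under System32\drivers\etc), and you need admin rights to write to it.

        # Sketch only: append hosts-file entries that black-hole the listed servers.
        # The hostnames below are placeholders, NOT a verified Adobe blocklist.
        from pathlib import Path

        HOSTS = Path("/etc/hosts")  # assumes a Unix-like system

        BLOCKED = [
            "telemetry.example-adobe.invalid",  # placeholder hostname
            "cc-sync.example-adobe.invalid",    # placeholder hostname
        ]

        def block(hosts_path: Path = HOSTS, hostnames=BLOCKED) -> None:
            """Add a 0.0.0.0 entry for each hostname that isn't already present."""
            existing = hosts_path.read_text()
            new_lines = [f"0.0.0.0 {h}" for h in hostnames if h not in existing]
            if new_lines:
                with hosts_path.open("a") as fh:
                    fh.write("\n# black-holed entries (sketch)\n")
                    fh.write("\n".join(new_lines) + "\n")

        if __name__ == "__main__":
            block()  # needs root/administrator privileges to edit the hosts file

    A firewall rule at the router does the same job if you'd rather not touch system files.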

    Adobe and MS are scummy whores when they do this. They ain't gonna get their mitts on my images.
