Filing NeMo: Nvidia's AI framework hit with copyright lawsuit

Nvidia is the latest tech giant to face allegations that it used copyrighted works to train AI models without obtaining the permission of the authors. A proposed class action lawsuit [PDF] filed against the GPU supremo in San Francisco on Friday March 8 claims the company used copyrighted material to train large language …

  1. Neil Barnes Silver badge

    neural networks that meet today's needs

    So we want search engines that make up the answers if they don't find them, image generators that can't count arms, writing systems with added blandness?

    Amazing. I never knew.

    1. Anonymous Coward

      Re: neural networks that meet today's needs

      Nvidia made $31 billion profit in the 12 months to October 2023 and returned $10.4 billion to shareholders in the last quarter of 2023. Yet just like all the other mega-rich corporations currently involved in AI, they chose to steal data from creatives, which will make them even more billions, rather than pay them a fraction of their quarterly profits.

      Just another day in the enshittification of the whole planet by immoral and psychopathic billionaires and toxic corporations.

  2. Doctor Syntax Silver badge

    "We respect the rights of all content creators and believe we created NeMo in full compliance with copyright law."

    Believe? On what grounds?

  3. Pascal Monett Silver badge

    Ooooh, Nvidia

    Lawyers must be salivating at the prospect of carving out a piece of Nvidia's financials.

    I'm sure they're already calculating how many times they can bill $300K/hour.

    Oh, sorry, defending the little guy? Who do you think you are? Move over, I've got money to make.

  4. IGotOut Silver badge

    If blatant uplifting of material is OK...

    then I'm sure the likes of MS, Google and Nvidia have absolutely no issue with running it against all their internal emails and code; after all, it's only "learning" from it.

  5. martinusher Silver badge

    Never heard of them...

    How is knowledge passed from person to person? How do we learn?

    We read material written by others, of course. We're not allowed to copy the material directly (except in well-defined situations and with attribution), but we freely use the knowledge. It's how society works.

    Why should a machine be any different? It's true that the typical LLM seems to be the equivalent of monkeys with typewriters (very fast monkeys), but in essence it's just mimicking human behavior. There are real issues there, but they're not ones of copyright.

    1. Falmari Silver badge

      Re: Never heard of them...

      @martinusher "Why should a machine be any different?"

      In this case the machine is not being treated any differently to humans. We're not allowed to copy the material directly; when we read, we read a licensed copy, not a copy of a licensed copy (that would be piracy). Books3 was based on a collection of pirated ebooks, so downloading Books3 to train AI is piracy, no different from a human downloading a pirated copy of a book to read.

      AI companies paying the copyright holders for the copies of copyrighted data in their training sets is machines being treated the same as humans. Not paying is machines being treated differently.

    2. JamesMcP

      Re: Never heard of them...

      If our El Reg authors read articles at another site and then reproduce material portions of them on this site, that's plagiarism and a copyright violation. So... same standard as humans.

      But an LLM isn't a person. So what if it's an object?

      If you steal a book from a shop, that's theft. If you download a digital copy of the book without a license, that's also theft. There are a host of different laws around "not profiting from theft", which means if an LLM is based on stolen materials, the company shouldn't profit from it.

      In the weirder realm, there are some very old (and very tortured and twisted) laws around items that are part of a crime becoming themselves criminal ("deodand"), which are the basis for asset forfeiture. If you train an AI using "stolen" data and then the AI itself performs something criminal (e.g. copyright violation), the AI could itself be declared criminal and subject to asset forfeiture.

      To be honest, this may be the first time the "deodand" concept actually seems like something other than belief in witchcraft.
