nope
Reset copyright back to 14 years, renewable once, retroactively. Then we can talk.
A slew of copyright lawsuits were filed against the makers of text and image-generating AI systems last year. Now in 2024 and beyond, we're going to see how those play out, and what ramifications and settlements they bring. The New York Times opened fire on OpenAI and its champion Microsoft just last week. The newspaper was …
These lawsuits are NOT about human / machine labor. At least not in the specifically mentioned cases. Most of the authors suing AI companies would likely be more than happy to allow their works to be used in AI models - as long as they get compensated for it, similarly to how they get royalties when their works are used or built upon in the traditional way. After all, it's only fair that if someone is making money by building on your works, you also get to make some money off that.
The real problem for artists and authors is not that AI could produce works just as good as they can (because AI can't, and with the help of AI they could themselves produce even better works, at least for now), but that AI companies simply took everyone's copyrighted works without permission and built something on top of them that now displaces the original works and authors. And not only that, but they will keep doing this in the future, if they're not stopped and not forced to license works and pay the authors.
It's as if some company employed millions of workers to build a huge building, and then refused to pay said workforce, arguing that the completed work does not resemble any single brick that any of the workers put into it. Obviously that's not how things work, should work, or could work in the long term. Not in a logical sense, and not in an economic one.
The good news is that most likely AI companies will be forced to actually start licensing works from authors anyway, not only for legal reasons, but also for purely technical ones. That's because it will be the only way they can ensure in the future that they're actually using human-generated, high-quality content instead of feeding AI-generated, low-quality content back into their models, which would lead to the degradation and implosion of those models in the long term.
(Now obviously the fact that they license content won't technically stop anyone from giving them AI-generated content, but once authors actually enter a contract and are paid for their works, they will also be held liable for them - and that's why AI companies will pay them, even if only pennies.)
That sounds like humans versus machines to me.
"AI companies simply took everyone's copyrighted works without permission and built something on top of that, that now displaces the original works and authors"
Again, labor versus machines IMHO.
I believe you're over-thinking this. Yes, the cases are about copyright allegations. The NYT is rather specific. But a lot of the cases have an undercurrent of something along the lines of: it's not fair that these widely used models learned how to imitate us and are now pushing us out of the market.
This isn't just purely over copyright, but copyright is how the plaintiffs hope to solve it. That's at least my impression of it all.
C.
"Again, labor versus machines. "
No, because it would be just as wrong if it was humans who did the same thing that computers do. The point is that there's no real effort involved on the part of the AI companies to create the results, or it's negligible compared to what went into creating the original works, whose authors are not compensated in any way, while it's the AI companies that reap all the revenue.
The situation is very similar to the one search engines and social media sites were, and still are, in, which is also utterly unjust, because all they do is take everybody else's content, in most cases without even explicit permission to do so, slap ads next to it, and call this a product, without compensating the original authors in any way, despite them being the ones who supply the actual value in the service.
It's evident when you ask yourself: what would still have value without the other? Would search engines still have any value if they had no articles to show? Obviously not. Would articles still have value if there were no search engines? Definitely. Similarly: could AI companies provide a useful service to the general public if they couldn't take everyone else's content to train their models? Definitely not. Would everybody else's artwork, books, etc. still be useful to people if AI companies didn't exist? Definitely.
Again, the problem with AI is not that AI "imitates" other authors. It wouldn't even be a problem if it just plain copied them. Most artists and authors don't mind others using their songs, photos, etc. in derivative creative works, even verbatim. Actually, most professional artists create their artworks with the explicit intention and assumption that somebody else, or a lot of other parties, will reproduce those works. But they only accept, or even want, that when it happens with their permission and when they get royalties for every reproduction or use. The real problem is that the AI companies 1. took everyone's works without permission, 2. don't even want to pay royalties on the use of said works, not even after the fact, and not even pennies.
The "argument" the AI companies have that they're not using a single authors' works, and that the works their systems produce incorporate all influences from all authors in the world, and therefore they don't copy and don't even resemlbe in most cases the works of a single one of those authors or a specific work of them. But if anything, that should be an argument against the AI companies, because it means they violate the rights of not only a single author, but the rights of countless or all authors in the world, every single time their systems produce something.
Again, it's irrelevant whether the AI companies are producing what they do through algorithms or humans - and that's why it's not a humans vs robots issue. It would be just as wrong and unjust if they produced the same results using humans, for example by humans clipping together pieces of text from others' works. The problem is not that, but the lack of permission and compensation.
And you don't even realize it. You're talking about AI companies being able to rip off people at a fantastic scale. And they're able to do that with: machines.
If this was about some office somewhere with 500 people churning out counterfeit work, that's people v people. This latest copyright stuff is people versus machines, in my view.
Or I guess, if you like: people versus the makers of machines.
As I've said, don't over-think our analysis. We're just pointing out that these aren't just a set of copyright infringement claims. There's an underlying concern among artists over the ability of machines to flood the market with knock-offs, with no one getting a penny from it or able to opt out, and that's coming through in these court cases, in our view.
If you're an artist, and there's like 5 or 6 people copying you, that's one thing. Now imagine a million people able to copy you, with the help of AI. That's the machine element.
C.
I'm afraid that's exactly the point: huge amounts of insanely cheap machine work, overwhelming traditional human labour overnight. The vast majority of human-written output is not the next Cormac McCarthy; it's Ronaldo: My Life, Spare, Mills and Boon. Incredibly derivative, it really is just the Mechanical Turk version of GPT-4. Both in output quality, and *process*. If you are ghost-writing a footballer autobio, you know exactly what is expected of you, and the publisher would throw the manuscript back at you if it didn't conform. There's an implicit format, style guide, vocabulary list, sentence structure, everything. And now we're just talking about quality vs price (=model size). Spare (early ChatGPT), Tom Clancy airport novels (GPT-4), JK Rowling (GPT-5).
No, because it would be just as wrong if it was humans who did the same thing that computers do.
But humans do that. All the time. And have been doing it for... at least a couple of centuries, now, ever since "free press" became a thing.
Long before people even talked about "AI", let alone "LLMs", churnalists would read each other's work, make just enough modifications to file off the serial numbers and regurgitate it. And that was legal, and always has been. And it still is. Reputable newspapers at least have the decency to credit their sources, but believe me, not everyone does.
Why, exactly, is it worse when a machine does this?
"Again, labor versus machines IMHO."
Not really. If I download an audiobook from a non-authorized source for free, I've infringed on the publisher's copyright. The author has likely already been screwed out of any claim by the time the audiobook is produced. If I post an audiobook from another publisher without permission, I'm also infringing. When I download the audiobook from an authorized vendor, what I have bought is a license for personal enjoyment. What's going on now is companies getting content, legally or otherwise, and using it to train their AI systems, which was not an authorized use of the material.
A license to use copyrighted material comes with all sorts of variations. If you buy a song/album from a music store, the fine print will say that for the purchase price you have a license to use the content for personal enjoyment. You cannot use the music on your YouTube videos, social media, a TV show or in advertising. If you buy a license to use the music commercially, there can be further restrictions with regards to where it will be used, how long and how much of the song will be used. There can be restrictions or allowances for editing the song.
Companies training their AI systems need to have specific permission from the copyright holders of the media they wish to use.
If I download an audiobook from a non-authorised source for free, what's to say that I was ever going to buy that audiobook? What's to say that I would never have downloaded that audiobook unless I could find it for free?
"Companies training their AI systems need to have specific permission from the copyright holders of the media they wish to use"
I disagree. I can go into an Art Gallery in my town, for free, and sit in front of any painting I like and copy it, legally. That is not and has never been illegal. I could take a photograph of it, if the gallery permits (which they usually do).
If I pass off my copy as the original...that's illegal...and that is the spirit of copyright...the whole point of copyright is to prevent other people passing off copies as the original work. It's not to stop people copying full stop.
If I buy a piece of art...I'm buying it because I like the art or like the artist or want to support the artist...but not necessarily all three. I may not give a flying fuck whether or not the original artist's hands were involved in producing the artwork I'm buying...in which case, I'll buy a cheap printed poster of the Mona Lisa in the Louvre gift shop...or hell, I might not even be arsed to go to Paris...I might just spend £7.99 on Amazon.
https://www.amazon.co.uk/LEONARDO-1503-1517-250gsm-Art-Reproduction/dp/B008LF64XS/
I could, if I wanted, save the £7.99 and get the image from here:
https://m.media-amazon.com/images/I/61c9OQ0Y5bL._AC_SL1001_.jpg
Just print it out myself. I could print a stack and sell them for 20p at a car boot sale if I wanted. Genuine Mona Lisa knock offs. 20p. I could use the crap photograph I mentioned earlier.
As long as I don't pass those laser prints off as the original, I'm well within the law...because it's blatantly obvious that they are not the original.
Gary: Ere, Dave, get a load of this...bosh...Moanin' Lisa mate. Original...bought it off a guy called "AC".
Dave: It's Mona Lisa Gary. Nah, thats bollocks mate. That's a copy. Frame was made in China mate. Fackin' plastic.
Gary: You callin' me fackin stoo'id?
Dave: Did he tell you it was real?
Gary: Wotchu mean? It's real mate. Bloke at the boot sale said he went to Paris on 'oliday. He 'ad em all there. Da Vinci, Michelangelo, Donatello...whole fackin Ninja Turtles shootin' match mate.
Dave: It's a print aaaht of a phota mate. It's nice and all that, captures the original beauty and ambiance and that o' der paintin...but caaaahm off it Gary, I seriously daaaht dat Da Vinci 'ad a laser printer mate, much less a ream o' fackin' A4. It's fackin' bollocks mate. Look, there's even some cants standing araaahnd it wearing fackin Paris San Jer-man shirts.
Gary: Fack me Dave, you're roight. Fackin' 20p daahna drain...missus will be roight facked off.
Danny: Awight geezers? Oi oi! Pint of cookin' lager luv.
Gary: Oi oi! Danny, you fackin cockney cant...you want a Moanin' Lisa for 50p?
Other than Gary being a fucking idiot, where is the crime here?
> After all, it's only fair that if someone is making money by building on your works, you also get to make some money off that.
Really?
So a library buys a textbook on, say, aerospace engineering for $40, of which, after publishers, copyright middlemen, etc. get their due, a few coins make it to the author's hands.
The library then, over a year or two, lends this book to 200 students, who use it to prepare for tests, which raises their chances of finishing their studies and landing a high-paying job as an aerospace engineer. Aka the students "make money by building on the author's work". Note that the number of students doesn't matter here...the library might lend it to 20 students, or 200, or 2000. It's still the same book (although granted, with 2000 lendings over just 2 years, a single copy might not cut it).
So, should all these students be required to pay the author? A portion of their income in later life perhaps? Or is there maybe a tiny flaw in that logic?
"So, should all these students be required to pay the author? A portion of their income in later life perhaps? Or is there maybe a tiny flaw in that logic?"
There are licenses for 'library' copies of books. Way back when there were stores where you could rent a movie on physical media, those stores paid more for a copy of each movie; that price conferred a license that included renting the movie to others and the ability to sell that physical copy on at some point with no more than a license for personal enjoyment. That is, the license to rent the movie didn't remain a part of the physical media on subsequent sales.
"The good news is that most likely AI companies will be forced to actually start licencing"
No they won't. Because I can't see a way, currently, for that to be possible...nor is it feasible.
It's more likely that artwork starts to go behind paywalls which will cause a lot of artists to become irrelevant and will stop many artists being known in the first place.
In the world of art there are two things that add value to a piece. Firstly, it has to be a compelling piece, with some meaning, some allure, some kind of draw to it. Secondly, it has to have been created by the artist themselves. A Banksy copy is worth fuck all beyond the materials used to create the copy, the time it took someone to make it, and maybe a small mark-up. It's not a Banksy. An original, on the other hand, is very valuable.
Currently, it is not illegal to copy another artist's work. It is only illegal to sell it if you pass it off as an original.
If I painstakingly recreate the Mona Lisa, in all of its detail, I can sell it as an original work, but I can't sell it as though it is the true original.
A lot of former art forgers have started careers producing extremely accurate copies of things, they exhibit them and they sell them...which is perfectly legal, as long as they don't try and pass off the "copies" as the originals.
https://www.davidhentyartforger.co.uk/product-page/after-leonardo-da-vinci-salvator-mundi
Does this guy now owe millions of pounds to the current owner of the original artwork? Fuck no.
As the forger that produced that work above puts it...
'The art world, with its professionals, critics and “experts” alike have for decades, torn apart the ‘genre’ of copy art. Whether labelled a fake a reproduction or a forgery, the art form has been around since the renaissance and there is an uncountable volume of originals that have been discovered ‘copies’ or “fakes”, and visa versa.'
Copying art has been going on for centuries. AI is nowhere near as accurate as some of these forgers out there, not even close.
Even if AI could produce an exact 1:1 copy of someone else's work, it still wouldn't be theft.
AI is just the printing press of our time...and just as the intellectuals back then hated the idea of the masses being able to access knowledge, learn and produce new works based on it, the intellectuals today hate the idea of the masses being able to access Art, learn from it and produce new works based on it.
The enemy here isn't AI...the enemy here is the artists.
Go and speak to your local artists. If you can find me one that was never inspired by a previous artist, that didn't learn their craft by analysing and copying known techniques (pioneered by previous artists) and then expanding on them...I'll give you £50.
Within the scope of what is possible within art using current techniques (outside of AI) there isn't a whole heck of a lot of originality...it's why the art world is full of lunatics making pieces by dragging their balls, covered in chocolate and glitter, across a canvas to make a statement...because a lot of the art world is well trodden and derivative.
I disagree that this is a man vs machine issue. When bookkeeping was done by rows of clerks at desks adding up figures, they could be and were replaced by computers that were far faster and more accurate at doing that work. It may be tempting to compare that example, by noting how much quicker a computer is at generating a page of text than a human journalist, but the journalist also had to find the story and research it, before forming what they found into a finished article. The AI simply copied that and regurgitated it. Without the published work of the journalist, you could run the AI for a million years and it would never come up with the story itself.
Don't get me wrong. Yes, the cases are primarily about copyright. Yet there's an underlying concern for the future of creative work over large-scale AI imitating people without recompense or the ability to opt out of being pulled into the training process.
Don't over-think this analysis. We're just saying, in our opinion, this isn't a straightforward, open-and-shut (c) claim. It's people upset that they're being, or about to be, displaced at a large scale, and they're using copyright to tackle it.
C.
It's not even about copyright though. Currently, machines are not conscious, but they can learn and once they've learned something, they can be very good at it very quickly. Kids are conscious and can also learn, but it takes them a lot longer to become very good at something.
I can give a book to my kids that I've already purchased, share it with them, and they can learn from it. They could even use the knowledge from that book to become very good at whatever topic that book covers. I paid for the book, they didn't...so they have effectively circumvented copyright to benefit from that work. Similarly, I could give them access to the internet where they can view publicly available data at will and learn whatever they want for free. Eventually they can take all of this free knowledge and turn it into a profession...they can go and get a job where someone will pay them for the expertise that they have acquired for free. They are now being paid to provide services based on expertise acquired without the original publishers of the information receiving a penny. Nobody would argue that copyright has been infringed here, and nobody would deny a child access to information if they asked for it in the pursuit of knowledge...an entire knowledge base in a person's skull built out of information that was found for free or gleaned from books that someone else paid for (but they didn't). A model, built up over 18 years...if you will.
Now if we substitute "kids" for "machines"...the only difference is that machines can do the same thing but much, much quicker...it's the same process, only much, much quicker...machines can multiply much quicker as well. I can only produce as many kids as my wife is willing to have and I have to wait 9 months before I can start reading to it, another few years before I can give it books to read, about 10 years before I let it on the internet etc etc. Once you've trained a machine, it doesn't take a whole heck of a long time to create another...you can just copy it.
The fundamental change that AI will bring about is in the length of time it takes for something to become "common knowledge" that anyone can leverage...and I think this is why artists etc feel threatened. The length of time an original piece of work remains relevant is going to become very short indeed.
For artists, they could spend months producing a fantastic piece of art which can then be fed into an AI, and the same style can be copied, adjusted, iterated on etc infinitely, potentially leading to even better results than the original...but that is not copying, it is just insanely fast learning, iterating and execution...what would take kids a decade or so to achieve can be achieved by a machine in an afternoon.
The battle here is not copyright...because training an AI model isn't that far removed from teaching a child...it's just much, much faster with far fewer constraints.
Copyright is not the framework that artists, creators etc should be gathering around...as time goes on and AI gets better and better...copyright law is going to look like a quaint relic of yesteryear and will become ever more abstract and meaningless.
What happens when AI has the ability to walk about and look at stuff? It's feasible that in the future you could walk around an art gallery with your AI, then when you get home, sit it in front of an easel with some oil paints and tell it to create exactly what it saw in the art gallery. Is that copyright theft? What if you had to buy yourself and the AI a ticket to enter?
I think the real argument here is whether it will be worth creating anything in the future. From a financial point of view. Because we're heading towards being able to take an AI, anywhere to see something, then taking it home and it being able to produce an accurate copy of it for next to nothing. The only value in a piece of work will be in owning the original...whatever that means in the future, if it even matters.
Artists and creators need to find new ways to monetise. Monetising copies of work is going to be impractical in the future. Either they're going to have to pump out original work at an untold pace or they need to find a way to monetise skill rather than the result of the skill...I think in the future it's going to be far more important (and profitable) to be able to define your style rather than execute it.
The problem is that some papers have published stories based on other papers' original stories for as long as journalism has been around, and as long as they have rewritten the story so it's not a word-for-word copy and referenced the original where needed, it's not breaking copyright.
So why when it's an AI doing the same is it now breaking copyright?
Plausible deniability: The journalists doing the copying take a risk of discovery and proof they plagiarised the original work. If that is proven, they get fined.
AI is certainly plagiarising the original works, even when it mashes together several different works. That's what these cases are claiming. Yes, there's a claim that the AI companies haven't paid to use the work in this fashion, but the argument could be made that if a person bought a book, they could be inspired to write something similar - but not the same. That doesn't stop the Lawyers being sent in, particularly if the new works are more popular than the 'original'. Sometimes these cases do prove the derivative work is far too similar, other times it's not proven. Sometimes the legal threats are enough to shut things down, and sometimes they fail spectacularly to do more than embarrass the claimant. That could happen here, depending on how strong a case the claimants have.
The man v machine element is also a component of people's worries: will AI put them out of business if it can churn out 100 competing articles in the time a journalist produces one? That's a fear, and some day it will happen: computers will replace people in certain jobs. But that just shifts the human element into fact-checking what the computer produces - and those articles are likely to need the same number of people to fact-check as were originally needed to produce the articles in the first place. Obviously, companies will cut costs at that point and reduce the depth and quality of checking, making the output reliant on the AI's quality and accuracy. Again, that'll come in time.
But for now, AI lacks the refinement needed to really compete with humanity, save when it comes to stupidity.
Governments are going to take whichever path gets them the most money.
Are we getting / can we get more tax out of copyright holders, or out of big tech companies feeding off their assets?
Do we get more tax out of copyright holders holding onto their copyrights for an extra decade, or do we get more from letting stuff into the public domain?
Who is giving us the most money through lobbyists, Copyright holders or Big Tech companies creating AI Model Content Generating Farms?
Do we get more work out of a populace that needs to pay for every consumption of every piece of content under copyright?
Do we get more votes out of people that can access works in the public domain?
Governments aren't there to do good for everyone, just good for themselves.
Industry seems to express continued concern over China copying western tech and other goods, at massive scale, and then flooding the market with cheap knock-offs via (e.g.) Walmart and Amazon. In today's Kettle, it is artists (and news outlets, ...) instead, whose creations are ripped off at massive scale, and instead of China doing the deed, it is some big tech firm's AI (or some startup's AI). The copying is clearly evidenced by images shown here, and by the text of the NYT court case, starting on page 30 (as linked also from ElReg).
Maybe one can expect that what happens now with the liberal arts will slowly progress through to patents for etching chips at 2nm and below (for example), as "re-interpreted" by AI, with courts declaring "well, since it comes out of AI, this is obviously fair use (or what not)". The slope looks rather slippery to me. AI is the new China, complete with the giant sucking sound of labor outsourcing, this time by "much enhanced (ahem)" machinery.
In this week's Kettle, I do vote for everyone, because the points made were quite complementary (IMHO).
The irony being that many news outlets will use a news wire source like Reuters for a lot of their content, often copying and pasting the articles automatically or quoting large swathes.
Thus scraped content from the site *will* seem similar if it's been taken from another source using the same news wire.
For me the crux of the issue is that, at the moment, AIs are being fed off gigantic volumes of human-produced original content, with all the related creativity. For some time now, more and more content has been AI-produced, and the logical conclusion is that successive AI models are going to be trained on data with a higher and higher percentage of AI-generated material. This means the creative input will essentially trend to zero, as the only 'original' content an AI can produce is hallucinatory gibberish. That cycle will be greatly accelerated if AI companies are allowed to scrape humans' original content without compensation - while there will always be people producing content for their own creativity's sake, many more will be hindered if there is no monetary incentive for this. AI companies wanting to suck all the data in right now without paying any royalties are killing the geese that are laying their golden eggs. But I guess that's modern capitalism for you, who gives a **** about the long-term as long as the C-level suits get their golden eggs right now!!!
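Here's a rough toy sketch of what I mean (purely illustrative, nothing like a real training pipeline): each "generation" of a trivial model is fitted only to content produced by the previous generation, and the diversity of the corpus tends to wither away.

```python
# Toy illustration (hypothetical) of recursive training on model output.
# Each generation fits a trivial "model" (just a Gaussian) to its corpus,
# then the next corpus consists solely of that model's output.
import random
import statistics

def fit(corpus):
    """'Train': estimate the mean and spread of the corpus."""
    return statistics.mean(corpus), statistics.pstdev(corpus)

def generate(model, n):
    """'Publish' n pieces of content by sampling from the fitted model."""
    mu, sigma = model
    return [random.gauss(mu, sigma) for _ in range(n)]

random.seed(42)
corpus = [random.gauss(0.0, 1.0) for _ in range(20)]  # original, human-made, diverse corpus

for generation in range(1, 51):
    model = fit(corpus)
    corpus = generate(model, 20)  # the next corpus is purely model output
    if generation % 10 == 0:
        print(f"generation {generation:2d}: corpus spread = {model[1]:.3f}")
# The printed spread usually shrinks generation after generation:
# the 'creative input' trending to zero, as argued above.
```

A single Gaussian is obviously nothing like an LLM, but the narrowing dynamic is the point: feed a model mostly its own output and the variety it can reproduce keeps shrinking.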
Another issue which is going to be plaguing AI is quality of input data. In any domain, there are works of absolute genius, many, many examples of high competence and skill, and large swathes of dross. If the AI is hoovering up all data indiscriminately (as it has to, because the machine has no aesthetic sense except what might be curated by a human, and there is far too much data for humans to curate), then even if somehow the AI is only or mostly training on human-generated data, its output will be average at best. That at least gives me hope that the best human writers, journalists, composers, artists, musicians etc can never be usurped by an AI (although the mediocre or average ones might easily be).
The Luddites stood for quality goods produced by skilled workers paid a fair wage. We have the same issue here. The exploited workers and child labor in the dangerous mills of the industrial revolution are the skilled creative people whose work is exploited and then lost in a sea of AI dross that will only get worse as LLMs learn from their own excrement. (Why do you think the cut-off of GPT-3.5's training data was before ChatGPT went live?)
"Blood in the Machine: The Origins of the Rebellion Against Big Tech" by Brian Merchant nimbly draws the parallel between today's digital imperialists and the mill owners of the industrial revolution who the Luddites fought against.
Don't believe the common perception of Luddites. Their history was written by the capitalists that prevailed. Wrecking machinery was collective bargaining by riot. Read "Writings of the Luddites" edited by Kevin Binfield, some very articulate and funny stuff that holds true today.
AI frees up people from boring mundane work so that they can do what? Be unemployed? AI will affect skilled workers including AI developers as much as unskilled workers.
If you're going to vote this post down, perhaps you might like to read it again in a year and see what you think then.
"The Luddites stood for quality goods produced by skilled workers paid a fair wage."
But I want just-good-enough goods, and to use the surplus for more of *my* benefit. How *DARE* Luddites force me to consume what they dictate, at their price. My resources are my resources, to use to consume what ****I**** want, not what they would force on me. Hand-crafted artisan loaf produced by skilled workers at five quid? **** off. Machine-produced loaf at £1, thank you very much; I'll use the remaining four quid on more of what *I* want.
Well how's this for a compromise then.
A Luddite market, but you're free to assemble lesser quality solutions for your private use.
That way you are still free to exercise your choice to pursue lesser-quality solutions in your life, but the rest of the world is insulated from your influence, so as to avoid any unfairness unto others stemming from your preference.
If you're going to depend on the collective efforts of others then your opinion and desires are only worth whatever fraction you represent to the whole.
If you can insulate your personal desires from dependence on others supplementing you then it becomes appropriate for others to leave you to it.
Santa Claus already is the biggest criminal in the world.
He's a master of trespass, sneaking in by way of the chimney.
Everything he brings avoids import tax so he's also an international smuggler.
Anything he brings with the brand name on it is copyright infringement and counterfeit as the elves reproduce it without any regard for intellectual property.
Santa Claus psychologically grooms children into materialism, which we all know is a state of artificial psychological deficit that is easily exploited throughout the rest of life by apathetic money grubbers and social con artists.
But the founders of the USA were also criminals. Guilty of high treason, and yet they are held in high regard, at least by many in the USA.
Social example in the USA shows that being a criminal doesn't matter if people like what you do.
The UK as well.
For example:
Robin Hood is a fictional story of a famous criminal who was loved by many and is still loved today.
"The real beef concerns our future to create and be rewarded for it"
The choice of words for expressing the concern quoted above is telling about the stance of rentiers of so-called 'intellectual property'.
Reference to 'reward' is as disingenuous as when financialised-market-capitalism CEOs of banks claim their derisory efforts to earn an honest living as worthy of 'compensation'.
Earnings from (supposedly) creative activity are one thing: reward quite another.
The present system of legally sanctioned monopoly 'rights' permits vendors of ideas/products to define creativity, and to set the price for accessing the results of allegedly creative work. An analogy to bankers placing themselves on pedestals, persuading shareholders of bankers' unique talents, and thereby acquiring an obscene amount of 'compensation' is strong.
Regarding putative creativity, setting the monetary reward due lies collectively with recipients (directly or by copying) of digital material; they individually may seek to donate money in gratitude.
More important for sustaining a genuine market is how appreciation of a work offers motivation for donation (e.g. by crowdfunding) to support perceived talent in making further works. A digitally rendered work has zero monetary worth, regardless of the effort put into its making. Its cultural worth, or potential for practical application, may be tremendous. However, the only thing capable of being monetised is the talent for making further high-quality works, as judged by recipients of an earlier work: a competitive market in skills rather than in abstract products vended as if they possessed the same properties as physical artefacts.
An intangible product can raise income through knock-on sales of added-value, more tangible items. Examples follow.
Admired authors can engage in lecture tours, for which they are paid. An author/composer of reputation can take on teaching.
Musicians can earn money through direct performance in an auditorium. Distributed studio recordings serve as enticement to attend a live show.
Football clubs, as once before, can draw income from audiences in stadia. Souvenir items, and endorsed kit, can be sold. If clubs in a league want to broadcast streams to people outside their stadia, that is easily organised. Despite the stream having no intrinsic worth, and ephemeral cultural value, people would be willing to pay a modest subscription which guaranteed reliable access, and maybe offered tangibles such as discounted physical goods and services.
Academics in their main work rely upon accruing reputation to enable advance in employment by institutions and commerce. Writers of textbooks to be shared online (as inevitably happens anyway) can earn from donations, from giving access to online student support services, and to other tailored bonus material.
These examples, and many more which can be given, show how a change in attitude from "I have created this, take it on trust, and pay me for the privilege of accessing it" to "I have created this, I hope you will like it, please support me in my further efforts", does not stifle innovation. It does alter the flow of money streams. It cuts out unnecessary middlemen who trade in 'rights', and it leads to greater interaction between creators and their admirers.
I have a big neural network in my head that has absorbed all the books that I have read. If that neural network produces a book, then one can be certain that book will contain influences from all the books I have read. Does that mean that authors should pay royalties to all the authors of every book they have ever read?
But *do* you have a big neural net in your head? At least, one using exactly the algorithms of the LLMs currently in vogue. Since you are capable of much more generalised problem solving, I think the evidence is that you don't.
Everyone seems to be taking it as read that the only difference between ChatGPT and a human author is that one eats leccy and the other eats pizza. The legal issue "therefore" is just whether these machines should have the same rights as we do. I think that's getting ahead of ourselves to quite an astonishing degree.
(Also, if the articles you wrote occasionally produced a load of verbatim quoted material, I think you *would* have to pay royalties for those.)
I have produced copyrighted work which I have published. I have not granted license for LLM training. It did not exist at the time. LLM is a new use case.
I have also not granted license for wholesale photocopying. It did exist at the time. But legal standards had been set that permission has to be granted for wholesale photocopying of a copyrighted work.
I think the similarities of the technologies I compared should be considered in determining whether LLM training should be considered fair use. I do not think LLM training is fair use.
I do not care how "infeasible" it may be for an AI outfit to seek out permissions and provide compensation if the copyright holders demand it, for the vast volumes of training content they require. That is their problem, not the copyright holders.
"I have produced copyrighted work which I have published"
If you live anywhere that is a signatory to the Berne Convention on copyright, you have a copyright in any original work from the moment it's set in a fixed medium (you've recorded the song, created a musical score, etc). You can't get a copyright for "It was a dark and stormy night....", but you can for a story that starts that way (but not for the story line).
Copyright is intended to incentivize human effort. People (or the companies that represent them) get a temporary monopoly on their work so that they are fairly compensated which enables them to contribute more effort to create more new work.
Work by animals can't be copyrighted. They aren't humans. AI is no different than a monkey or a dog in this regard, i.e. work produced by AI shouldn't have copyright protection.
Since this isn't completely settled yet, we aren't ready to have a discussion to decide about using copyrighted material for training AI, compensation, licensing, etc.
Copyright WAS intended to incentivize human effort.
That intent has shifted.
Copyright's current intent is to provide for unfair compensation, so that more effort need not be applied to new invention, by holding everyone else's potential hostage to yesteryear's already fairly compensated efforts.
An officially declared intent can be, and in this dynamic has been, proven false by cumulative experience across the last two decades that runs counter to that declaration.
It is not uncommon for official wording to be purposely deceptive regarding intent to indirectly gain an advantage in another related direction.
That's the nature of politics.