It sounds...
...a little bit like her. In a movie where she was pretending to be an AI.
Movie megastar Scarlett Johansson has complained that one of the voices OpenAI's ChatGPT uses in voice interactions sounds eerily like her own – despite having rebuffed an approach to record a bot for the AI hype leader. This odd tale starts in 2013, with the release of a film named "Her" in which Joaquin Phoenix's character …
If he starts taking part in commercial ventures or other projects, where he is deliberately passing himself off as Paul Hogan, maybe not saying he is Paul Hogan, but copying his mannerisms, catchphrases, and modes of speech, such that people might reasonably assume it is Paul Hogan.
Then absolutely Paul Hogan should go after him.
That's what this comes down to: having a voice so similar to that of a famous actress, one that can be so easily manipulated to trick people into thinking it is her, would allow people to create deepfakes, fake ads, etc. that could do massive damage to her reputation and her ability to work in the future.
Hell, if a recording came out tomorrow of her declaring she hated all black people and hoped Trump would kick them out of the country... Even if she denied ever making the statement, how many people would believe it was true? How many people would boycott her films? How many jobs would she lose from producers not wanting to be involved with a "controversial" star? Even if it was completely faked, even if it was proven to be completely faked, some people would still believe it.
This sort of AI voice shit is super dangerous, and the sooner it gets massively regulated, the better.
"Even if it was completely faked, even if it was proven to be completely faked, some people would still believe it."
Even if it was nothing like her, some people would still believe it.
Lookalikes and impressionists out of a job too from the sound of it, along with all parody accounts. Fairly sure these are currently protected under Fair Use legislation in the US.
Regulation works when what you are regulating can be controlled.
With the ability to take openly available LLMs and run them on your own machine... we have now let the AI out of the Secure Data Centre.
Similar to attempting to solve the issue of Knife crime, whilst the whole population has a sharp knife at home. Bloody difficult.
Of course, regulating AI and not using it for warfare or other nefarious purposes is silly, if we take into consideration how rogue states (North Korea, Iran, China, Russia) absolutely ignore regulations and international law.
Welcome to the new age of AI.
This post has been deleted by its author
That's probably going to be OpenAI's argument. They'd have liked to have had SJ's permission, but they went and searched out a doppelganger and licensed that.
And that is the issue with the whole situation. AI steals and breaks so much. It's going to take decades to stabilise intellectual and personal property law.
From Family Guy - The Griffin Winter Games
Lois Griffin : I have to say, I'm very proud of Meg.
Peter Griffin : Who knew she was good at something? Yeah, does anybody else in the family have any secret talents we don't know about?
Chris Griffin : I'm the Quahog edging champion.
Peter Griffin : That's great, Chris. Now what's that?
Chris Griffin : It's the practice of erotic sexual denial.
Peter Griffin : [disapprovingly] Go wait in the car. And no edging!
Ron Howard : But he did continue edging. Bringing himself to the verge of sexual pleasure, only to stop at the last moment. I'm Ron Howard, and I do voice-overs for this show now too.
Kristen Bell : But that was the only voice-over Ron Howard did for the show. He asked for too much money. I'm Kristen Bell, and I do reasonably-priced voice-overs.
Josh Robert Thompson : But her voice-overs were not reasonably priced, so they turned to me. A guy who sounds like Morgan Freeman, but is not, in fact, Morgan Freeman. My name is Josh Robert Thompson, and I perform for scale.
She isn't threatening to sue the actress who sounds like her, she's (possibly) threatening to sue OpenAI for deliberately seeking out someone who sounds like her, after approaching her and being refused, and using the name of her movie to introduce the new AI.
Your customer sounding like Hogan isn't a problem, but if someone hired him specifically for that and introduced an AI using his voice with the tagline "that's not a knife!" implying that it was actually Hogan's voice, then Hogan would have a legitimate gripe with the guy who hired your customer. Especially if he'd previously tried to hire Hogan and had been refused.
This post has been deleted by its author
Horny incels are the target market for a lot of tech advances. Why do you think the biggest share of internet traffic for many years was porn? Does anyone really doubt that truly successful VR will involve a lot of masturbation?
Why do you think Altman referenced "Her"? He knows that if ChatGPT reaches the level where it can replace OnlyFans girls for horny incels - someone willing to talk to them, giving them the praise they think they deserve and pretending to like what they like, etc. - they'll have a gold mine. Combine it with live AI-generated video and they have a made-up girlfriend who will tell them how wonderful they are, do whatever kinky things they want to see, and be at their beck and call 24x7. A lot of 20-year-old girls who couldn't hack college are going to have to find a real job in a few years!
And it isn't just OnlyFans: all the similar sites, all the "softcore" model/influencers on Instagram, the gamer girls on Twitch and so forth. That's a pretty big pot for AI to take over. Say what you want about people wanting real women - I doubt most will. They would rather have a fantasy that's perfect for them than a real girl they have to share with others, who might get a boyfriend and quit streaming, or gain weight (or lose weight, for the ones who like big girls), or have a bad week and ignore her fans. The AI girl will never change unless you want her to change.
And that probably is a good chunk of the reason why we aren't doing anything to stop young men from describing themselves as "involuntarily celibate". Because the advertising industry is collectively too lazy to think of a better message than the hopelessly-outdated "Men: If you buy this product, you will get laid!"
Many, many years ago, our great-to-the-nth-power grandmothers made the decision to allow men to treat them as property in exchange for access to their muscle strength. Things must have been very different in those days, and maybe it didn't sound as raw a deal then as it sounds to modern ears thoroughly used to technology and its side effects.
The invention of the electric motor, in particular, took that old social contract and tore it to shreds.
Machines can do most of the things our ancestors used to need men for; and they don't expect sexual favours in return. If a woman even wants a man around at all, she can set high standards, and does not have to settle for the first one who comes along.
Yet as a society, we are still teaching boys that they don't need to make an effort (and even, indeed, that "being good at things" is for girls, and therefore something they should avoid) -- which is setting them up for a painful impact with the reality that men are no longer indispensable.
We need to do much better.
If they aren't outright claiming it was her, is there really a problem?
A lot of the voices you hear on adverts (for example) are soundalikes (often the same couple of people, who specialise in doing this for many different voices) rather than the famous person you assume they are.
It might not be nice but they made no claims about who it was doing the voice, however it may have been inspired.
From the article it seems the lawyers didn't claim it was illegal, but asked OpenAI to detail the specific process by which the "soundalike" voice was created, presumably to ensure it was created by legal means, and not by illegally using unapproved samples.
OpenAI chose to withdraw the voice, read into that what you will...
"If they aren't outright claiming it was her, is there really a problem?"
rgjnk,
They could try to claim it was just a coincidence - and then detail the process by which they got the voice to demonstrate it.
But the fact that they tried to pay her to use her voice twice, and she refused both times, and then they came out with one similar anyway - well that puts a different gloss on things.
Plus a Tweet from the boss linking it to a film using her voice (that they might possibly have used as training data) - and a film he's publicly admired - well that's just the icing on the cake.
I'm not sure what the law is here, but it's going to make their case much harder to defend in court given that they themselves thought they had to pay her to get what they wanted. That might not affect the verdict in the case, because the law says what it says and this is a whole new area with precedent to set. However, if you lose the case having admitted you knew you shouldn't do this without permission - you tried and failed to get it, but did it anyway - then the penalties will be far worse.
Or they just don't fancy the expense and legal hassle of arguing. Plus having to reveal your methods in court.
- OpenAI really wanted a Johansson voice (because, film)
- Spurned, they went for a Johansson-alike "Sky"
- So... they don't need Johansson, right?
But yeah, they've now realised they're probably going to need her permission, especially once the CEO made the obvious connection with the film and her character.
Plaintiff: Was the voice of Scarlett Johansson from the film "Her" ever included, at all, in the data used to train GPT-4o, at any time in the development of GPT?
OpenAI employee subpoenaed as witness: It's complicated ....
What is the chance that young exuberant employees did play around with her voice, at Sam's suggestion, using it as data, before Sam made his appeal? Remember that Sam's OpenAI "It would be impossible to train today's leading AI models without using copyrighted materials" could be introduced as evidence of motivation to do so.
Compare Sam tweeting "her" to Robin Thicke's Yahoo post: "[d]efinitely inspired by that, yeah. All of his music ... he's one of my idols." IMO, the song sounded nothing like Marvin Gaye, but Thicke had to pay $7.5 million for it.
In the US it is. Bette Midler vs Ford Motor Co. set the precedent, Tom Waits vs Frito-Lay expanded it, and it's been established law ever since. OpenAI just did it to a woman who had the stones to sue Disney and force them into a settlement, who participated fully in the recent WGA/SAG strike where AI and these fakes were a huge issue, and whose statement indicates that she is more interested in setting legal precedent than in any monetary outcome.
Considering bosses already tried and failed to stiff Scarlett for Black Widow earnings through a simultaneous Disney+ and theatrical release, you'd think people would know better than to release an AI voice that sounds like her past work.
I'm also amused that they think withdrawing the voice will make much difference. I rather suspect she'll pursue this all the way.
We are supposed to live in a society where everyone is equal; there is nothing special about Scarlett or Sky. We shouldn't be elevating people on their skin colour or their celebrity, because there is nothing special about being an actor - it's all fake bullshit that pretends they are better, just like it's fake to pretend black is better than white or white is better than black.
This post has been deleted by its author
Oh, if prior permission becomes a requirement, some people absolutely will give it. Everything has a price. I don't know for sure, but I imagine video game producers are already inserting clauses into their contracts for voice actors, for anyone who's desperate enough to get the work.
I'm also not clear how a ban would be enforced, in the shadowy world of knockoff porn.
"Here’s a serious question… If someone has a voice which is very much like Scarlett Johansson’s and then licenses their voice for use in OpenAI… Can she still sue?"
Enter the lawyers. But that well-put question is, I think, the crux of this whole issue. I would guess that if someone has a similar voice and licenses their voice specifically because of that similarity (without her permission), then they would be guilty of exploiting Johansson's likeness, which her lawyers would argue is part of her brand and should be protected.
The counter-argument could be that the actress simply possesses a generic "Hollywood actress" style voice that just happens to sound like Johansson's, because she is also a generic Hollywood actress.
Feels a bit like the recent court case where Ed Sheeran was accused of copying a particular song (without permission). Sheeran claimed that the bit he was accused of copying was in fact a generic and commonly used melody. I think he won.