So Arasaka should have been called Amazon all along. This is Relic 1.0.
Amazon can't channel the dead, but its deepfake voices take a close second
In the latest episode of Black Mirror, a vast megacorp sells AI software that learns to mimic the voice of a deceased woman whose husband sits weeping over a smart speaker, listening to her dulcet tones. Only joking – it's Amazon, and this is real life. The experimental feature of the company's virtual assistant, Alexa, was …
COMMENTS
-
Thursday 23rd June 2022 14:18 GMT Mike 137
Nothing new under the Sun
"AI software that learns to mimic the voice of a deceased woman whose husband sits weeping over a smart speaker, listening to her dulcet tones."
123 years ago Arthur Conan Doyle wrote The Japanned Box (Strand Magazine, January 1899) in which a widower regularly plays a phonograph recording of his deceased wife's voice. The difference, though, is that it was her real voice.
-
Saturday 25th June 2022 17:27 GMT doublelayer
Re: Nothing new under the Sun
Wrong in this case. Most speech systems do use recordings of human speech, but usually it's hours of painstakingly recorded samples, thoroughly dissected by a lot of manually written analysis software and then reassembled with more rule-based algorithms. This automates much of that process and, as the article says, means the person whose voice you want to copy doesn't need to sit in a recording booth reading a prewritten script for perfect recording quality: the software can work from a shorter recording that was never made for the purpose. It's a different method of achieving the same goal, and it does make a difference to the resulting quality and to how easy it is for the user.
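To make the contrast concrete, here's a toy sketch of the old-school concatenative approach described above: pre-recorded units looked up in a database and spliced together by rule. The unit names and sine-tone "recordings" are stand-ins I've invented; a real system splices diphones cut from hours of studio audio:

```python
# Toy concatenative synthesis: look up "recorded" units and splice them
# with short cross-fades. Everything here is a placeholder -- real unit
# databases hold thousands of diphones cut from studio recordings.
import numpy as np

SAMPLE_RATE = 16_000

def fake_unit(freq_hz, dur_s=0.15):
    """Stand-in for a recorded diphone: just a sine tone."""
    t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
    return 0.3 * np.sin(2 * np.pi * freq_hz * t)

UNIT_DB = {"h-e": fake_unit(220), "e-l": fake_unit(260),
           "l-o": fake_unit(300), "o-#": fake_unit(180)}

def concatenate(unit_names, fade_s=0.01):
    """Splice units end to end, cross-fading each seam so the joins are less audible."""
    n_fade = int(SAMPLE_RATE * fade_s)
    out = UNIT_DB[unit_names[0]].copy()
    for name in unit_names[1:]:
        nxt = UNIT_DB[name]
        ramp = np.linspace(0.0, 1.0, n_fade)
        out[-n_fade:] = out[-n_fade:] * (1.0 - ramp) + nxt[:n_fade] * ramp
        out = np.concatenate([out, nxt[n_fade:]])
    return out

waveform = concatenate(["h-e", "e-l", "l-o", "o-#"])
print(f"assembled {waveform.size / SAMPLE_RATE:.2f}s of 'speech' from 4 units")
```

The newer systems replace the unit database and the hand-written splicing rules with a neural model conditioned on a speaker embedding, which is why a minute or so of casual audio is enough.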
-
Thursday 23rd June 2022 14:51 GMT nintendoeats
Compare this to The 6th Day (basically a terrible movie), in which Arnie is encouraged to purchase a clone dog to shield his child from the reality of death. While the film totally failed to explore the idea with any depth, it is a real issue that deserves serious public discussion. As we become more and more able to shield people from the painful things in life, we have to seriously investigate the effect on long-term mental health and cultural values.
I fear a world where the rose seemingly has no thorns.
-
Thursday 23rd June 2022 16:00 GMT fidodogbreath
There's so much potential for manipulation here, and not just the obvious political shit.
Running the "Grandma scam" (calling senior citizens and pretending to be their grandchild who's in trouble and needs money) using the person's actual voice.
Luring a child into a car by playing back a parent's voice on a fake speakerphone call.
Cops using a suspect's voice to place a fake 911 call to create a pretext for an illegal search.
Blackmail.
Manipulating people with cognitive issues into giving up banking info.
Harassing and bullying people by using their loved ones' voices.
All of the above will happen. This timeline sucks.
-
Saturday 25th June 2022 11:21 GMT Anonymous Coward
"Alexa, can Grandma finish reading me The Wizard of Oz?" at which point
you pat yourself for havng FINALLY streamlined Grandma out of your home budget, originally outsourced to a 'Golden Sunset Prospect', aka 'quick retirement' home, and then got an Xmas special, so now Granny-box sits happily by Alexa.
-
Saturday 25th June 2022 15:16 GMT martinusher
Telemarketing deep fakes never work
A quality Indian call center / scam operation talks colloquial English (or American) at its targets. They work at it, and they're really good, but ultimately they lack the immediacy of context, and that's what reveals their true identity. (This, I believe, is one version of a CAPTCHA.)
Grandma's voice may be comforting, even something worth treasuring, but to be really useful the voice needs to be attached to an ersatz consciousness that can interact with and adapt to contemporary life.
-
Saturday 25th June 2022 23:59 GMT Dr. Ellen
Re: Telemarking deep fakes never work
I could use an accent translator that would let me understand the assorted flavors of voice at help centers. If they could be turned into American Midwestern, they'd be one hell of a lot more useful. But as many have warned: fake voices could be dangerous in many ways. The useful is outweighed by the hazardous.
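For what it's worth, the benign half of that wish can already be lashed together in crude form: transcribe the incoming audio, then re-speak it with a local synthetic voice. A minimal offline sketch, assuming the Python SpeechRecognition and pyttsx3 packages and a recorded clip named call.wav (both names illustrative; real-time streaming and preserved prosody are the hard parts this skips):

```python
# Crude "accent translator": transcribe a recorded clip, then re-speak it
# with a local TTS voice. Assumes: pip install SpeechRecognition pyttsx3
# and a WAV file named call.wav -- both names are illustrative.
import speech_recognition as sr
import pyttsx3

recognizer = sr.Recognizer()
with sr.AudioFile("call.wav") as source:
    audio = recognizer.record(source)  # read the whole clip into memory

# Google's free web recogniser; swap in recognize_sphinx for fully offline use
text = recognizer.recognize_google(audio)
print("heard:", text)

engine = pyttsx3.init()   # uses whatever voices the OS has installed
engine.say(text)          # re-speak the transcript in a local voice
engine.runAndWait()
```

It throws away everything except the words, of course, and prosody and timing are precisely what the fake-voice half would need in order to be dangerous.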
-