Amazon can't channel the dead, but its deepfake voices take a close second

In the latest episode of Black Mirror, a vast megacorp sells AI software that learns to mimic the voice of a deceased woman whose husband sits weeping over a smart speaker, listening to her dulcet tones. Only joking – it's Amazon, and this is real life. The experimental feature of the company's virtual assistant, Alexa, was …

  1. Anonymous Coward

    So Arasaka should have been called Amazon all along. This is Relic 1.0.

  2. Mike 137 Silver badge

    Nothing new under the Sun

    "AI software that learns to mimic the voice of a deceased woman whose husband sits weeping over a smart speaker, listening to her dulcet tones."

    123 years ago Arthur Conan Doyle wrote The Japanned Box (Strand Magazine, January 1899) in which a widower regularly plays a phonograph recording of his deceased wife's voice. The difference, though, is that it was her real voice.

    1. Gene Cash Silver badge

      Re: Nothing new under the Sun

      There's folks that keep loved ones' last voicemails and such.

    2. druck Silver badge

      Re: Nothing new under the Sun

      It's not even new in the computer world: all the natural voices available from companies such as Nuance are based on a real person's speech, taken from a sample recording and automatically broken into phonemes. All someone has done is stick an AI label on it.

      1. doublelayer Silver badge

        Re: Nothing new under the Sun

        Wrong in this case. Most speech systems do use recordings of human speech, but usually it's hours of painstakingly recorded samples, thoroughly dissected by manually written analysis software and then reassembled with rule-based algorithms. This automates much of that process and, as the article says, means the person whose voice you want to copy doesn't need to sit in a recording booth reading a prewritten script for perfect recording quality; the software can work from a shorter recording that wasn't made for that purpose. It's a different method of achieving the same goal, and it makes a real difference to both the resulting quality and the ease for the user.

  3. nintendoeats

    Compare this to The 6th Day (basically a terrible movie), in which Arnie is encouraged to purchase a clone dog to shield his child from the reality of death. While the film totally failed to explore this idea with any depth, the fact is this is a real issue that deserves deep public discussion. As we become more and more able to shield people from the painful things in life, one has to seriously investigate the effect on long-term mental health and cultural values.

    I fear a world where the rose seemingly has no thorns.

    1. Anonymous Coward

      "I fear a world where the rose seemingly has no thorns."

      I fear a world where the rose seemingly has thorns.

  4. fidodogbreath

    There's so much potential for manipulation here, and not just the obvious political shit.

    Running the "Grandma scam" (calling senior citizens and pretending to be their grandchild who's in trouble and needs money) using the person's actual voice.

    Luring a child into a car by playing back a parent's voice on a fake speakerphone call.

    Cops using a suspect's voice to place a fake 911 call to create a pretext for an illegal search.

    Blackmail.

    Manipulating people with cognitive issues into giving up banking info.

    Harassing and bullying people by using their loved ones' voices.

    All of the above will happen. This timeline sucks.

    1. nintendoeats

      I agree, this technology has far more applications for ill than for good.

    2. WanderingHaggis

      Hollywood here I come

      This would make a good Hitchcock movie -- a speaker under the bed haunting whoever is attempting to sleep.

    3. This post has been deleted by its author

    4. Anonymous Coward

      Telephone Banking?

      Will this get through the telephone banking checks that now use your "unique voice pattern"?

    5. Nifty Silver badge

      Agree, that's why I never use the same voice twice.

  5. Warm Braw

    The saviour of TalkTV?

    Teach it Piers Morgan's voice and hook it up to Twitter's "trending" feed and Murdoch might turn a profit.

  6. Anonymous Coward

    "a means to build trust between human and machine"

    This idea is wrong on many levels.

  7. thejoelr

    Do not want.

    Maybe if Bezos can read me bedtime stories, but I have no interest in dead relatives whispering me off to sleep. This is nightmare fuel.

  8. The Oncoming Scorn Silver badge

    Twenty Minutes Into The Future

    Bryce: You're looking at the future Mr Grossman, people translated as data.

  9. Jan K.

    "Rohit Prasad... described the tech as a means to build trust between human and machine..."

    I swear... the day my pc starts talking to me, I'll get the shotgun!

  10. Teejay

    No.

    This is absolutely tasteless, and wrong on so many levels.

  11. EricB123 Bronze badge

    Shucks

    When I read there was a new Black Mirror episode, I was about to renew my Netflix subscription. Ah, just fake news.

  12. Anonymous Coward

    "Alexa, can Grandma finish reading me The Wizard of Oz?"

    ...at which point you pat yourself on the back for having FINALLY streamlined Grandma out of your home budget: originally outsourced to a 'Golden Sunset Prospect', aka 'quick retirement' home, then snapped up in an Xmas special, so now Granny-box sits happily by Alexa.

  13. martinusher Silver badge

    Telemarketing deep fakes never work

    A quality Indian call center / scam operation talks colloquial English (or American) at their target. They work at it, they're really good, but ultimately they lack the immediacy of context, and that's what reveals their true identity. (This, I believe, is one version of a CAPTCHA.)

    Grandma's voice may be comforting, even something worth treasuring, but to make it really useful the voice needs to be attached to an ersatz consciousness that can interact with, and adapt to, contemporary life.

    1. Dr. Ellen

      Re: Telemarketing deep fakes never work

      I could use an accent translator that would let me understand the assorted flavors of voice at help centers. If they could be turned into American Midwestern, they'd be one hell of a lot more useful. But as many have warned: fake voices could be dangerous in many ways. The useful is outweighed by the hazardous.
