My bet on the accounts?
I'd say they'll find they're mule accounts operated by possibly unwitting victims.
Authorities in the United Arab Emirates have requested the US Department of Justice's help in probing a case involving a bank manager who was swindled into transferring $35m to criminals by someone using a fake AI-generated voice. The employee received a call to move the company-owned funds by someone purporting to be a …
This seems like one of the shakiest.
It sounds like the group that ElReg described back in September - https://www.theregister.com/2020/09/01/brit_bloke_extradited/
But then they were mimicking customers to get thousands from their bank accounts.
The idea that a bank would transfer millions based on a voice call is ludicrous.
From the sound of it, they also had fake documents and emails to set up the scam, with the voice call only as a second factor. The bank should certainly alter its policies so that no single person, however completely convinced, can lose it that much money without oversight.
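The "no single person without oversight" policy above is basically a dual-control rule. A minimal sketch, assuming a made-up `can_release` helper and threshold (none of these names come from the article):

```python
# Hypothetical dual-control rule: transfers above a threshold need a
# second, independent approver before the money moves. THRESHOLD and
# can_release are invented for illustration.

THRESHOLD = 100_000  # anything above this needs two humans

def can_release(amount, requester, approvers):
    """Allow a large transfer only if someone other than the
    requester has also approved it."""
    if amount <= THRESHOLD:
        return True
    independent = [a for a in approvers if a != requester]
    return len(independent) >= 1

# A convincing phone call still only compromises one insider:
assert can_release(35_000_000, "branch_manager", ["branch_manager"]) is False
assert can_release(35_000_000, "branch_manager", ["branch_manager", "cfo"]) is True
```

The point of the design is that the fraudster has to fool two independent people, not one, before $35m goes anywhere.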
"The idea that a bank would transfer millions based on a voice call is ludicrous."
It illustrates a failing in the banking systems. How are they going to "fix" this problem? Will they do all transfers this large via an app in future? That would avoid them getting blamed; they would just say that the app had been hacked and that they were insured.
Did anyone else read the article and think: how is the bank manager 100% positive that deep-fake voice-mimicking software was used?
Sure, it may have happened. But he could also have made a giant error of judgement and this is his excuse.
For all he knows, it was a voice actor putting on an accent.
Not the sharpest tools in the shed. A relative perpetrated a doozie on a bank manager, I won't go into details, but it came down to a phone authorization from someone other than the account holder. This was back in the 80s... Nothing new, but my question is why are things authorized this way in this day and age? What about the Blockchain?????? 2FA??? Retinal scans, colonic maps??? Something besides a voice!
Usually when I've seen these, they make you say something they've just come up with, so you can't use a recording. That forces the attacker to have a pretty good model of your voice, since they can't just replay a clip, and any accent quirk the bank's model has remembered about you could trip them up. Still, it's hard to test and easy for an attacker to play with, so I'd recommend nobody enable it if they have the choice of a less convenient but more secure method.
"Imagine your AR device displaying exactly how to hold the sticks during a drum lesson, guiding you through a recipe, helping you find your lost keys, or recalling memories as holograms that come to life in front of you," Facebook said in a blog post.
I've seen the movie and the TV series... it doesn't end well in either.