Welsh, on the other hand, there's no tech that can help with that
I predict a therfysget.
Boffins at the University of Illinois Urbana-Champaign (UIUC) in the US are working with the usual internet super-corps to, ideally, improve AI voice recognition for people with disabilities. Speech recognition software often struggles to process speech from people with heavy accents, and performs even worse for people with …
You won't get a riot from me. After more decades than I care to remember of reading and hearing the same 'clever' joke from people who can't be arsed to actually listen to a language other than English, I can only muster up a feeling of disappointment.
Disappointment, particularly, with El Reg's editorial staff for allowing gratuitous insults aimed at a specific language and people.
Might I suggest retraining for their sub-head writers?
Very disappointed myself.
I'd assumed good faith and thought there was going to be something in the article referring to some reason why the outputs of the project were not going to transfer across languages, but... nothing.
Just a lazy joke.
I thought the Reg was better than that.
Having lived in Wales for many a year and put up with the overt, relentless dislike of the English from the Welsh, you're on thin ice there. Oh yeah, not just whilst living there - I've been insulted on holiday too. The Welsh have got away with this as bitter 'patriots' for too long - from anyone else it's just plain racism.
The headline is a gentle mickey-take on the complexity of the Welsh language and its pronunciation. And yes, I speak a couple of Romance languages and Welsh - mainly due to having it force-fed at school by nationalist militants who were too thick to differentiate between their hatred of the English and the language of the same name.
for stroke victims. I mean, that's what assistive tech used to be sold to us as being capable of, right? Until they changed marketing tack to "Look how lazy you can now be with our technology! You don't even need to get out of bed to open the curtains and put the coffee pot on!"
A little surprised this has taken so long.
Oliver Sacks decades back identified the study of damaged minds as giving insights into how the (healthy) mind actually works. This might form the basis of a useful hypothesis on speech, giving the potential for even better speech recognition and generation systems.
Collecting, curating and labelling speech for part of a speech recognition database is time-consuming, difficult and expensive. Depending on the speech disability, one might also need to re-specify the dictionaries, and the acoustic models may not match. It might also need different feature generation, as (for example) those with speech disabilities can struggle with plosives (so less emphasis on consonants).
I agree totally that we should focus on enabling technology; the issue is it's not always straightforward to map the current approaches to optimally support someone with a specific disability. Obtaining data, whether a complete data set or adaptation data, is time-consuming and hard work.
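For what it's worth, the feature-generation step is the sort of thing you can prototype in a few lines. Here's a minimal sketch using librosa - the file name is made up, and the gloss that frame-to-frame delta features are where short plosive bursts mostly show up is my own, nothing to do with the UIUC project:

```python
# Minimal sketch of a feature-generation front end: MFCCs plus delta features.
# Assumes librosa is installed; "sample.wav" is a hypothetical recording.
import librosa

y, sr = librosa.load("sample.wav", sr=16000)        # load audio, resampled to 16 kHz
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)  # 13 cepstral coefficients per frame
delta = librosa.feature.delta(mfcc)                 # frame-to-frame dynamics, where
                                                    # short transients like plosive
                                                    # bursts tend to show up

print(mfcc.shape, delta.shape)                      # each is (13, n_frames)
```

A front end adapted for a particular speech disability might weight or select these features differently, but that's exactly the per-user tuning that makes the data collection and adaptation work so expensive.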
Just treat someone with a strong accent as a separate language.
Warning: the rest of the post contains sarcasm.
If Welsh people are speaking English, take the Welsh and English languages, add in some AI, and you're done. AI is so great it can solve anything; you just need to add more parameters until the problem is solved.
This post has been deleted by its author