First it's find the apple.
Then it's find Sarah Connor.
AI researchers at Chinese tech beast Baidu have attempted to teach virtual bots English in a two-dimensional maze-like world. The study “paves the way for the idea of a family robot,” a smart robo-butler that can understand orders given by its owner, it is claimed. This ability to handle normal language is essential to …
"So they can train the robot by giving it an instruction like 'can you make me a coffee with one spoon of sugar and two spoons of milk?' "
You would wait a long time for your first drinkable cup of coffee if the bot has to learn all that by itself. There are also the hidden steps like fetching water and heating it. Just knowing where you keep the ingredients, and what form they take, requires pre-knowledge specific to that household, e.g. instant coffee, coffee beans, or ground coffee. Then the preparation may vary: jug, cafetiere, percolator, filter, or capsule. Not to mention how strong, or how much, your normal definition of "a coffee" is.
Reminds me of the Haynes car manuals, with a simple instruction like "remove X" where the process involved pre-knowledge of undocumented hidden fastenings.
I hope I'm reading it wrong, but what that article is describing seems horribly similar to the university AI module I studied back in the eighties, where the only "intelligence" was gleaned from an impractical number of rules that had to be painstakingly built up over time, and even then were only any use in a single scenario. OK, they've bolted some natural language processing, image processing, etc. on top in this case, but is it really anything new?
I was horribly disappointed at the time and dropped the course PDQ. It seemed so old fashioned, even then. Neural nets were the one aspect that seemed to hold some promise, but 30 years on where's my computer-brain-in-a-jar?
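For what it's worth, the brittleness that commenter is describing is easy to demonstrate. Here's a minimal toy sketch of a hand-coded rule system (all rule names and instructions invented for illustration, not from the article): every behaviour must be written out explicitly, so anything outside the anticipated phrasings simply fails.

```python
# Toy rule-based "instruction follower". Every capability is a
# hand-written rule; there is no learning involved. All rules here
# are illustrative, not from the Baidu paper.

RULES = {
    ("make", "coffee"): ["boil water", "add ground coffee", "pour", "serve"],
    ("make", "tea"): ["boil water", "add tea bag", "pour", "serve"],
}

def plan(instruction: str):
    """Look up a plan by a crude (verb, last-word) pattern match."""
    words = instruction.lower().split()
    verb, obj = words[0], words[-1]
    steps = RULES.get((verb, obj))
    if steps is None:
        # Anything the rule author didn't anticipate fails outright --
        # the single-scenario brittleness the comment complains about.
        return ["<no rule: cannot comply>"]
    return steps

print(plan("make coffee"))
print(plan("make me a flat white"))  # unanticipated phrasing, no rule fires
```

The point of approaches like the one in the article is to learn the mapping from language to action instead of enumerating it, though whether that escapes the scaling problem is exactly the question the commenter raises.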