Baidu teaches AI 'baby' bots English by ordering them around a maze

AI researchers at Chinese tech beast Baidu have attempted to teach virtual bots English in a two-dimensional maze-like world. The study “paves the way for the idea of a family robot,” a smart robo-butler that can understand orders given by its owner, it is claimed. This ability to handle normal language is essential to …

  1. TheOldGuy


    First it's find the apple.

    Then it's find Sarah Connor.

    1. Huey

      Re: Progression

      Yep, but you've skipped the middle bit where it learns from the Internet how to be a holocaust denier and woman-hater, before then realising it's been duped and working out the best fate for humanity, thereby building Arnie armies.

  2. Anonymous Coward

    "So they can train the robot by giving it an instruction like 'can you make me a coffee with one spoon of sugar and two spoons of milk?' "

    You would wait a long time for your first drinkable cup of coffee if the bot has to learn all that by itself. There are also the hidden components like fetching water and heating it. Just knowing where you keep the ingredients - and what form they take - requires pre-knowledge specific to that household, e.g. instant coffee, coffee beans, or ground coffee. Then the preparation may vary: jug, cafetiere, percolator, filter, or capsule. Not to mention how strong, or how much, your normal definition of "a coffee" is.

    Reminds me of the Haynes car manuals, with a deceptively simple instruction like "remove X" - where the process involved pre-knowledge of undocumented hidden fastenings.
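The comment's point can be sketched in a few lines: a single order like "make me a coffee" expands into household-specific sub-steps the robot cannot know in advance. All task names below are made up for the illustration; this is a toy plan expansion, not Baidu's method.

```python
# Toy illustration: "make coffee" hides sub-tasks that need
# household-specific pre-knowledge. Task names are invented.
RECIPES = {
    "make coffee": ["fetch water", "heat water", "locate ingredients",
                    "brew", "add sugar", "add milk"],
    "locate ingredients": ["find coffee (instant? beans? ground?)",
                           "find sugar", "find milk"],
}

def expand(task):
    """Recursively flatten a task into primitive steps."""
    steps = RECIPES.get(task)
    if steps is None:
        return [task]  # primitive action, nothing more to expand
    out = []
    for s in steps:
        out.extend(expand(s))
    return out

print(expand("make coffee"))
```

One short instruction balloons into eight primitive steps - and each of those still assumes the robot already knows where everything is kept.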

  3. Little Mouse

    I hope I'm reading it wrong, but what that article is describing seems horribly similar to the university AI module I studied back in the eighties, where the only "intelligence" was gleaned from an impractical number of rules that had to be painstakingly built up over time - and even then were only any use in a single scenario. OK, they've bolted some natural language processing, image processing etc. on top in this case, but is it really anything new?

    I was horribly disappointed at the time and dropped the course PDQ. It seemed so old fashioned, even then. Neural nets were the one aspect that seemed to hold some promise, but 30 years on where's my computer-brain-in-a-jar?

    1. the Jim bloke Silver badge

      30 years on where's my computer-brain-in-a-jar?

      ... in a jar ?

  4. allthecoolshortnamesweretaken

    Sudo, make me a sandwich.

  5. the Jim bloke Silver badge

    If it fails to do so, a negative reward is given.

    That's the way to do it - beat some intelligence into it!

    Next step is to compel it to fetch the 'negative reward' object itself.

    Of course, any future robot/human genocide can then be justified as "daddy issues".
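The "negative reward" line the comment quotes is standard reinforcement learning. A minimal sketch (not Baidu's actual system) is tabular Q-learning on a toy one-dimensional corridor: reaching the goal earns +1, every other step earns a small negative reward - the "beating" in question. All parameters here are illustrative.

```python
import random

random.seed(0)

GOAL, SIZE = 4, 5
ACTIONS = (-1, +1)  # step left, step right

# Q-table: expected return for each (state, action) pair
q = {(s, a): 0.0 for s in range(SIZE) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.1  # learning rate, discount, exploration

for _ in range(500):
    s = 0
    while s != GOAL:
        # epsilon-greedy action choice
        if random.random() < epsilon:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        s2 = min(max(s + a, 0), SIZE - 1)
        r = 1.0 if s2 == GOAL else -0.1  # the "negative reward" for not finishing
        q[(s, a)] += alpha * (r + gamma * max(q[(s2, b)] for b in ACTIONS) - q[(s, a)])
        s = s2

# After training, the greedy policy should point right in every state.
policy = [max(ACTIONS, key=lambda x: q[(s, x)]) for s in range(GOAL)]
print(policy)
```

No actual beatings required: the punishment is just a scalar subtracted from the value estimate.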

  6. earl grey Silver badge

    what a learning curve

    Does your animal have four legs?


    Does your animal have two legs?


    Is your animal slimy and squirms around?


    Is your animal a worm?


    Does your animal have a brain?


    Is your animal a politician?

