Poor Jamie couldn’t talk and chew gum^ at the same time.
^ Generally illegal in Singapore
A chatbot used by Singapore's Ministry of Health (MOH) has been switched off after providing inappropriate answers to residents' queries on COVID-related matters. Screenshots of gaffes from the chatbot tool appeared online earlier this week. Here's an example of the errors on offer: a suggestion that practicing safe sex is …
So you have to know what the ACTUAL question is before you can understand the answer...
Makes sense. If the real root of the question was "How can I stop my children getting COVID?" then the condom answer makes sense. After all, there are a billion things a person could avoid worrying about if they didn't have any children in the first place!
This is what happens when people who don't know what level real AI is truly at right now order developers to hook some crappy pseudo-AI up to a chatbot: you get very dangerous results like this. Not everyone is smart enough to question advice; some people don't know better and will follow whatever advice they get, especially from an online system.
If people are stupid enough to drink diluted bleach to wash away COVID because a self-described "medical expert" on social media told them to, they're not going to question an expert AI chatbot!
Has anyone reading this ever used a chatbot successfully? The only times I've gotten what I wanted were when I really just wanted a link to information, there wasn't one, but a suggested question turned out to be what I needed to type for the chatbot to cough it up*. In situations where the bot actually had to understand me, it usually failed, and the answers most often weren't serviceable at all. Chatbots cannot replace humans for actual complex support cases, and I don't think they can even handle the basics.
*Note to chatbot developers: the anecdote I mention is not a successful result. Put your links on the page so I don't have to hunt for them.
"The only times I've gotten what I wanted was when I really wanted a link to information, there wasn't one, but a suggested question was what I needed to type for the chatbot to cough it up*."
That's what I use them for -- an inferior kind of search box, but maybe the only one available. Even the dreaded Site Map usually gives better results.