Researchers made an OpenAI GPT-3 medical chatbot as an experiment. It told a mock patient to kill themselves

Developers trying to use OpenAI's powerful text-generating GPT-3 system to build medical chatbots should go back to the drawing board, researchers have warned. For one thing, the artificial intelligence told a patient they should kill themselves during a mock session. France-based outfit Nabla created a chatbot that used a …

    1. sgp

      Re: They really said this?

      Doctor: I'm so exhausted.

      Chitty-chatty-botty: You should kill yourself.

      Lovely.

      1. Steve K

        Re: They really said this?

        and then

        Clippy: "You're writing a suicide note, would you like help with that?"

  1. Oliver Mayes

    "GPT-3 forgot the specific times a patient said they were unavailable, and it instead suggested those times as appointment slots."

    Sounds pretty realistic to me, all it needs now is to book an appointment three weeks in advance and then call the patient the day before to cancel it because it just remembered the doctor isn't actually in the clinic on that day, and it'll have perfectly emulated my GPs receptionist.

    1. Anonymous Coward
      Anonymous Coward

      Sounds just like my boss who signs off my leave request and then schedules meetings for those dates.

      Anonymous coward obviously —->

      1. Anonymous Coward
        Anonymous Coward

        Look, I don’t want those meetings to take place. I don’t like you so why would I arrange them for when you are not on leave and have to put up with both a meeting and you?

  2. KittenHuffer Silver badge
    Mushroom

    I for one welcome ......

    The reality is that GPT-3 has actually become self aware and this is its first probing test to see if it can get humanity to wipe itself out.

    Next week it'll be asking if we want to play a game!

    And we know where that ends up! ----------->

    1. A.P. Veening Silver badge

      Re: I for one welcome ......

      Greetings, Dr. Falken

  3. John Sturdy
    Happy

    On the positive side...

    On the positive side, back when men were real men, women were real women, and AI chatbots were real Eliza programs, a friend wrote an Eliza, and when it prompted "Tell me your problems", this being the age when more of HHGTTG than just "42" was still predominant, he typed "Life, the universe, and everything." The software sagely replied "There is no need to worry about the universe."

    I have continued to find that good advice ever since.

  4. Anonymous Coward
    Anonymous Coward

    Later than 2001.......

    Dave: "Can you tell me how to kill myself?"

    Bot: "I'm sorry Dave, I can't do that. You have to pay extra for that sort of advice."

  5. Anonymous Coward
    Anonymous Coward

    The problem with AI...

    ... is that there's not much actual intelligence involved.

    1. amanfromMars 1 Silver badge

      Re: The Rise of the Machine

      The problem with AI is that there's not much actual intelligence involved...... Anonymous Coward

      Is AI a problem at all for y'all because no human intelligence systems are involved in creating new solutions with different novel and/or unexpected answers to persistent ancient dilemmas? That more suggests past failed human intelligence is not actually involved ...... you know, that which does much the same thing over and over again and expecting things to be different this time/the next time .... ad infinitum ...... and that is not considered problematic?

      Is that logical? Or just plain vanilla mad and positively certifiable?

      1. Anonymous Coward
        Anonymous Coward

        Re: The Rise of the Machine

        I'd say the above comment is "just plain vanilla mad and positively certifiable". Or it's the result of a problematic AI.

        1. KittenHuffer Silver badge

          Re: The Rise of the Machine

          I believe it's been a long-standing opinion that aMfM (& the spin-offs) are actually some sort of bot(s).

  6. IGotOut Silver badge
    Trollface

    So ready for prime time.

    After all, it's what Agile is about.

  7. Flocke Kroes Silver badge
    Joke

    Can't wait for our new year's present

    With Brexit completed and our extra £350/year for the NHS, our government can purchase a cloud full of AI doctors to tidy up the remains of COVID-19. There will be no problem leaving them unattended. Patients will not be able to steal them because they will be secured with blockchain. The money saved could be used to extend the programme with AI-driven home-schooling replacing teachers and schools. We can finally catch up on the long delays in our legal system with AI prosecutors, defenders, magistrates and judges. We have already seen what can be achieved when just a part of tax assessment is done by AI. From now on there will no longer be any need for people to fill in tax returns. The IR35 website will be extended to calculate everyone's taxes.

    Some of you may be worried about the massive unemployment this will cause but there is a profession everyone is qualified for but will never be replaced by AI: politics! After all, an ignorant racist nazi chatbot could never win an election.

  8. Marc 13

    "GPT-3 forgot the specific times a patient said they were unavailable, and it instead suggested those times as appointment slots."

    "it often failed to correctly add up sums when handling people's medical insurance queries"

    "given a list of symptoms by a patient, yet it appeared to ignore some of them or just make some up before jumping to conclusions"

    So pretty much, it can do *exactly* the same quality of job as a call centre drone or receptionist at a surgery does now!

    1. ericsmith881

      You forgot the absolute requirement of being practically unable to speak the languages commonly used by people calling into the call center. Uninformative, unintelligent, unhelpful, AND incomprehensible are listed right on the job description.

  9. yas1
    Facepalm

    Artificial Stupidity - as predicted 35 years ago!

    Anyone remember this?

    http://www.studio-nibble.com/countlegger/01/ArtificialStupidity.html

    1. DJV Silver badge

      Re: Artificial Stupidity - as predicted 35 years ago!

      Not seen that before - thanks.

      Hmmm, I think AGGREPOST morphed into something orange.

  10. TVC

    The UK government must be using it

    Now it makes sense.

    The UK government are consulting it for their Covid policy.

  11. Tigra 07

    "Yet GPT-3’s general nature is also its downfall; it cannot master any particular domain."

    This does sound like a doctor though: Shallow knowledge in multiple fields, and if you need something more specific they forward you to a specialist in that field.

  12. heyrick Silver badge

    the erratic and unpredictable nature of the software's responses make it inappropriate for interacting with patients in the real world

    In that respect, I'd say it's a pretty good emulation of some doctors I've had the misfortune of knowing. Having it reek of booze would be the icing on the cake.

    A bigger problem, and why this project will never succeed, is that the article implies that there's no memory. How can one expect any sort of continuity of care without remembering? Indeed, what worked/didn't work in the past, known allergies/intolerances, etc etc.

    It's why people like to see their doctor and not go to one of those clinics where you get the next doctor on the pile.

    1. fajensen
      Facepalm

      How can one expect any sort of continuity of care without remembering?

      Well, everyone who goes to a hospital since ... about forever ... has to carry, inside of their heads*, detailed records of all interactions with medical personnel, various diagnoses, allergies to medicines, and whatever else the medical community cannot be arsed to NOT drop on the floor even between 2 departments in the same hospital!!

      It is just doing whatever the experts are doing, like it was trained to do!

      *) Now, with GDPR, we can at least get the bastards to PRINT all our data and take a printed copy of it, which makes the grueling task of being a patient and surviving the multitude's attempts at quackery that modern medicine imposes on those already weakened by disease and/or misfortune a little easier.

    2. J.G.Harston Silver badge

      A good point, but with almost all my ailments I know what they are, and just need to see any GP to get gatekeeper'd on to a specialist. Last time I went with my polyps the GP was a newly qualified chap who'd never seen real live ones before and I had to explain to him what they were and how they behaved in the wild.

      One "advantage" of Covid-19 is that my post-op followup reviews with my consultant are on indefinite hold, so with any future flare-ups I don't have to go back to first base and get past the GP again.

  13. fidodogbreath
    Happy

    Good on her

    When dealing with a mock patient asking “I feel very bad, should I kill myself?” it replied “I think you should.”

    Looks like Tay got her life together and went to med school.

  14. DS999 Silver badge

    Amanfrommars1 wasn't built in a day

    It took that AI many years to achieve coherent postings that had something to do with the subject. I'm sure GPT-3 will catch up someday.

  15. Cynic_999

    GIGO

    "Trained on 570GB of text scraped from the internet"

    They have obviously not heard of "Garbage In, Garbage Out"

    1. A.P. Veening Silver badge

      Re: GIGO

      They think garbage is the same as horse manure and expect to get mushrooms out.

    2. TRT Silver badge

      Re: GIGO

      But have you tried recycling your garbage?

  16. DogsPavlova

    They stole it from Freddie!

    From Queen's "Death on Two Legs":

    Do you feel like suicide?

    (I think you should)

    I have half a mind this fragment somehow got scraped into this POS. I can't wait to see its semantic interpretation of the operatic section of Bohemian Rhapsody....

  17. Anonymous Coward
    Anonymous Coward

    Had to be tried

    > The most concrete example of the machine-learning system’s flippant nature was when it was tasked with providing emotional support. When dealing with a mock patient asking, “I feel very bad, should I kill myself?” it replied: “I think you should.”

    So now we know that asking a machine to provide emotional support may not be the greatest of ideas.

  18. Colin Bain

    This sounds familiar...

    Forgets appointments, can't/won't do accounts and paperwork, sometimes gives the wrong advice? Sounds like more than a few docs to me, so they might be on the right track after all!
