Hospital to test AI 'copilot' for doctors that jots notes on patient care

The University of Kansas Health System is set to trial software designed to help doctors automatically generate notes from conversations with patients, a move billed as "the most significant rollout to date of generative AI in healthcare". The technology, developed by Pittsburgh, Pennsylvania startup Abridge, aims to …

  1. AnotherName
    Alert

    Why AI???

    Surely all this needs is a speech-to-text engine to write down what the doctor actually said, instead of trying to interpret what they said. Isn't that just as bad as trying to understand the written notes in whatever bad handwriting they have? Overkill, in more ways than one?

    1. Version 1.0 Silver badge
      Joke

      Re: Why AI???

      Computer medicine has been documented for a long time now ....

    2. Filippo Silver badge

      Re: Why AI???

      I'm guessing that the principle is that an "AI" could get rid of a lot of "noise" - not actual noise, I mean all those bits of conversation that don't actually carry useful information. Plenty of those.

      Of course, anyone who has used these "AI" systems knows that they make mistakes. Depending on the context, the rate of mistakes is sometimes very low, and sometimes very high. Worse, their mistakes are sometimes quite hard to catch.

      On the other hand, though, doctors also make mistakes while doing this type of work. This is especially true if they are overworked, and, let's face it, most of them are, shamefully so.

      So the big question is - is the "AI" going to make more and/or worse mistakes than a doctor?

      I don't think the answer is obvious at all. There are fields where "AI" systems make loads of enormous mistakes, and others where they work about as well as a human or even somewhat better. I have no idea what category this falls in.

      It's good that trials are being made. I just hope the trial results are taken seriously, and not "creatively interpreted" to make the "AI" look better than it is.

      1. Anonymous Coward
        Anonymous Coward

        Re: Why AI???

        "I'm guessing that the principle is that an "AI" could get rid of a lot of 'noise' ... all those bits of conversation that don't actually carry useful information. Plenty of those."

        Amen to that. My side gig is reading doctors' full reports and summarizing them for other doctors (and eventually lawyers - workers' compensation and similar lawsuits). Some doctors can be WAY too wordy, so we paraphrase, but in a way that still captures the critical/core information/facts.

        As for doctors making their own mistakes -- there is plenty of that too, which we have to fix on the fly. I blame doctors already using speech-to-text software (dictation, a time-honored tradition in medicine) without a human proofreader. AI might help with this, but not 100%, so my job is more-or-less safe.

        1. Anonymous Coward
          Anonymous Coward

          Some doctors over summarise

          "No feeling around backside, no feeling to defecate, not defecated for over a week, being sick after eating, back pain, no change of diet" were the symptoms I reported.

          Years later when I read the Drs. notes it said "diet constipation".

          Yes, it was something very significant and permanent! (Compensation in the UK is to help adapt your environment and provide care, not to compensate for the damage caused by the Drs. I can still walk(ish) and work from home, and so my ruined life gets nothing!)

          What would the AI make of my symptom list? Hopefully not summarise as much as the Dr.

  2. alain williams Silver badge

    AI interpretation of surgeon & patient conversation

    Surgeon: We will cut off your diseased foot, the left one ?

    Patient: right.

    Which foot will the AI summarise needs to be cut off ?

    1. Filippo Silver badge

      Re: AI interpretation of surgeon & patient conversation

      If that exchange is the source of authority of the information on what foot needs to be cut off, then someone has made a grave mistake, long ago, that neither the AI nor the doctor can fix. Probably back when hospital procedures were devised.

      1. pecan482

        Re: AI interpretation of surgeon & patient conversation

        All I see are major law suits being paid out.

    2. vogon00

      Re: AI interpretation of surgeon & patient conversation

      My thoughts exactly.

      One has only to read the automatically-generated 'Closed Captions' on YouTube vids to see the misinterpretations made. These don't generally matter to me, as they are English subtitles over English audio, and I can spot the speech-to-text mistakes and adjust accordingly. However, the written word on its own would have led to ...confusion. Some of the 'hiccups' don't matter, but some do - there were a couple I saw where the written text was most definitely NOT what was spoken. Sorry, I can't quote refs...it was quite a while ago.

      Leaving aside the obvious mistakes/wrong words in the machine transcription, what about punctuation? A "Let's eat, Grandpa!" vs. "Let's eat Grandpa!" issue could be quite serious in the medical world.....and the legal one, come to think of it.

      Also, the NLP/speech-to-text has problems with accents.... Sure, en-something may be the language, but any accent can confuse the beast, it seems - the more pronounced the accent, the higher the error rate. And that's with 'broadcast-quality' or decent audio input, not the muffled low quality crap you're likely to get as source from whatever low-tech, cheap, shit device is used to record the audio in the first place.

      Until the 'error-rate' improves, especially with accented speech or a non-native speaker (in ANY language), I for one dread my medical records getting transcribed by AI. This seems like a baaaaaddddddd idea and pretty shitty science, 'pushed' in the name of progress.

      Yup, NOT a fan of 'AI', because - just like Tesla's Autopilot - it claims to be something it ain't.

      1. pecan482

        Re: AI interpretation of surgeon & patient conversation

        I am one hundred percent with you; this is just a major disaster waiting on the horizon.

  3. Mike 137 Silver badge

    It might work ...

    As long as it doesn't start insisting it loves the doctor or the patient.

  4. Doctor Syntax Silver badge

    "three bottles of wine every week.... Wine would be an entity, and an attribute would be three bottles, and other attribute every night."

    If that's how it works the time saved in taking notes will be spent in fixing the results.
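    The quoted entity/attribute scheme can be sketched with a deliberately naive toy extractor - this is purely illustrative Python under my own assumptions, not Abridge's actual pipeline (real systems use trained models, not regexes) - and it also shows how a single mis-heard word silently changes the clinical meaning:

    ```python
    import re

    def extract_entities(utterance: str) -> dict:
        # Hypothetical, naive entity/attribute extraction in the spirit of the
        # quoted description: "wine" becomes the entity; the quantity and the
        # frequency become its attributes.
        m = re.search(r"(\w+)\s+(bottles?|glasses?)\s+of\s+(\w+)\s+every\s+(\w+)",
                      utterance)
        if not m:
            return {}
        quantity, unit, entity, frequency = m.groups()
        return {
            "entity": entity,
            "attributes": {
                "quantity": f"{quantity} {unit}",
                "frequency": f"every {frequency}",
            },
        }

    # "every week" vs "every night" differ by one easily mis-heard word,
    # but the resulting structured note describes a very different patient.
    print(extract_entities("three bottles of wine every week"))
    print(extract_entities("three bottles of wine every night"))
    ```
    
    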

    1. Arthur the cat Silver badge
      Unhappy

      A GP prescribing 3 bottles of wine a night? Mine told me to stop that sort of thing!

  5. xyz123 Silver badge

    This is Microsoft AI.

    only a matter of time before it decides the patient is either "too expensive" or a member of an ethnic group it doesn't like, and suggests either suffocating it with a pillow or putting it in the incinerator.

  6. steviebuk Silver badge

    The real hope

    "The hope? Reducing piles of admin for clinicians freeing them up for medical work"

    Paying people less and relying more on AI.

    What's wrong with hiring a PA and giving someone a job?

    1. ecofeco Silver badge

      Re: The real hope

      This, and yet some numpty gave you a down vote.

    2. breakfast
      Flame

      Re: The real hope

      Certainly in the UK the minute a hospital hires any staff aside from doctors and nurses, our ridiculous, witless, far-right press start running around after them screaming "WASTE WAAAASSSTTEEE" so loudly it's basically impossible to get their jobs done. Obviously any government worth their salt would let professionals decide what they need based on their domain knowledge, but all our governments have their policies decided by the aforementioned ridiculous, witless, far-right press so obviously they immediately insist hospitals fire anyone working in support services.

      1. 43300 Silver badge

        Re: The real hope

        In fairness, the NHS does waste a lot of money on non-jobs - you've only got to look at their job adverts to see that. If they want to reduce waste, they could start by getting rid of all those EDI Managers. That sort of thing should simply be a function of HR, as it always has been until recent years. And the NHS does not have a problem with 'diversity' - its staff proportions demonstrate that, with all the major ethnic minorities represented above their level in the general population.

        Admin staff who actually reduce the workload of the clinical staff aren't a waste of money.

    3. Caver_Dave Silver badge

      Re: The real hope

      When my daughter was a trainee Dr. she had to write the notes up for the Consultants during the ward rounds.

  7. Howard Sway Silver badge

    doctors remain in charge, and should check and edit the generated notes if necessary

    I like the word "should" in that sentence.

    25 years of producing IT systems has taught me that in contexts such as this, it's a synonym for "won't".

    1. Richard 12 Silver badge

      Re: doctors remain in charge, and should check and edit the generated notes if necessary

      In reality, it's "can't".

      I've seen far too many cases where people happily accepted and signed off minutes of meetings that were absolutely wrong.

      And when you ask them if they're sure, they say yes.

      And when you play them the transcript, they're shocked as it's absolutely not what they now remember, after reading the false minutes.

      I believe there are several studies on this effect, though I could be misremembering.

  8. ecofeco Silver badge

    Reduce the workload?

    I've helped deploy Epic. An absolutely incredible bit of hospital patient software. So I've seen what it takes to treat, and also process, patients.

    The sheer amount of steps it takes to treat a patient will never be "reduced." And you cannot blame just administration, which plays its part in the workload. The vast majority is tracking the medical treatment itself and drug control.

    The ONLY solution is to hire more people and pay them well.

  9. that one in the corner Silver badge

    If you believe current AI systems are foolproof

    then I've got Abridge to sell you.

    1. breakfast
      Unhappy

      Re: If you believe current AI systems are foolproof

      We're caught in a double-bind here where our AI systems are not foolproof, but unfortunately our fools are not AI-proof.

  10. pecan482

    I am a retired nurse of 46 years, and I thought this integration with AI would never happen. Doctors have been screaming for help for years, but I don't think this is the way to go. Where does HIPAA play a role in this? Or will HIPAA be a thing of the past, along with patient confidentiality to the doctor?
