An 'AI' that can diagnose schizophrenia from a brain scan – here's how it works (or doesn't)

Scientists have had a crack at using simple machine-learning software to make psychiatry a little more objective. Why, you ask? Well, rather than rely on the professional opinion of a human expert, as we have for years, why not ask a computer for a cold, logical diagnosis? One benefit of using code as opposed to a …

  1. Anonymous Coward
    Trollface

    Aren't all people a little schizophrenic?

    Especially those who have to telepathically contact their invisible friend a couple of times a day?

    1. Pen-y-gors

      Re: Aren't all people a little schizophrenic?

      Harvey assures me that I'm perfectly normal and stable.

      1. Pascal Monett Silver badge
        Coat

        Re: perfectly normal

        I have come to believe that "perfectly normal" can be replaced by "acceptably crazy" without any prejudice whatsoever.

        1. Captain DaFt

          Re: perfectly normal

          I have come to believe that "perfectly normal" can be replaced by "acceptably crazy" without any prejudice whatsoever.

          Sadly, the way it actually works is money.

          Homeless, ranting on a street corner about the machines taking over?

          Lock him up, he's a danger to society!

          Rich, ranting on twitter about the machines taking over?

          Sign him up for a lecture tour, he's humanity's saviour!

  2. David Crowe

    the elephant in the room

    The elephant in the room is false positives, which they didn't address. If it were truly 75% accurate, it would diagnose 25% of non-schizophrenic people as schizophrenic. It is probably better than that, but even if it were 99% accurate (optimistic!) at not falsely diagnosing healthy people, it would still produce 10 false diagnoses out of 1000 scans of healthy people. And if the rate of schizophrenia in the general population is 1/1000 (probably less), then false positives would significantly outnumber true positives. This is the conundrum of Positive Predictive Value that makes medical screening dangerous.
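
    A back-of-the-envelope sketch of that Positive Predictive Value point, in Python, using the hypothetical figures above (75% sensitivity, 99% specificity, 1-in-1000 prevalence; these are the commenter's assumptions, not numbers from the paper):

        # Probability that a positive result is a true positive (PPV),
        # given a test's sensitivity, specificity and the condition's prevalence.
        def ppv(sensitivity, specificity, prevalence):
            true_pos = sensitivity * prevalence
            false_pos = (1 - specificity) * (1 - prevalence)
            return true_pos / (true_pos + false_pos)

        print(ppv(0.75, 0.99, 0.001))  # ~0.07: over 90% of positive results are false
        print(ppv(0.75, 0.99, 0.01))   # ~0.43: still under half at 1-in-100 prevalence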

    1. Anonymous Coward
      Anonymous Coward

      Re: the elephant in the room

      >And let's say the rate of schizophrenia in the general population is 1/1000

      Stats suggest somewhere between 1/200 and 1/100 depending on your country. It covers a wide range of effects and intensity, so diagnosis is rather variable (as the article states).

    2. Anonymous Coward
      Anonymous Coward

      Re: the elephant in the room

      The failure to mention false positives and the very small training set were also the things that leapt out at me. Given the relatively low incidence of schizophrenia in the general population, as it stands this is worse than useless. If the false positive rate is anywhere near 25%, it would diagnose schizophrenia incorrectly far more often than correctly. Realistically, a test that achieves 74% accuracy in a patient group in which roughly half have the condition is very, very poor. It is more interesting for what it says about the physical aspects/associations/causes of the disease than it is in any way useful.

      The other question not addressed is how humans perform at this task, that is, interpreting the results of this particular imaging protocol, perhaps with some specific processing and visualisation. Do they do better or worse, or can they not do it at all?

    3. 's water music

      Re: the elephant in the room

      false positives would significantly outnumber true positives. The conundrum of Positive Predictive Value that makes medical screening dangerous.

      That is a big problem in a population screening program but less so if the diagnostic test is applied to people who have presented with problematic symptoms and a doctor is looking for decision support in deciding on a course of action. Of course the prediction algorithm will simply have encoded the subjective diagnoses used to classify the training subject cohort but there may be some room for improving some diagnoses.

      1. Anonymous Coward
        Anonymous Coward

        Re: the elephant in the room

        This type of screening shouldn't be used as a diagnosis, but it could be used as a filter to identify people who should get some additional screening.

        Well, in theory at least... I doubt giving everyone a brain scan in college (when symptoms of schizophrenia tend to show up) will become part of a normal physical, but who knows? If it could be made more accurate, and could be shown to predict schizophrenia before symptoms show up, it might help, as at-risk patients and their doctors could be more prepared and begin treatment at an earlier stage.

        Certainly worth more of a look, as a larger training set might increase accuracy, and a longer term study could show whether it is predictive or can only diagnose after symptoms have presented.

    4. regregular

      Re: the elephant in the room

      1. I have nowhere seen an indication that the falses were all false positives. They could just as well have been all false negatives, although a mix of false positives and false negatives is the more likely outcome. Assuming a 50:50 split, that would put false positives at roughly 12-13 per cent, and it would "miss" the other 12-13 per cent (see the sketch at the end of this comment).

      2. You're sort of assuming that this kind of brain scan is going to be mandatory for everyone. I doubt that anyone advocates this kind of method becoming even a routine screening in hospitals. It is just another diagnostic tool to be used by medical professionals.
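
      A minimal sketch of that assumed 50:50 error split, again in Python, on a notionally balanced cohort of 100 at 74% accuracy (the article doesn't give the real error breakdown, so every figure here is illustrative):

          # Hypothetical confusion matrix: 74% accuracy on a 50/50 cohort,
          # assuming errors split evenly between false positives and negatives.
          cohort = 100                    # 50 diagnosed patients, 50 controls
          errors = round(0.26 * cohort)   # 26 misclassified in total

          false_pos = errors // 2         # controls wrongly flagged as schizophrenic
          false_neg = errors - false_pos  # diagnosed patients the model misses

          true_pos = 50 - false_neg
          true_neg = 50 - false_pos
          print(f"TP={true_pos} FN={false_neg} FP={false_pos} TN={true_neg}")
          # -> TP=37 FN=13 FP=13 TN=37, i.e. about 13% of the cohort falsely
          #    flagged and another 13% missed, under this assumption.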

  3. Christopher Reeve's Horse
    Big Brother

    Impressive analysis but

    I still don't like the term AI being slapped on every bit of machine learning or 'big data' analysis...

    Maybe I'm just old fashioned, but I liked it when words and definitions had specific meanings.

    1. Anonymous Coward
      Anonymous Coward

      Re: Impressive analysis but

      Don't worry, in a few years we'll have moved on to the next buzzwords, and not everything will be cloud, AI or IoT.

      1. Roj Blake Silver badge

        Re: Impressive analysis but

        I can tell that you're just going to love serverless computing...

  4. James 51
    Meh

    I can't help but be reminded of an early machine learning project that the US military had. They showed the system photos of NATO hardware and Soviet hardware (which shows you how old this story is). In the end the system seemed very reliable, until someone realised what had actually happened: the system had ‘learned’ to tell the difference between the good-quality photos of NATO hardware and the poorer-quality shots of Soviet hardware. The contents of the photos were irrelevant.

    1. Captain DaFt

      Then there was the one that was trained to spot tanks hiding under cover.

      It was shown pictures of the same areas, with and without tanks, and appeared to be very successful until they tried it in the field.

      It was worse than useless.

      It turned out that the day they took the pictures without tanks was sunny, and the day with tanks, cloudy.

      It was spotting cloudy skies instead of tanks. ☺

  5. bombastic bob Silver badge
    Coat

    just because you're schizophrenic...

    doesn't mean the voices aren't real.

    What, you DON'T hear voices in YOUR head? It must be lonely in there...

    "I told it to go away, and it DID, precious!"

    grabbing coat, now.

    1. Throatwarbler Mangrove Silver badge
      Trollface

      Re: just because you're schizophrenic...

      This explains so much.

    2. Captain DaFt

      Re: just because you're schizophrenic...

      What, you DON'T hear voices in YOUR head? It must be lonely in there...

      Well, I used to, but they got so pissed off with my snarky attitude that they refuse to talk to me now.

      So now I post on El Reg instead. ☺

  6. Anonymous Coward
    Anonymous Coward

    (f)MRI as a Dark Art ... ? El Reg's own opinion, echoes ...

    https://www.theregister.co.uk/2013/04/12/brain_science_low_power_junk/

    https://www.theregister.co.uk/2015/04/21/irrelevant_neuroscience_info_makes_psychological_explanations_appealing/

    https://www.theregister.co.uk/2016/07/03/mri_software_bugs_could_upend_years_of_research/

    https://www.theregister.co.uk/2016/07/07/the_great_brain_scan_scandal_it_isnt_just_boffins_who_should_be_ashamed/

    1. Anonymous Coward
      Anonymous Coward

      Re: (f)MRI as a Dark Art ... ? El Reg's own opinion, echoes ...

      Orlowski articles though, so take them with a truckload of salt.

    2. Wapiya
      FAIL

      Re: (f)MRI as a Dark Art ... ?

      fMRI is best described as "how do you want the result?" It is a case where a small dataset is not sufficient.

      The 2012 Ig Nobel Prize for neuroscience demonstrates this rather drastically.

      https://blogs.scientificamerican.com/scicurious-brain/ignobel-prize-in-neuroscience-the-dead-salmon-study/

      http://prefrontal.org/files/posters/Bennett-Salmon-2009.pdf

      Anything based on fMRI with a test base lower than a few million and a sigma lower than 5 is tarot reading from an expensive toy. fMRI is used by medical people, but the necessary evaluations have to be done on raw data by professionals in mathematics, statistics and computer science, not by just anyone who can use SPSS.

      1. Mike 137 Silver badge

        Re: (f)MRI as a Dark Art ... ?

        Not quite as dark as other methods, though. Twenty years ago I briefly worked on an attempt to use evoked potentials (brain waves) to do the same thing. I never found out whether the team as a whole finally 'succeeded' in their own terms, but the principle was fundamentally flawed due to individual variation masking the common factor of interest. However, that didn't deter them from the apparent ultimate objective of a diagnostic helmet with two lights on it: green for sane and red for mad.

  7. Chris Miller

    74% is a level of accuracy most human psychiatrists can only dream of reaching. (They'll never admit this, of course, because they're 'in denial'.)

    1. The Mole

      This was my thought. How certain are we that the 46 people all do actually have schizophrenia?

      It's not clear from scanning the paper whether the error was mostly false positives or false negatives. If it is predominantly false negatives, that could just mean the computer is 100% accurate at detecting schizophrenia, and the remaining patients have a different brain condition that presents the same set of symptoms and have therefore been misdiagnosed.

      1. Andy The Hat Silver badge

        This is a moot point: as the original diagnoses are purely subjective, it is always possible, perhaps even probable, that any individual diagnosis was incorrect. This makes the AI results, as a comparative measure, equally subjective and prone to "inaccuracy", i.e. not reaching the same conclusion as a psychologist.

        In any such "interpretive" field, do we accept AI's computational results, where results are by definition subjective, as equal to the "correct", human-derived conclusion?

  8. allthecoolshortnamesweretaken

    No AI could possibly diagnose me. And me neither.

  9. John Smith 19 Gold badge
    Unhappy

    Seems pretty low res.

    27 000 is 30^3. Not exactly down to the neuron level, is it?

    This is indeed "machine learning" in a very limited, highly mathematical way.

    "Artificially Intelligent?" I don't think so.

    Apart from false positives (or negatives), we also have the question of whether the same results are part of other mental illnesses or disorders (illnesses are treatable, disorders have to be managed), of which there are a lot.

    So it's a start, but there's a long way to go.

    BTW, all joking aside, in most cases of schizophrenia the "split" is between the patient's idea of reality and actual reality, often with the symptom of hearing voices.

    1. Pen-y-gors

      Re: Seems pretty low res.

      BTW, all joking aside, in most cases of schizophrenia the "split" is between the patient's idea of reality and actual reality, often with the symptom of hearing voices.

      So, we have 17,000,000 schizophrenics in the UK then - plus Trumpf and friends over the pond

  10. Anonymous Coward
    Anonymous Coward

    Doctors

    I went to see the doctor about my schizophrenia.

    He said he was in two minds as to whether I had it or not.

  11. Miss Config
    Holmes

    Abolishing That Popular Belief

    The question is :

    even if they managed to cure schizophrenia, how long would it take to get rid of the horribly popular misconception that schizophrenia is .......... Multiple Personality Disorder

    (when people say that being in two minds about something is 'schizophrenic')

    ???

  12. This post has been deleted by its author

  13. handleoclast

    When one person

    When one person suffers from a delusion, we call it a mental illness.

    When millions of people suffer from the same delusion, we call it a religion.

    When one person hears voices in his head telling him what to do, we call it schizophrenia.

    When millions of people hear the voice of JHVH/Allah/Jebus in their head telling them what to do, I still call it a mental illness.

    YMMV, depending upon what the voices in your head tell you to think.

  14. Anonymous Coward
    Anonymous Coward

    I don't need a machine to tell me there's a 75% chance that the psycho chasing me down the street with an axe, mumbling that he's on a mission from god, is suffering from schizophrenia. I can tell you that with 100% certainty.

  15. TRT Silver badge

    Oh! I see...

    fMRI... I was wondering how gross structural morphology was going to be a correlate of schizophrenia.

  16. TRT Silver badge

    Without reading the original paper...

    It seems to me that the protocol was "spot which of these fMRIs you've seen before."

  17. MNGrrrl
    FAIL

    Title

    "New AI tells people sane people they're crazy 25% of the time".

    FTFY

  18. Chris G

    Schizophrenia; "The voices told me to do it."

    Paranoia: "It's the fault of all the others."

    "But they are all blaming me."

    1. TheElder

      However, as usual, being paranoid may just mean they really are after you.

  19. TheElder
    FAIL

    Nonsense

    "One benefit of using code as opposed to a psychiatrist is that its results should be consistent across all patients"

    One little problem. Patients are not consistent across all patients. A good friend of mine is a forensic shrink. He deals with serial killers and similar. The only consistent thing there is that they are very good actors.

    The human brain follows chaos theory, same as the weather. One can predict the weather with 75% accuracy by saying it will be the same as today.

  20. Anonymous Coward
    Anonymous Coward

    Things Are Better Now

    When my wife complains that I never hear what she says, I ask how I am supposed to differentiate her voice from all the others in my head. Get my attention first, then speak.

    The most fascinating aspect of the disease for me is that if you live in a third-world country without drug treatment, you may get better.
