Why build your own cancer-sniffing neural network when this 1.3 exaflop supercomputer can do it for you?

The world’s fastest deep learning supercomputer is being used to develop algorithms that can help researchers automatically design neural networks for cancer research, according to the Oak Ridge National Laboratory. The World Health Organisation estimates that by 2025, the number of diagnosed new cases of cancer will reach 21. …

  1. DCFusor

    Forgive my ignorance, but aren't NNs pretty quick to run unless they are outrageously big? Back when I messed with such things (the '90s), most PCs could run one of a decent size, and a PC of today would be loafing through a "test this one case" workload. OK, a central supercomputer can manage on the order of thousands per second, but what about the doctor's office? Do they really need more than one result every few minutes, or even that?

    Heck, how many cancer tests per country per second are there anyway? A million tests could be done in 1,000 seconds at 1k/sec. That's just nuts. Please check my math...
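The commenter's back-of-the-envelope arithmetic holds up; a quick check, using their own hypothetical figures (the volume and throughput below are their illustrative numbers, not real screening statistics):

```python
# Checking the commenter's arithmetic with their hypothetical figures:
# one million tests processed at 1,000 inferences per second.

tests = 1_000_000        # hypothetical national screening volume
rate_per_sec = 1_000     # hypothetical inference throughput

seconds = tests / rate_per_sec
print(seconds)           # 1000.0 seconds, i.e. under 17 minutes
```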

    Yes, NNs are a bear to train, and trying lots of different models and training each one makes that load extreme; that is supercomputer turf. But I see no mention of looking past the cool, expensive toy to real-world use in a specialist's office. Does everything have to be a subscription model on a centralized mainframe (again)? Watching the wheel turn is getting boring...

    1. TheSkunkyMonk

      Isn't the power for creating the software? AI making AI and all that sci-fi horror stuff?

  2. Count Ludwig

    Role of humans in a semi-automated detection system

    If you have a semi-automated cancer detection system then you need to pay careful attention to the job of the humans, otherwise you can render them useless and possibly kill more people than if you had not used the system at all.

    That seems counter-intuitive - how is this possible? Apologies for the long explanation:

    Imagine a haystack that contains 1) some shiny needles, 2) some not-so-shiny needles, 3) some needles that could possibly be mistaken for straw, 4) quite a lot of straw that looks like needles, and 5) a massive load of straw that is obviously straw.

    Any needle that is not found in time will go rusty.

    Up till now you have had humans looking for the needles - a mind-numbing job but one that requires high qualification. They've been able to do it, kind of, so far, but it would be great if we could improve the situation.

    Now you get a machine to find and remove the shiniest needles, and the not so shiny needles.

    You are left with needles that look like straw, straw that looks like needles, and a load of stuff that is obviously straw.

    This leaves the humans with the job of sorting through a load of straw to find a few needles that look like straw.

    Those needles will probably not be found, and will go rusty. And the highly qualified humans whose mind-numbing job has just got far worse will go mad or leave.

    Better for the automated system to remove what is obviously straw and leave the interesting cases for the highly qualified humans.

    The proof is in the measuring: does the rate of cancer deaths go down when system X is used?

    We would normally measure this for a fully automated system anyway, but my point is that it's even more necessary to measure it for a semi-automated system, given the temptation to assume that adding flashy computers is bound to be better than using humans alone (also consider the high staff turnover in such a semi-automated system).
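The haystack argument above can be sketched as a toy simulation. Everything here is an illustrative assumption of mine, not anything from the article or the commenter: each case gets a random "obviousness" score, and we compare a machine that removes the obvious needles (policy A) against one that removes only the obvious straw (policy B).

```python
import random

random.seed(0)

# Toy model of the haystack argument. Each case is (is_needle, obviousness),
# where obviousness in [0, 1] is how easy the case is to call correctly.
# The case count, needle rate, and 0.5 "obvious" threshold are all
# illustrative assumptions, not real screening statistics.

def make_cases(n=10_000, needle_rate=0.02):
    return [(random.random() < needle_rate, random.random())
            for _ in range(n)]

cases = make_cases()

# Policy A: the machine finds and removes the shiny (obvious) needles,
# leaving humans the hard needles buried in the full pile of straw.
policy_a = [c for c in cases if not (c[0] and c[1] > 0.5)]

# Policy B: the machine removes only what is obviously straw, leaving
# every needle for the humans but in a far smaller pile.
policy_b = [c for c in cases if not (not c[0] and c[1] > 0.5)]

def needle_density(pile):
    """Fraction of the remaining pile that is actually a needle."""
    return sum(1 for is_needle, _ in pile if is_needle) / len(pile)

print(f"Policy A: {len(policy_a)} cases left, "
      f"{needle_density(policy_a):.1%} needles")
print(f"Policy B: {len(policy_b)} cases left, "
      f"{needle_density(policy_b):.1%} needles")
```

Under policy B the humans face a much smaller pile with a higher needle density, and no needle has been taken out of their view; under policy A every needle still in the pile is, by construction, a hard one.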


    1. This post has been deleted by its author
