Hmmmm
Apparently one system that had been spectacularly good at spotting cancer turned out to have learned that a lab slide barcode in shot was a good predictor that an image contained a carcinoma ...
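For the curious, the usual way that kind of shortcut gets caught is by hiding the suspect region and seeing whether the model's accuracy collapses. A toy sketch below, assuming a made-up `model` callable and invented barcode coordinates - nothing from the actual study:

```python
import numpy as np

def mask_region(image: np.ndarray, top: int, left: int, h: int, w: int) -> np.ndarray:
    """Return a copy of the image with a rectangle blanked out."""
    masked = image.copy()
    masked[top:top + h, left:left + w] = 0
    return masked

def shortcut_check(model, images, labels, barcode_box=(0, 0, 64, 256)):
    """If accuracy collapses once the barcode area is hidden, the model was
    probably keying off the barcode, not the tissue."""
    acc_raw = np.mean([model(img) == y for img, y in zip(images, labels)])
    acc_masked = np.mean(
        [model(mask_region(img, *barcode_box)) == y for img, y in zip(images, labels)]
    )
    return acc_raw, acc_masked

# Toy usage: a fake "model" that only looks at the corner the label was derived from.
toy_images = [np.random.rand(512, 512) for _ in range(10)]
toy_labels = [int(img[0, 0] > 0.5) for img in toy_images]
toy_model = lambda img: int(img[0, 0] > 0.5)
print(shortcut_check(toy_model, toy_images, toy_labels))
```

If `acc_masked` drops to chance while `acc_raw` looked spectacular, the "cancer detector" was really a barcode detector.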
Embracing the chatbot standard of unreliable information, Google has updated the Lens image recognition feature in its eponymous iOS and Android apps to possibly identify skin conditions. "Describing an odd mole or rash on your skin can be hard to do with words alone," said Lou Wang, senior director of product management for …
Some people will unnecessarily worry when a benign condition is indicated to be something that could kill them; others will decide they don't need to see a doctor when it incorrectly diagnoses skin cancer as something that can be treated with over-the-counter ointments.
You are right. And, to go back to a medical story earlier here on El Reg: growing practice waiting lines, for which the solution seems to be patients doing the webcam thing. Technology to the rescue. But...
"Welcome to the era of unreliable, disclaimed products, just as we're getting acclimated to social media misinformation for which platforms aren't really accountable.
...for me a(nother) big issue lies with this. In essence it is the same as me diagnosing you in my practice and, at the end of the consult, telling you: "Oh, and forget what I said. I know you came to me for a consult, but hey, I'm just winging it, have no clue, nor am I qualified to do so. You do know, BTW, that you, or your family when you're dead, can't quote me on anything, right? So yeah, I suppose your guess is just as good as mine." I think many would not feel happy if I ran my practice like that. Well, the Alphies and their sheeple seem to differ. And then we are not even going into the whole "liability" area that physicians have but big US tech doesn't seem to. Whether it is you dying due to a misdiagnosis or me saving your (at least IP-marked) medical record/picture...
But I'm sure it says "beta" too, just like lots of things in the computer world.
There should be laws about selling things with beta-level software that you're basically forced to rely on.
I bought a Tesla, without the self-driving software. But the basic cruise control functions are in beta; you basically have to accept that to be able to drive the car...
Felt a bit weird.
Not at all, because getting to a dermatologist takes weeks to months in relatively high-income countries, and one is probably not available at all in other countries or small towns. In my experience, even if you get to see one, their diagnosis is often incorrect or absent, at least for non-cancer-related conditions. So I would not give dermatologists much credit for detecting early-stage cancer, as they would probably order a biopsy only if something is alarmingly wrong (and late?). Maybe much more so in the case of an experienced oncologist, but getting that far in the chain of medical referrals is unlikely to happen at the initial stages.
Whether they add a caveat to it or not, this is functioning as a medical device and should be registered as such.
Massive implications, e.g. if it diagnoses a melanoma as benign: game over for the patient with no repercussions for Google.
There is a reason medical devices are regulated. Google is playing doctor but with no oversight. Totally irresponsible.
We all know what's going to happen when Google (mis)diagnoses even a hint of a possible carcinoma. Sell the information to every Tom, Dick and Harry agency they can, including Medical Insurance companies.
Welcome to mysteriously, massively hiked med insurance fees, or no cover offered at all.
Actually, Google's idea is good, because a large part of the world's population does not have access to medical services; even many US citizens have insufficient health insurance. So why is self-diagnosis bad, if the error rate is reasonable and skewed towards false positives, for example?
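To put rough numbers on "skewed towards false positives", here is a back-of-the-envelope sketch with invented figures (95% sensitivity, 80% specificity, 2% prevalence - not anything Google has published):

```python
def triage_outcomes(population, prevalence, sensitivity, specificity):
    """Return (true_pos, false_neg, false_pos, true_neg) counts."""
    sick = population * prevalence
    healthy = population - sick
    tp = sick * sensitivity           # real cancers correctly flagged
    fn = sick - tp                    # cancers missed (the dangerous error)
    fp = healthy * (1 - specificity)  # benign lesions flagged anyway
    tn = healthy - fp
    return tp, fn, fp, tn

tp, fn, fp, tn = triage_outcomes(100_000, 0.02, 0.95, 0.80)
ppv = tp / (tp + fp)   # chance a "see a doctor" flag is a real cancer
npv = tn / (tn + fn)   # chance an "all clear" really is all clear
print(f"missed cancers: {fn:.0f}, false alarms: {fp:.0f}")
print(f"PPV: {ppv:.1%}, NPV: {npv:.1%}")
```

With those made-up numbers you get roughly 19,600 false alarms and about 100 missed cancers per 100,000 users: an "all clear" is almost always right, but only about 9% of "see a doctor" flags are real cancers, which is exactly the worry-and-overload trade-off other posters describe.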
Some uses of AI in this space could actually work - but only when combined with experts. For example, high-quality digitisation would mean dermatologists and pathologists can share opinions and collaborate in real time, without having to wait for physical biopsies and slides to arrive; beyond that, the "simple" assessments (99%+ accuracy benign or malignant) could be removed from human assessment, maximising expert time on the complex, nuanced cases.
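A minimal sketch of that "auto-triage only the easy cases" idea, with a placeholder model output and an assumed 0.99 cut-off (the names and numbers are mine, not from any real pathology system):

```python
from dataclasses import dataclass

AUTO_THRESHOLD = 0.99  # assumed cut-off for "simple" assessments

@dataclass
class Assessment:
    slide_id: str
    p_malignant: float  # model's estimated probability of malignancy

def route(assessment: Assessment) -> str:
    """Auto-report only very-high-confidence calls; everything else
    goes into the human expert's queue."""
    if assessment.p_malignant >= AUTO_THRESHOLD:
        return "auto-report: malignant"
    if assessment.p_malignant <= 1 - AUTO_THRESHOLD:
        return "auto-report: benign"
    return "queue for expert review"

for a in [Assessment("S1", 0.998), Assessment("S2", 0.42), Assessment("S3", 0.004)]:
    print(a.slide_id, "->", route(a))
```

The point of that design is that the experts only ever see the uncertain middle band, while the auto-reported cases keep an auditable confidence score attached.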
Later, samples can be matched with outcomes, but this has repercussions for GDPR and Personally Identifiable Information relating to health records, as well as IP and record stewardship. The urge to monetise these knowledge sets would be strong and should be rejected... Hannah Fry (of Rutherford and Fry fame) recorded an excellent piece on this on BBC Sounds - https://www.bbc.co.uk/sounds/brand/m001mdn2