
Why go to the hospital at all? Just take the photo at home.


As someone with a phenotype more subject to skin cancer, I have been using an app called SkinVision for years.

From my anecdotal experience it tends to flag the same moles as dermatologists do, and they have actual dermatologists review images where the model has low confidence, so overall pretty happy with it.

Note that I am not affiliated with them in any way.


Interesting information snipped from their website below. I'm in Australia so will try it out. Yes, it specifically says it's not available in the US.

About SkinVision SkinVision was founded in 2012 and provides a mobile phone application, which supports individuals with the early detection of the most common forms of skin cancer (melanoma, squamous cell carcinoma, basal cell carcinoma, and precancerous actinic keratosis). SkinVision is the first CE marked skin cancer application based on extensive clinical trials, conducted in partnership with Erasmus Medical Center (EMC) and the university clinic of Ludwig Maximilian University (LMU). Research shows the app has a sensitivity of 95% and a specificity of 78%. The SkinVision app is commercially available worldwide on iOS and Android except for a few countries, such as the United States and Canada. SkinVision is based in Amsterdam, the Netherlands.

The SkinVision Service is a Medical Device and is registered with the Australian Therapeutic Goods Administration (TGA).
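Worth noting what 95% sensitivity / 78% specificity means in practice: at the low prevalence typical of self-screening, most positives will be false positives. A quick sketch (the prevalence figure is an illustrative assumption, not from SkinVision's trials):

```python
# Rough positive predictive value (PPV) from the quoted figures:
# sensitivity 95%, specificity 78%. The prevalence is a made-up
# illustrative number, not from SkinVision's studies.
sensitivity = 0.95
specificity = 0.78
prevalence = 0.01  # assume 1% of scanned lesions are malignant

true_pos = sensitivity * prevalence
false_pos = (1 - specificity) * (1 - prevalence)
ppv = true_pos / (true_pos + false_pos)
print(f"PPV at {prevalence:.0%} prevalence: {ppv:.1%}")
# → PPV at 1% prevalence: 4.2%
```

That's fine for a triage tool (the 95% sensitivity is the number that matters for not missing cancers), but it explains why a positive result means "see a dermatologist", not "you have cancer".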


Looks like it’s not available in the US unfortunately.


One of the problems with these sorts of machine learning applications, including this exact use case elsewhere, is that they have been extremely sensitive to the imaging equipment used. Train it on a dataset of images from one source and it is only accurate on images from that source. Possibly only accurate on images from the exact same device. For home use, it needs a huge training set of images taken by all sorts of devices in all sorts of lighting conditions. And then the system will need to be improved until the error rates become useful.
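This device sensitivity is easy to miss if you only report a single aggregate accuracy. Breaking evaluation out by capture device exposes it; a minimal sketch (the data and device names are hypothetical):

```python
from collections import defaultdict

# Hypothetical per-image evaluation records: (capture_device, model_correct).
# In a real evaluation these would come from a labelled test set.
results = [
    ("clinic_dermatoscope", True), ("clinic_dermatoscope", True),
    ("clinic_dermatoscope", True), ("clinic_dermatoscope", False),
    ("phone_cam_a", True), ("phone_cam_a", False),
    ("phone_cam_b", False), ("phone_cam_b", False),
]

by_device = defaultdict(lambda: [0, 0])  # device -> [correct, total]
for device, correct in results:
    by_device[device][0] += int(correct)
    by_device[device][1] += 1

for device, (correct, total) in sorted(by_device.items()):
    print(f"{device}: {correct}/{total} = {correct / total:.0%}")
```

A model that looks great in aggregate can score near-random on devices absent from the training set, which is exactly the failure mode described above.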


4 years ago I had my partner take photos of moles that had changed colour on my back. I used the MySkinDoctor app https://www.myskindoctor.co.uk/ to send them to my consultant dermatologist (who, at the time, I had never met -- having just moved to a new area). Upon review they arranged for an in-person appointment and I was seen within a week or so. All through the NHS.

So yes, taking the photo at home is perfectly doable. I still went to hospital though.


I develop the backend for that. Nice to hear it worked well for you.


Google announced this in 2021, but never released it AFAICT: https://blog.google/technology/health/ai-dermatology-preview...


I assume if you get a positive response you will want to speak with a doctor about it.


If the thermometer says 39 C/102 F I will want to speak with a doctor about it but that doesn't mean I want to go to the hospital every time I want my temperature checked.

I suspect it has a lot more to do with these lines:

> medical photographers taking photos of suspicious moles and lesions

I.e. it might not be ready enough or validated for an average person snapping their own photo and:

> The images are then transferred to a desktop computer for greater analysis before the tool determines the result

There is more to it than the phone app and it may not be packaged in a way that is currently worthwhile to distribute to home users.

Both of these (and other things) may change with time of course.


Article says medical photographers take the pictures. I'd guess a future iteration will have the app direct the patient to take the pictures.


Certain skin cancer detection methods necessitate direct contact with the affected area to identify any abnormalities.


Because you don't know what the lighting is, what the sensor is, and what kind of shit the camera app and image signal processor (ISP) are doing to the image without telling you.

I recently tried to show some images to a vet. Something in my phone fucked up the amount of red in the image, making them useless (guess what, figuring out how much blood is present is pretty important for medical applications)

Probably at some point we will all have a separate medical camera with a specified colour response and a specified LED illuminator. Apple will probably get their phone certified in some medical camera mode. Right now I don't think phone cameras can be trusted.

Not all applications need accurate colour (no idea about cancer checks), but some really do.
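The kind of silent colour shift described above is easy to simulate: scale one channel the way an aggressive white-balance pass might, and any redness-based measurement moves with it (all the numbers here are made up for illustration):

```python
# Simulate a camera pipeline quietly rescaling the red channel and
# watch a naive "redness" metric drift. Values are illustrative only.
pixel = (180, 90, 80)  # (R, G, B) of a reddish lesion pixel

def redness(rgb):
    """Fraction of total intensity contributed by the red channel."""
    r, g, b = rgb
    return r / (r + g + b)

# Hypothetical auto-white-balance step multiplies red by 0.8.
processed = (int(pixel[0] * 0.8), pixel[1], pixel[2])

print(f"raw redness:       {redness(pixel):.2f}")        # 0.51
print(f"processed redness: {redness(processed):.2f}")    # 0.46
```

The shift happens before the app ever sees the pixels, so no amount of downstream calibration in the model can recover the original values.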



