A new artificial intelligence system can spot the tell-tale signs of skin cancer just as accurately as human doctors, say researchers, and the next step is to get the tech on a smartphone, so anyone can run a self-diagnosis.
Once the system is refined further and becomes portable, it could give many more people the chance to get screened with minimal cost, and without having to wait for an appointment with a doctor to confirm the symptoms.
The Stanford University researchers behind the deep learning system say the key to its success is an algorithm that enables it to apply what it knows from its existing database of skin cancer samples to pictures it hasn’t seen before.
“We made a very powerful machine learning algorithm that learns from data,” says one of the team, Andre Esteva. “Instead of writing into computer code exactly what to look for, you let the algorithm figure it out.”
To give the system its smarts, the researchers trained it using 129,450 close-up images of skin lesions covering more than 2,000 different diseases, providing a vast database of examples to learn from.
Next, the team borrowed an algorithm developed by Google to spot the difference between cats and dogs in images, and adapted it to tell the difference between skin marks.
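The approach described here is known as transfer learning: a network pretrained on general images is kept fixed, and only a small classifier on top is trained for the new task. The sketch below is a heavily simplified, hypothetical illustration of that idea, not the Stanford team's actual code – a fixed random projection stands in for the pretrained network, and synthetic data stands in for the lesion images.

```python
import numpy as np

# Minimal sketch of transfer learning: a frozen "pretrained" feature
# extractor plus a small trainable classifier head for the new task
# (e.g. benign vs. suspicious). All data here is synthetic.

rng = np.random.default_rng(0)

# Frozen feature extractor: in the real system this is a deep CNN
# pretrained on general images; a fixed random matrix is a stand-in.
W_frozen = rng.normal(size=(64, 16))

def extract_features(x):
    # The frozen weights are never updated during training.
    return np.tanh(x @ W_frozen)

# Synthetic stand-in data: 200 "images" of 64 pixels, binary labels.
X = rng.normal(size=(200, 64))
true_w = rng.normal(size=64)
y = (X @ true_w > 0).astype(float)

# Trainable head: logistic regression on the frozen features.
w, b = np.zeros(16), 0.0
feats = extract_features(X)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # predicted probability
    grad_w = feats.T @ (p - y) / len(y)         # gradient of the log-loss
    grad_b = np.mean(p - y)
    w -= 0.5 * grad_w
    b -= 0.5 * grad_b

accuracy = np.mean((p > 0.5) == (y == 1))
print(f"training accuracy: {accuracy:.2f}")
```

The key design point is that only `w` and `b` are updated; the feature extractor's knowledge carries over from its original training, which is why a relatively modest labelled dataset can still produce a capable classifier.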
They put their new system up against 21 qualified dermatologists, who were shown 376 images of skin lesions and asked to judge if they would refer the patient for further analysis, or give them the all-clear.
Across the board, the AI was able to match the success rate of the professionals.
But the technology isn’t designed to replace doctors – the researchers stress that it’s designed to give people easier access to the first two screening stages before getting expert help.
Spotting the difference between a deadly lesion and a benign one is no easy task, which makes the efforts of the AI system even more impressive.
But the researchers are cautious about releasing the tool to the public until they are confident it won't produce false assessments, and real-world clinical testing should help refine it further.
Eventually, the team wants to make their device available through a phone app, so anyone can use it.
“My main eureka moment was when I realised just how ubiquitous smartphones will be,” says Esteva.
“Everyone will have a supercomputer in their pockets with a number of sensors in it, including a camera. What if we could use it to visually screen for skin cancer? Or other ailments?”
We’re now seeing numerous AI-powered programs and apps showing up on phones, giving us cheap and easy ways to assess our health at home – and that has to be better than just typing a few symptoms into Google.
And as with many other diseases, early diagnosis of skin cancer is crucial: if spotted early, 10-year survival rates are around 95 percent, but that drops to 10–15 percent if the cancer has reached its later stages before being treated.
Consultant dermatologist Anjali Mahto, spokesperson for the British Skin Foundation, told Nicola Davis at The Guardian that the findings were encouraging.
“This is an exciting new technology that has the potential to increase access to dermatology at a time where there is a national shortage in this speciality and the rates of skin cancer continue to rise,” she said.
The research has been published in Nature.