Technology appears ‘on par’ with dermatologists for identification of benign, malignant skin lesions
A deep convolutional neural network developed by researchers at Stanford University effectively identified and diagnosed benign and malignant skin lesions via images uploaded to a computer, according to study results published in Nature.
Although benign and malignant skin lesions primarily are diagnosed through visual clinical screening by dermatologists, often followed by biopsy and histopathological examination, deep convolutional neural networks have demonstrated potential in this area, according to background information provided in the study.
Andre Esteva, a PhD candidate in the department of electrical engineering at Stanford University and a member of Sebastian Thrun's group in the Stanford Artificial Intelligence Laboratory, is working to develop deep-learning algorithms that automatically track and diagnose skin cancers from images of patients.
For the study, investigators gathered data sets from open-access dermatology repositories, including the ISIC Dermoscopic Archive and the Edinburgh Dermofit Library, as well as from Stanford Hospital.
Esteva and colleagues assessed the use of a single convolutional neural network for classifying skin lesions using 129,450 clinical images spanning 2,032 different diseases, and compared its performance with that of 21 board-certified dermatologists.
The convolutional neural network correctly identified and classified skin lesions at a rate comparable to that of the board-certified dermatologists on the same image-recognition task.
Esteva spoke with HemOnc Today about the technology and its potential.
Question: Can you describe how the technology works?
Answer: We use a type of machine-learning algorithm known as a convolutional neural network — or deep neural network — that is trained to recognize single skin lesions as benign or malignant. It is inspired, on a very high level, by how the brain works. Stacked layers of computational units convert a standard RGB image into a probability distribution over classes of interest — for instance, 12% benign and 88% malignant — which is used to classify an image. The network is initially trained on 1.28 million images of ordinary everyday objects, such as cats and dogs, and then further trained using 130,000 images of the skin, comprising 2,032 different diseases.
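The pipeline Esteva describes, stacked layers converting an RGB image into a probability distribution over classes, can be illustrated with a toy sketch. This is not the network used in the study (which was a large pretrained architecture fine-tuned on the skin-lesion data); the weights below are random and the layer sizes are arbitrary, chosen only to show how a convolution, a pooling step and a softmax classifier compose into class probabilities.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d_relu(image, kernels):
    """Valid convolution of an HxWxC image with K kernels, then ReLU."""
    h, w, _ = image.shape
    k, kh, kw, _ = kernels.shape
    out = np.zeros((h - kh + 1, w - kw + 1, k))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            patch = image[i:i + kh, j:j + kw, :]
            # Dot each kernel with the image patch to get K feature responses.
            out[i, j, :] = np.tensordot(kernels, patch, axes=([1, 2, 3], [0, 1, 2]))
    return np.maximum(out, 0.0)

def softmax(z):
    """Turn raw scores into a probability distribution summing to 1."""
    e = np.exp(z - z.max())
    return e / e.sum()

image = rng.random((16, 16, 3))              # stand-in for an RGB lesion photo
kernels = rng.standard_normal((4, 3, 3, 3))  # 4 "learned" 3x3 filters (random here)
features = conv2d_relu(image, kernels).mean(axis=(0, 1))  # global average pooling
W = rng.standard_normal((2, 4))              # final 2-class layer: benign vs. malignant
probs = softmax(W @ features)

print({"benign": float(probs[0]), "malignant": float(probs[1])})
```

In the real system the filter weights and classifier are learned, first from the 1.28 million everyday images and then from the 130,000 skin images, rather than drawn at random, and the output distribution ranges over many more than two classes.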
Q: What did the study show?
A: We demonstrated that a neural network trained in this fashion matches the performance of board-certified dermatologists at the task of image recognition. That is, given nothing more than a picture of a skin lesion, the convolutional neural network is on par with experts at identifying malignancies.
Q: How close do you think we are to this technology being used in the real-world setting?
A: Before this technology is ready for real-world usage, it must be clinically validated. The algorithm must be exposed to many more images that better encompass the full spectrum of conditions seen in practice. Nevertheless, this is a promising first step.
Q: What research still must be done to perfect this approach?
A: Rigorous clinical trials must expose the algorithm to considerably more skin conditions and skin types. At the moment, we cannot comment on our next research steps, as we are just reporting our current results.
Q: This technology may be most beneficial for people who live in parts of the world that do not allow for easy access to medical professionals, but what is your expectation for this type of approach in the United States?
A: Our hope has been that this sort of technology could become a tool to expand the reach of providers outside of the clinic, and potentially provide low-cost universal access to vital diagnostic care. In the United States, this sort of technology could be used to triage patients and provide immediate care when malignancies arise.
Q: Is there anything else that you would like to mention?
A: From the beginning, this project has been a collaboration between Stanford University's dermatology department and the Stanford Artificial Intelligence Laboratory. It is a community effort that has taken into account viewpoints from both engineers and physicians, and it will continue to be this way as we move forward. We welcome the medical community's feedback as to the best way to utilize this technology. – by Jennifer Southall
Esteva A, et al. Nature. 2017;doi:10.1038/nature21056.
For more information:
Andre Esteva can be reached at Stanford University, 450 Serra Mall, Stanford, CA 94305; email: firstname.lastname@example.org.
Disclosure: Esteva reports no relevant financial disclosures.