Computers are learning to distinguish whether skin lesion images show benign growths or malignant ones that require further treatment. A study suggests they are on par with, if not more accurate than, many dermatologists.
Stanford University researchers compared a computer-driven deep neural network with dermatologists’ ability to visually classify possible melanoma and basal cell or squamous cell carcinoma lesions.
“We recruited 21 board-certified dermatologists and showed them 300 images, where they had to classify the lesions as benign or malignant, and whether they would biopsy or reassure the patient,” said the study’s lead author, Roberto A. Novoa, MD, of Stanford University in California. “From there, the algorithm performed about as well as the dermatologists, if not better. There were dermatologists who did perform better than the algorithm but, in general, this was a proof of concept study, so … we were demonstrating the efficacy of these algorithms for making the diagnosis.”
Computers That Think Like Humans
Deep neural networks are a type of algorithm from the field of artificial intelligence, which uses computers to mimic aspects of brain function, including reasoning.
While it might sound like a new concept, it’s not. Researchers reported on the potential for computers to diagnose facial tumors in 1986.
“The idea for neural networks has been around since the 1960s,” Novoa said. “But it was only in the last 5 years that the computing power and technological capabilities caught up to the math.”
Today’s neural networks take in vast amounts of information. They then learn the rules that lie behind the data to derive patterns and, eventually, correct answers, Novoa explained.
“Essentially, deep neural networks are mathematical equations that are stacked in layers. They start at the most basic level by learning the edges of all the objects in an image. Then, they move on to telling you this is a triangle or a square. The next layer might indicate whether an image is a cat or a dog. Finally, there’s a layer that says this is a Belgian Malinois or a German Shepherd … If it gets the answer wrong, it goes back through the equation and changes the values and weights of that equation until it gets the most answers correct for the most number of images. By doing so, it learns, over time, what’s important and what’s not.”
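The error-driven weight adjustment Novoa describes can be sketched in miniature. The toy classifier below is a single logistic "neuron" trained on invented two-feature data (not the Stanford model or its data): a forward pass makes a guess, and each wrong guess nudges the weights, which over many passes is the "changes the values and weights of that equation" step he mentions.

```python
import math

def sigmoid(z):
    """Squash a weighted sum into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))

# Invented data: (feature1, feature2) -> label (1 or 0).
# Points with a large first feature belong to class 1.
data = [((2.5, 0.3), 1), ((3.1, 0.8), 1), ((0.4, 1.2), 0), ((0.2, 0.5), 0)]

w1, w2, b = 0.0, 0.0, 0.0   # weights start knowing nothing
lr = 0.5                    # learning rate: how hard each mistake nudges

for epoch in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w1 * x1 + w2 * x2 + b)   # forward pass: current guess
        err = p - y                          # how wrong the guess was
        # Backward pass: adjust each weight to shrink the error.
        w1 -= lr * err * x1
        w2 -= lr * err * x2
        b -= lr * err

# After training, print predicted label next to the true label.
for (x1, x2), y in data:
    print(round(sigmoid(w1 * x1 + w2 * x2 + b)), y)
```

A deep network stacks thousands of such units into layers and repeats the same guess-and-adjust loop over millions of images, but the principle is the one in this sketch.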
From Dogs to Skin Cancer
The genesis of the research, published in Nature last year, began 3 years ago, when Novoa saw advances in the field of deep learning that let these computer programs differentiate on their own between Belgian Shepherds and German Shepherds.
“I thought if we can do this for dogs, we can do this for skin cancer,” he said.
Novoa and his colleagues at Stanford gathered a dataset of nearly 130,000 images from the internet, including open source images and images from Stanford databases. The researchers went through the data and created a visual taxonomy for the more than 2,000 disease categories, whittling those down to 10 general categories. The team then used the dataset to train the deep neural network's classifier.
The researchers measured the algorithm’s performance by creating a sensitivity-specificity curve. Sensitivity represented the algorithm’s ability to correctly identify malignant lesions, and specificity represented its ability to accurately identify benign lesions. Assessing the algorithm through the diagnostic tasks of keratinocyte carcinoma classification, melanoma classification, and melanoma classification when viewed using dermoscopy, the researchers found that the algorithm matched the dermatologists’ performance in all three tasks, with the area under the sensitivity-specificity curve amounting to at least 91% of the total area of the graph, according to a Stanford news release.
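The sensitivity-specificity bookkeeping described above can be illustrated with made-up numbers. The sketch below (toy scores and labels, not the study's data) computes sensitivity and specificity at one decision threshold, and an area-under-the-curve value via the rank-statistic formulation, which equals the area traced out by sweeping the threshold.

```python
# Hypothetical malignancy scores from a classifier, paired with
# biopsy-proven labels (1 = malignant, 0 = benign). Illustrative only.
scores = [0.95, 0.90, 0.80, 0.70, 0.60, 0.40, 0.35, 0.20, 0.10, 0.05]
labels = [1,    1,    1,    0,    1,    0,    1,    0,    0,    0]

def sens_spec(threshold):
    """Sensitivity and specificity when flagging scores >= threshold."""
    tp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 1)
    fn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 1)
    tn = sum(1 for s, y in zip(scores, labels) if s < threshold and y == 0)
    fp = sum(1 for s, y in zip(scores, labels) if s >= threshold and y == 0)
    return tp / (tp + fn), tn / (tn + fp)

# Area under the curve: the probability that a randomly chosen
# malignant lesion outscores a randomly chosen benign one.
pos = [s for s, y in zip(scores, labels) if y == 1]
neg = [s for s, y in zip(scores, labels) if y == 0]
auc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))

print(sens_spec(0.5))   # (sensitivity, specificity) at one threshold
print(auc)
```

Raising the threshold trades sensitivity for specificity; the area under the resulting curve summarizes performance across every possible trade-off, which is why a single percentage can stand in for the whole graph.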
The deep neural network, in this case, had been trained to classify skin cancer, but the study was done on images of skin cancer that the network had not yet seen. “If you test it on images it has already seen, it already knows the answer,” Novoa said.
The images in the study already had biopsy-proven results. This allowed the researchers to know the true answers when testing the deep neural network and dermatologists.
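This held-out testing can be sketched as a simple split. The snippet below uses invented lesion ids and labels (not the study's files) to show the one rule that makes the evaluation honest: no image appears in both sets.

```python
import random

random.seed(7)   # reproducible toy example

# Stand-in for the image collection: ids with biopsy-proven labels.
# Illustrative only; the study's real dataset held ~130,000 images.
dataset = [(f"lesion_{i:03d}", random.choice(("benign", "malignant")))
           for i in range(100)]

random.shuffle(dataset)
split = int(0.8 * len(dataset))
train, test = dataset[:split], dataset[split:]   # 80 train / 20 held out

# Training touches only `train`; `test` stays unseen, so the score
# measures generalization rather than memorization.
train_ids = {img_id for img_id, _ in train}
test_ids = {img_id for img_id, _ in test}
print(len(train), len(test), train_ids.isdisjoint(test_ids))
```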
The next step is to test the algorithm in the real world, Novoa said. “We’re currently putting together a clinical trial to see how well it will perform in the real world. We want to see how well it performs with real patients, with a smartphone camera and a variety of lighting.”
There have been other studies using artificial intelligence and deep neural networks to diagnose melanoma, but this one is different in a few ways, Novoa explained. “Most of the other studies were performed using dermoscopic images. Dermoscopy kind of limits the number of variables. It’s always taken from the same distance and with pretty similar lighting.
“This was looking at clinical images, but they were from a variety of different distances and angles. This also was looking at melanoma, squamous cell carcinoma, and basal cell carcinomas. So, there was a wider group of images. We used a pretty large dataset — it’s among the larger datasets we’ve seen to date.”
The basic idea of neural networks is that they do pattern recognition, just as a dermatologist would learn patterns and visual data, Novoa said.
Deep neural networks and artificial intelligence may have a growing role in dermatology to help dermatologists provide better care, but not to replace the need for human expertise. While the computer algorithms might be used to triage or screen patients, ultimately a healthcare professional will take responsibility (and liability) for the care.
Algorithms, though, still need improvement.
“If there are biases in the dataset, these can be introduced into the algorithm results,” he continued. “So, for a long time there is going to be a need for supervision of these results in order to have things work optimally. For example, if the algorithm has a ruler in the image, it’s more likely to call it cancer. Why? Because on average, images in our dataset that have rulers are more likely to be malignant. That’s just a small example of the kinds of biases that can be introduced into the data.”
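The ruler bias Novoa describes is easy to illustrate with invented counts. The sketch below (hypothetical metadata, not the study's dataset) is the kind of sanity check that reveals when an incidental feature correlates with the label strongly enough for a model to shortcut on it.

```python
# Hypothetical metadata: does each image contain a ruler, and what is
# its biopsy label? The counts are made up to make the confound obvious.
images = (
    [{"ruler": True, "label": "malignant"}] * 40 +
    [{"ruler": True, "label": "benign"}] * 10 +
    [{"ruler": False, "label": "malignant"}] * 15 +
    [{"ruler": False, "label": "benign"}] * 35
)

def malignant_rate(subset):
    """Fraction of a subset whose biopsy label is malignant."""
    return sum(img["label"] == "malignant" for img in subset) / len(subset)

with_ruler = [img for img in images if img["ruler"]]
without_ruler = [img for img in images if not img["ruler"]]

# A large gap warns that "ruler present" predicts the label on its own,
# so a model can score well by learning the ruler, not the lesion.
print(malignant_rate(with_ruler), malignant_rate(without_ruler))
```

Auditing the dataset for gaps like this, before and after training, is one concrete form the "supervision of these results" can take.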
The near future of the technology has far-reaching implications, including the potential development of a smartphone-compatible algorithm that would let consumers photograph a lesion and have the phone screen the image for the probability of skin cancer.
“Technology has been changing medicine and all of human endeavors for hundreds of years, but it hasn’t eliminated the need for doctors,” Novoa said. “I don’t think we’re going anywhere. I think dermatologists will be able to adapt and take advantage of these technologies to take care of patients.”
This article originally appeared on the website of our partner Dermatology Times, which is a part of UBM Medica.