Artificial intelligence identifies prostate cancer with near-perfect accuracy
July 27, 2020


Dhir and his colleagues provided images from more than a million parts of stained slides taken from patient biopsies. Each image was labeled by expert pathologists to teach the AI how to discriminate between healthy and abnormal tissue. The algorithm was then tested on a separate set of 1,600 slides taken from 100 consecutive patients seen at UPMC for suspected prostate cancer.

During testing, the AI demonstrated 98% sensitivity and 97% specificity at detecting prostate cancer, significantly higher than previously reported for algorithms working from tissue slides.
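The headline numbers are standard binary-classification metrics. A minimal sketch of how they are computed from a confusion matrix, using hypothetical counts chosen only to illustrate the reported values (this is not the study's code or data):

```python
def sensitivity(tp: int, fn: int) -> float:
    """True positive rate: share of cancerous slides correctly flagged."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True negative rate: share of benign slides correctly cleared."""
    return tn / (tn + fp)

# Hypothetical counts per 100 slides of each class, chosen to
# match the reported 98% sensitivity and 97% specificity.
print(sensitivity(tp=98, fn=2))   # 0.98
print(specificity(tn=97, fp=3))   # 0.97
```

High sensitivity matters most in screening, since a false negative means a missed cancer, while high specificity limits unnecessary follow-up on benign tissue.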

This is also the first algorithm to extend beyond cancer detection, reporting results for tumor grading, sizing and invasion of the surrounding nerves. These are all clinically important features required as part of the pathology report.

The AI also flagged six slides that were not noted by the expert pathologists.

But Dhir explained that this doesn't necessarily mean that the machine is superior to humans. For example, in the course of evaluating these cases, the pathologist could have simply seen enough evidence of malignancy elsewhere in that patient's samples to recommend treatment. For less experienced pathologists, though, the algorithm could act as a failsafe to catch cases that might otherwise be missed.

"Algorithms like this are especially useful in lesions that are atypical," Dhir said. "A nonspecialized person may not be able to make the correct assessment. That's a major advantage of this kind of system."

While these results are promising, Dhir cautions that new algorithms will have to be trained to detect different types of cancer, since the pathology markers aren't universal across all tissue types. But he saw no reason the approach couldn't be adapted to other diseases, such as breast cancer.
