AI, radiologists each bring capabilities to better diagnose breast cancer

An artificial intelligence tool trained on about a million mammography images can identify breast cancer with 90 percent accuracy when combined with analysis by radiologists.

These results come from a study by researchers at NYU School of Medicine and the NYU Center for Data Science, and they support the use of AI as a tool for radiologists in diagnosing breast cancer.

Specifically, the study examined the ability of a type of machine learning program to add value to the diagnoses of a group of 14 radiologists as they reviewed 720 mammogram images.


Both the AI's and the radiologists' capabilities are crucial to effective diagnosis of the disease, the researchers found.

"Our study found that AI identified cancer-related patterns in the data that radiologists could not, and vice versa. AI detected pixel-level changes in tissue invisible to the human eye, while humans used forms of reasoning not available to AI," says senior study author Krzysztof J. Geras, assistant professor in the Department of Radiology at NYU Langone, also an affiliated faculty member at the NYU Center for Data Science. "The ultimate goal of our work is to augment, not replace, human radiologists."

In the study, published online recently by the journal IEEE Transactions on Medical Imaging, researchers designed statistical techniques that let their program "learn" how to get better at a task without being told exactly how.
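To give a sense of what that kind of learning means in practice, here is a minimal, generic sketch using a toy logistic-regression classifier; it is not drawn from the study's code. The program starts with uninformative parameters and repeatedly adjusts them to reduce its prediction error on labeled examples, with no hand-written diagnostic rules.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in data: each row summarizes one image as a feature vector,
# and each label marks whether cancer was confirmed (1) or not (0).
X = rng.normal(size=(1000, 20))
true_w = rng.normal(size=20)
y = (X @ true_w + rng.normal(scale=0.5, size=1000) > 0).astype(float)

w = np.zeros(20)   # model parameters, initially uninformative
lr = 0.1           # learning rate: how far to adjust per step

for step in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w)))   # predicted probability per example
    grad = X.T @ (p - y) / len(y)        # gradient of the logistic loss
    w -= lr * grad                       # nudge parameters to reduce error

print(f"training accuracy after learning from examples: {((p > 0.5) == y).mean():.2f}")
```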

For the study, the research team analyzed images that had been collected as part of routine clinical care at NYU Langone Health over seven years, sifting through the data and connecting the images with biopsy results. This effort created a large dataset for the AI tool to train on, consisting of 229,426 digital screening mammography exams and 1,001,093 images. Most databases used in studies to date have been limited to 10,000 images or fewer.
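The sketch below shows the general kind of record linkage this involves, joining screening-exam records to later biopsy outcomes so each exam carries a confirmed label. The table and column names are invented for illustration and are not taken from the NYU dataset, whose labeling rules were certainly more involved.

```python
import pandas as pd

# Screening exams with their image files (illustrative records only).
exams = pd.DataFrame({
    "exam_id": [101, 102, 103],
    "patient_id": ["A", "B", "C"],
    "images": [["L_CC.png", "L_MLO.png"], ["R_CC.png"], ["L_CC.png"]],
})

# Biopsy outcomes recorded later for some of those patients.
biopsies = pd.DataFrame({
    "patient_id": ["A", "C"],
    "result": ["malignant", "benign"],
})

# Join exams to outcomes so each exam carries a label; exams with no
# biopsy record are simply labeled 0 in this simplified example.
labeled = exams.merge(biopsies, on="patient_id", how="left")
labeled["label"] = (labeled["result"] == "malignant").astype(int)
print(labeled[["exam_id", "label"]])
```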

The researchers trained the neural network by programming it to analyze images from the database for which cancer diagnoses had already been determined. In addition, the researchers designed the study's AI model to first consider very small patches of the full-resolution image separately to create a heat map, a statistical picture of disease likelihood. The program then considers the entire breast for structural features linked to cancer, paying closer attention to the areas flagged in the pixel-level heat map.
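The following is a rough sketch of how such a two-stage design can be wired together, written under assumed architectural details rather than as a reproduction of the published model: a patch-level network scores small crops to form a heat map, and a whole-image network then classifies the breast with that heat map stacked onto the image as an extra channel.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PatchScorer(nn.Module):
    """Scores fixed-size patches for likelihood of suspicious tissue."""
    def __init__(self, patch=64):
        super().__init__()
        self.patch = patch
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 1),
        )

    def heatmap(self, image):
        # image: (1, 1, H, W); score non-overlapping patches one at a time
        p = self.patch
        _, _, h, w = image.shape
        scores = torch.zeros(h // p, w // p)
        for i in range(h // p):
            for j in range(w // p):
                crop = image[:, :, i*p:(i+1)*p, j*p:(j+1)*p]
                scores[i, j] = torch.sigmoid(self.net(crop)).squeeze()
        # upsample the patch scores back to pixel resolution as a heat map
        return F.interpolate(scores[None, None], size=(h, w), mode="nearest")

class BreastClassifier(nn.Module):
    """Whole-image classifier that sees the image plus its heat map."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(2, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, 1),
        )

    def forward(self, image, heatmap):
        return torch.sigmoid(self.net(torch.cat([image, heatmap], dim=1)))

with torch.no_grad():
    image = torch.rand(1, 1, 256, 256)       # toy stand-in for a mammogram
    heat = PatchScorer().heatmap(image)      # stage 1: pixel-level heat map
    prob = BreastClassifier()(image, heat)   # stage 2: whole-breast prediction
print(f"predicted probability of malignancy: {prob.item():.3f}")
```

The untrained weights here produce an arbitrary score; the point of the sketch is only the data flow, from local patch scores to a whole-image decision that can attend to the flagged regions.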

Rather than having the researchers identify image features for the AI to search for, the tool discovers on its own which image features increase prediction accuracy.
