Disclosures: Lehman reports no relevant financial disclosures. Morris reports no relevant financial disclosures.
September 25, 2020

Should AI interpretation of mammograms become widespread in clinical practice?




Imaging technology continues to advance at a rapid pace, acquiring more complex data from every part of the human body.

However, the human brain alone cannot analyze all the data nor leverage the full spectrum of information embedded in the images. So, it’s essential that we use the power of artificial intelligence (AI) tools to amplify the signals contained within the images and present that information to the radiologist for management decisions.

Constance Lehman, MD, PhD

We are advancing on a pathway to autonomous interpretation of mammograms with careful research and investigation. Studies to date have used simulated image-review settings, and they show that AI models clearly have the potential to perform as well as humans. For instance, we found that a deep learning model that triaged cancer-free mammograms improved specificity (P = .002) with noninferior sensitivity (P < .001) while reducing radiologists’ workloads. However, simulation studies do not always translate to “real-world” clinical practice, so prospective clinical trials are needed to supplement our current knowledge.
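The triage workflow described above can be sketched in a few lines: exams with a model suspicion score below a conservative threshold are routed to an automatic cancer-free read, and everything else stays on the radiologist’s worklist. This is a minimal illustrative sketch, not the published model; the function name, scores, and threshold are all hypothetical.

```python
# Hypothetical sketch of AI triage: scores below a conservative threshold
# are auto-read as cancer-free; the rest go to the radiologist.
# All names and numbers are illustrative, not from the cited study.

def triage(scores, threshold):
    """Split exams into an auto-negative list and a radiologist worklist."""
    auto_negative = [i for i, s in enumerate(scores) if s < threshold]
    to_radiologist = [i for i, s in enumerate(scores) if s >= threshold]
    return auto_negative, to_radiologist

# Toy example: 8 exams with model suspicion scores in [0, 1].
scores = [0.02, 0.10, 0.85, 0.04, 0.40, 0.91, 0.07, 0.55]
auto_negative, to_radiologist = triage(scores, threshold=0.15)

# Fraction of reads removed from the radiologist's list.
workload_reduction = len(auto_negative) / len(scores)
print(to_radiologist)
print(workload_reduction)
```

In practice the threshold would be set very conservatively on validation data so that sensitivity is preserved, which is the noninferiority question the prospective trials must answer.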

The question then becomes, “Could we have an AI tool or model that acts independently from a human radiologist?” Could deep learning become better than — and ultimately replace — radiologists? That’s where many radiologists may start to get nervous.

As a radiologist whose clinical practice is 100% breast imaging, I want to clarify that I am not at all fearful of being replaced. Like my colleagues, I am eager to have more time to engage with my patients and less time performing tasks that can be accomplished by a computer. There is so much I do that a computer, no matter how smart, can’t do. But I also want to be clear that we desperately need AI to help women around the world. AI tools are solving real problems our patients face. We didn’t pursue AI for autonomous interpretation because we expected to put radiologists out of jobs. Rather, we are implementing AI to tackle significant problems with human interpretation.

We have more than 130 radiologists at one hospital in Boston. The country of Liberia has two. Most of the world doesn’t even have access to X-rays, because most of the world doesn’t have access to specialized radiologists. The discovery in 1895 that X-rays could image the human body radically changed the face of health care. But today, only one in three people in the world has access to a specialized radiologist.

Moreover, in regions fortunate enough to have specialized radiologists interpreting mammograms, we found that only 60% of these specialists operated within the recommended guidelines. There is clearly variation in human performance, and some performance can fall below acceptable quality.

We also are dealing with national physician burnout and depression, much of it tied to tedious tasks such as reading large volumes of X-rays and mammograms. These repetitive tasks are better done by a machine, so that physicians have more time to interact with patients, perform biopsies and procedures, and help patients understand their journey from an abnormal test through diagnosis and treatment.

At the end of the day, I predict with AI integration we will need more, not fewer, radiologists, and those radiologists will have higher job satisfaction and infinitely higher impact on the health of their communities. Health care workers are overwhelmed by all their work, and AI can help. AI can take the repetitive, tedious tasks off their plates and allow them to go back to being doctors.


Ali FS, et al. J Glob Radiol. 2015;doi:10.7191/jgr.2015.1020.

Lehman CD, et al. Radiology. 2017;doi:10.1148/radiol.2016161174.

Yala A, et al. Radiology. 2019;doi:10.1148/radiol.2019182908.

Constance Lehman, MD, PhD, is professor of radiology at Harvard Medical School and director of breast imaging and co-director of Avon Comprehensive Breast Evaluation Center at Massachusetts General Hospital. She can be reached at 15 Parkman St., Boston, MA 02114-3117; email: clehman@partners.org.



The concept of AI interpretation of mammograms has a great deal of promise. However, there are several problems that will need to be overcome before it can be put into widespread practice.

Elizabeth Morris, MD, FACR, FSBI, FISMRM

Firstly, it is probably going to be expensive to transition to an AI-based interpretation system.

It also will be important to have a high-quality system. The data sets the algorithms are trained on need to be robust and diverse, and so far we don’t have diverse data sets that can be applied universally to all-comers. We depend on these data sets to build algorithms that can do the job well.

Just trying to curate the data is a monumental process. Creating algorithms isn’t that hard. Making sure the data that go into the algorithms are pristine and high quality is the hard part. I don’t see us being able to transition over very easily.

We also will need large, multicenter consortium studies to show that these algorithms work and can replace human interpretation. From what I understand, there is currently no national leader in this area, so we’re going to need to find a way to work collaboratively.

A major downside is that the way we practice now is not the way we will practice in the future. People need to be adaptable and flexible, and people don’t like change. Moving to an AI-based system would require the way we work to change, and that could be a difficult transition.

Interpretation of normal mammograms probably can be outsourced to a computer. We have these long lists of mammograms, and it would be nice if the important ones could rise to the top of the list, and we could prioritize those. However, we are not ready to implement this in routine clinical practice any time soon.
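The worklist prioritization described here amounts to sorting pending exams by model suspicion so that the most concerning cases rise to the top. A minimal sketch, with entirely hypothetical exam IDs and scores:

```python
# Illustrative worklist prioritization: sort pending mammograms so the
# highest model-suspicion exams appear first. IDs and scores are hypothetical.
worklist = [("exam_a", 0.12), ("exam_b", 0.88), ("exam_c", 0.05), ("exam_d", 0.47)]

# Highest suspicion first.
prioritized = sorted(worklist, key=lambda exam: exam[1], reverse=True)
reading_order = [exam_id for exam_id, _ in prioritized]
print(reading_order)
```

The point of the sketch is the queue discipline, not the model: even a modestly accurate score changes a first-in, first-out list into one where likely cancers are read first.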

Elizabeth Morris, MD, FACR, FSBI, FISMRM, is chief of the breast imaging service and Larry Norton endowed chair member at Memorial Sloan Kettering Cancer Center. She also is professor of radiology at Weill Cornell Medical College. She can be reached at 1275 York Ave., New York, NY 10065; email: morrise@mskcc.org.