Researchers at the University of Washington and the University of California, Los Angeles, have developed an artificial intelligence system that could help pathologists read biopsies more accurately and lead to better detection and diagnosis of breast cancer. Doctors examine images of breast tissue biopsies to diagnose breast cancer, but the differences between cancerous and benign images can be difficult for the human eye to classify. The new algorithm helps interpret them, and it does so nearly as accurately as, or better than, an experienced pathologist, depending on the task. The research team published its results Aug. 9 in the journal JAMA Network Open.
Nadia Brancati, Anna Maria Anniciello, Pushpak Pati, Daniel Riccio, Giosuè Scognamiglio, Guillaume Jaume, Giuseppe De Pietro, Maurizio Di Bonito, Antonio Foncubierta, Gerardo Botti, Maria Gabrani, Florinda Feroce, Maria Frucci
Breast cancer is the most commonly diagnosed cancer and accounts for the highest number of cancer deaths among women. Recent advances in diagnostic practice combined with large-scale screening policies have significantly lowered mortality rates for breast cancer patients. However, manual inspection of tissue slides by pathologists is cumbersome, time-consuming, and subject to significant inter- and intra-observer variability. Recently, the advent of whole-slide scanning systems has enabled the rapid digitization of pathology slides and the development of digital workflows. These advances further make it possible to leverage Artificial Intelligence (AI) to assist, automate, and augment pathological diagnosis. But AI techniques, especially Deep Learning (DL), require a large amount of high-quality annotated data to learn from. Constructing such task-specific datasets poses several challenges, such as data-acquisition constraints, time-consuming and expensive annotation, and the anonymization of private information. In this paper, we introduce the BReAst Carcinoma Subtyping (BRACS) dataset, a large cohort of annotated Hematoxylin & Eosin (H&E)-stained images to facilitate the characterization of breast lesions. BRACS contains 547 Whole-Slide Images (WSIs) and 4,539 Regions of Interest (ROIs) extracted from the WSIs. Each WSI, and its respective ROIs, is annotated by the consensus of three board-certified pathologists into different lesion categories. Specifically, BRACS includes three lesion types, i.e., benign, malignant, and atypical, which are further subtyped into seven categories. It is, to the best of our knowledge, the largest annotated dataset for breast cancer subtyping at both the WSI and ROI level. Further, by including the understudied atypical lesions, BRACS offers a unique opportunity to leverage AI to better understand their characteristics.
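To make the dataset's two-level hierarchy concrete, here is a minimal Python sketch (not the authors' tooling) that indexes ROI images by subtype and maps each subtype to one of the three lesion types named in the abstract. The directory layout, file naming, and the seven subtype abbreviations are assumptions for illustration only; consult the released BRACS documentation for the actual organization.

```python
# Minimal sketch: indexing BRACS-style ROI images into the three lesion groups
# (benign, atypical, malignant) described in the abstract.
# The folder layout and subtype abbreviations below are assumptions.
from pathlib import Path

# Assumed seven subtypes grouped into the three lesion types named in the paper.
SUBTYPE_TO_LESION = {
    "N": "benign",        # normal tissue
    "PB": "benign",       # pathological benign
    "UDH": "benign",      # usual ductal hyperplasia
    "FEA": "atypical",    # flat epithelial atypia
    "ADH": "atypical",    # atypical ductal hyperplasia
    "DCIS": "malignant",  # ductal carcinoma in situ
    "IC": "malignant",    # invasive carcinoma
}

def index_rois(root: str):
    """Walk an assumed <root>/<SUBTYPE>/*.png layout and return
    (path, subtype, lesion_type) records for the ROI images."""
    records = []
    for subtype, lesion in SUBTYPE_TO_LESION.items():
        for img in sorted(Path(root, subtype).glob("*.png")):
            records.append({"path": img, "subtype": subtype, "lesion": lesion})
    return records

if __name__ == "__main__":
    rois = index_rois("BRACS_ROIs")  # hypothetical local folder
    print(f"indexed {len(rois)} ROI images")
```

A two-level index like this lets the same files serve both the coarse three-class task (lesion type) and the finer seven-class subtyping task without duplicating the images.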
UCLA researchers have developed an artificial intelligence system that could help pathologists read biopsies more accurately and better detect and diagnose breast cancer. The new system, described in a study published in JAMA Network Open, helps interpret medical images used to diagnose breast cancer that can be difficult for the human eye to classify, and it does so nearly as accurately as, or better than, experienced pathologists. "It is critical to get a correct diagnosis from the beginning so that we can guide patients to the most effective treatments," said Dr. Joann Elmore, the study's senior author and a professor of medicine at the David Geffen School of Medicine at UCLA. A 2015 study led by Elmore found that pathologists often disagree on the interpretation of breast biopsies, which are performed on millions of women each year. That earlier research revealed that diagnostic errors occurred in about one out of every six women who had ductal carcinoma in situ (a noninvasive type of breast cancer), and that incorrect diagnoses were given in about half of the biopsy cases of breast atypia (abnormal cells that are associated with a higher risk for breast cancer).
"Medical images of breast biopsies contain a great deal of complex data and interpreting them can be very subjective," said Elmore, who is also a researcher at the UCLA Jonsson Comprehensive Cancer Center.