A Deep Learning approach for breast invasive ductal carcinoma detection and lymphoma multi-classification in histological images

Brancati N; De Pietro G; Frucci M; Riccio D
2019

Abstract

Accurately identifying and categorizing cancer structures and sub-types in histological images is an important clinical task that involves a considerable workload and requires a specific subspecialty of pathologists. The digitization of pathology is a current trend that provides large amounts of visual data, allowing faster and more precise diagnoses through the development of automatic image analysis techniques. Recent studies have shown promising results for the automatic analysis of cancer tissue using deep learning strategies that automatically extract and organize the discriminative information from the data. This paper explores deep learning methods for the automatic analysis of Hematoxylin and Eosin (H&E) stained histological images of breast cancer and lymphoma. In particular, a deep learning approach is proposed for two different use cases: the detection of invasive ductal carcinoma in breast histological images and the classification of lymphoma sub-types. Both use cases have been addressed by adopting a Residual Convolutional Neural Network that is part of a Convolutional Autoencoder Network (i.e., FusionNet). Performance has been evaluated on public datasets of digital histological images and compared with that obtained using other deep neural networks (UNet and ResNet). Additionally, comparisons with state-of-the-art approaches based on different deep learning strategies have been considered. The experimental results show an improvement of 5.06% in F-measure for the detection task and an improvement of 1.09% in accuracy for the classification task.
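The abstract describes the classifier as the residual convolutional encoder branch of a FusionNet-style convolutional autoencoder. As a rough illustration only (not the authors' actual configuration: the layer counts, channel widths, patch size, and classification head below are all assumptions), a minimal PyTorch sketch of a residual encoder block followed by a small classification head could look like this:

```python
# Minimal sketch, NOT the authors' network: a residual convolutional block
# (of the kind used in FusionNet-style encoders) followed by a small
# classification head. Channel widths, depth, patch size, and the number of
# classes are assumptions made for illustration only.
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two 3x3 convolutions with batch norm and an identity skip connection."""
    def __init__(self, channels):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.BatchNorm2d(channels),
        )
        self.act = nn.ReLU(inplace=True)

    def forward(self, x):
        # Residual (skip) connection: output = activation(F(x) + x)
        return self.act(self.body(x) + x)

class ResidualEncoderClassifier(nn.Module):
    """Encoder-like stack of residual blocks with downsampling, then a linear
    head. Hypothetical configuration (e.g. 2 classes for IDC detection,
    more classes for lymphoma sub-typing)."""
    def __init__(self, in_channels=3, num_classes=2, widths=(32, 64, 128)):
        super().__init__()
        layers, prev = [], in_channels
        for w in widths:
            layers += [
                nn.Conv2d(prev, w, kernel_size=3, padding=1),
                nn.ReLU(inplace=True),
                ResidualBlock(w),
                nn.MaxPool2d(2),  # downsample, as in an autoencoder's encoder path
            ]
            prev = w
        self.encoder = nn.Sequential(*layers)
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(widths[-1], num_classes)
        )

    def forward(self, x):
        return self.head(self.encoder(x))

if __name__ == "__main__":
    model = ResidualEncoderClassifier(num_classes=2)  # e.g. IDC: positive/negative
    patch = torch.randn(1, 3, 50, 50)                 # H&E image patch, size assumed
    print(model(patch).shape)                         # torch.Size([1, 2])
```

In the paper, the residual encoder is derived from FusionNet rather than defined from scratch, and the two use cases differ mainly in the number of output classes; the sketch above is only meant to make the "residual encoder plus classifier" idea concrete.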
Istituto di Calcolo e Reti ad Alte Prestazioni - ICAR
Keywords: histological images; deep learning; multi-classification; detection
File in this record:
prod_401545-doc_163476.pdf (open access)
Description: A Deep Learning approach for breast invasive ductal carcinoma detection and lymphoma multi-classification in histological images
Type: Published version (PDF)
License: Creative Commons
Size: 14.15 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/389031
Citations
  • Scopus: 66