
Deep transfer learning approaches for bleeding detection in endoscopy images

Caroppo Andrea; Leone Alessandro; Siciliano Pietro
2021

Abstract

Wireless capsule endoscopy is a non-invasive, wireless imaging tool that has developed rapidly over the last several years. One of the main limitations of this technology is that it produces a huge number of images, whose analysis by a physician is an extremely time-consuming process. This problem has been addressed through the development of Computer-aided Diagnosis systems, thanks to which the automatic inspection and analysis of images acquired by the capsule have clearly improved. Recently, a major advance in the classification of endoscopic images has been achieved with the emergence of deep learning methods. The proposed expert system employs three pre-trained deep convolutional neural networks for feature extraction. To construct efficient feature sets, the features from the VGG19, InceptionV3 and ResNet50 models are selected and fused using the minimum Redundancy Maximum Relevance (mRMR) method and different fusion rules. Finally, supervised machine learning algorithms classify the images, on the basis of the extracted features, into two categories: bleeding and non-bleeding. For performance evaluation, a series of experiments was performed on two standard benchmark datasets. The proposed architecture outperforms the single deep learning architectures, with average accuracies in detecting bleeding regions of 97.65% and 95.70% on the two well-known state-of-the-art datasets across the three fusion rules; the best combination in terms of accuracy and training time is obtained with mean value pooling as the fusion rule and a Support Vector Machine as the classifier.
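
To make the pipeline described in the abstract concrete, the following is a minimal sketch of how such a system could be assembled with Keras pre-trained backbones and scikit-learn. The choice of feature layer (global average pooling), the number of selected features k, the relevance-only stand-in for mRMR, and the element-wise mean fusion are assumptions made for illustration, not the authors' exact implementation.

# Minimal illustrative sketch (not the authors' code): three ImageNet-pretrained
# backbones as fixed feature extractors, a relevance-only stand-in for mRMR,
# mean value pooling as fusion rule, and an SVM classifier.
import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.svm import SVC
from tensorflow.keras.applications import VGG19, InceptionV3, ResNet50
from tensorflow.keras.applications.vgg19 import preprocess_input as vgg_pre
from tensorflow.keras.applications.inception_v3 import preprocess_input as inc_pre
from tensorflow.keras.applications.resnet50 import preprocess_input as res_pre

# Pre-trained CNNs used as feature extractors (global average pooling output).
extractors = [
    (VGG19(weights="imagenet", include_top=False, pooling="avg"), vgg_pre),
    (InceptionV3(weights="imagenet", include_top=False, pooling="avg"), inc_pre),
    (ResNet50(weights="imagenet", include_top=False, pooling="avg"), res_pre),
]

def extract_features(images, model, preprocess):
    # images: float32 array of shape (N, 224, 224, 3), already resized.
    return model.predict(preprocess(images.copy()), verbose=0)

def select_top_k(features, labels, k=256):
    # Crude stand-in for mRMR: keep the k features with the highest mutual
    # information with the class labels (relevance only, no redundancy term).
    scores = mutual_info_classif(features, labels, random_state=0)
    return np.argsort(scores)[::-1][:k]

def fuse_mean(blocks):
    # Mean value pooling: element-wise average of equally sized feature blocks.
    return np.mean(np.stack(blocks, axis=0), axis=0)

def build_fused_features(images, labels, k=256):
    blocks, indices = [], []
    for model, preprocess in extractors:
        feats = extract_features(images, model, preprocess)
        idx = select_top_k(feats, labels, k)
        indices.append(idx)
        blocks.append(feats[:, idx])
    return fuse_mean(blocks), indices  # reuse 'indices' when transforming test images

# Hypothetical usage: X of shape (N, 224, 224, 3), y of shape (N,) with 1 = bleeding, 0 = non-bleeding
# X_fused, indices = build_fused_features(X.astype("float32"), y)
# clf = SVC(kernel="rbf").fit(X_fused, y)

A full mRMR implementation would also penalize redundancy among already selected features; the relevance-only ranking above is only a placeholder for that step.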
Istituto per la Microelettronica e Microsistemi - IMM
Bleeding detection
Computer-aided
Convolutional neural network
Deep learning
Transfer learning
Wireless capsule endoscopy
Files in this record:
1-s2.0-S0895611120301476-main.pdf (authorized users only)
Type: Publisher's version (PDF)
License: NOT PUBLIC - Private/restricted access
Size: 9.64 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/425166
Citations
  • Scopus 54