
Deep networks for behavioral variant frontotemporal dementia identification from multiple acquisition sources

Di Benedetto M.; Carrara F.; Nigro S.; Falchi F.; Gigli G.; Amato G.
2022

Abstract

Behavioral variant frontotemporal dementia (bvFTD) is a neurodegenerative syndrome whose clinical diagnosis remains challenging, especially in the early stage of the disease. Currently, the presence of frontal and anterior temporal lobe atrophy on magnetic resonance imaging (MRI) is part of the diagnostic criteria for bvFTD. However, MRI data processing is usually dependent on the acquisition device and mostly requires human-assisted feature engineering. Building on the impressive progress of deep architectures, in this study we report on bvFTD identification using several classes of artificial neural networks, and present the results we achieved on classification accuracy and acquisition-device independence using extensive hyperparameter search. In particular, we demonstrate the stability and generalization of different deep networks based on the attention mechanism, where data intra-mixing confers on models the ability to identify the disorder even in inter-device settings, i.e., on MRI data produced by different acquisition devices and without model fine-tuning, as shown by very encouraging performance evaluations that reach and exceed 90% on the AuROC and balanced accuracy metrics.
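The abstract reports results in terms of AuROC and balanced accuracy. As a minimal sketch (not the authors' code), the snippet below shows how these two metrics are typically computed for a binary bvFTD-vs-control classifier with scikit-learn; the labels and scores are illustrative toy data.

```python
# Toy evaluation of a binary classifier using the two metrics named in the
# abstract: AuROC (threshold-free, ranking-based) and balanced accuracy
# (average of per-class recall, robust to class imbalance).
from sklearn.metrics import roc_auc_score, balanced_accuracy_score

# 1 = bvFTD, 0 = healthy control (illustrative labels and model scores)
y_true = [1, 1, 1, 0, 0, 0, 0, 1]
y_score = [0.92, 0.81, 0.45, 0.12, 0.35, 0.60, 0.08, 0.77]

# AuROC: probability that a random positive case is scored above
# a random negative case.
auroc = roc_auc_score(y_true, y_score)

# Balanced accuracy: threshold the scores, then average sensitivity
# and specificity, so an imbalanced cohort cannot inflate the result.
y_pred = [1 if s >= 0.5 else 0 for s in y_score]
bacc = balanced_accuracy_score(y_true, y_pred)

print(f"AuROC: {auroc:.4f}, balanced accuracy: {bacc:.4f}")
```

Balanced accuracy is a natural choice here because clinical datasets rarely contain equal numbers of patients and controls, and plain accuracy would reward always predicting the majority class.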
Istituto di Nanotecnologia - NANOTEC
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
3D convolution
Behavioral variant frontotemporal dementia
bvFTD
Classification
Deep learning
Logistic regression
Machine learning
Medical imaging
Multi-layer perceptron
Neural networks
Transformer
Files in this product:
File  Size  Format
prod_471832-doc_191788.pdf

Open Access from 17/08/2023

Description: Deep networks for behavioral variant frontotemporal dementia identification from multiple acquisition sources
Type: Editorial Version (PDF)
Size: 1 MB
Format: Adobe PDF
prod_471832-doc_191789.pdf

Open Access from 17/08/2023

Description: Postprint - Deep networks for behavioral variant frontotemporal dementia identification from multiple acquisition sources
Type: Editorial Version (PDF)
Size: 1.3 MB
Format: Adobe PDF
prod_471832-doc_191913.pdf

Open Access from 17/08/2023

Description: Preprint - Deep networks for behavioral variant frontotemporal dementia identification from multiple acquisition sources
Type: Editorial Version (PDF)
Size: 1.42 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/417689
Citations
  • Scopus 8