Integrating Brain Networks and Multi-Modal Data for Early Detection of Alzheimer's Disease
Comito C., Pizzuti C., Sammarra M., Socievole A. (Members of the Collaboration Group)
2024
Abstract
Early diagnosis of Alzheimer's disease (AD) is crucial for providing timely treatment and care to patients. However, current diagnostic methods rely on clinical symptoms and biomarkers, which are often unreliable and invasive. Brain networks model the brain's structure and function in AD and other brain diseases. To fully capture their complexity, we need multi-modal models that combine different types of data, such as structural and functional connectivity together with clinical and genetic information, giving a holistic view of the disease's many aspects. In this paper, we argue that brain networks and multi-modal data fusion can improve early diagnosis of AD by capturing the complex and heterogeneous nature of the disease. Using brain network modeling and multi-modal data fusion, we envisage a novel framework for detecting AD and its prodromal stages. The framework can simultaneously capture network properties from multi-modal and longitudinal datasets, which provide complementary information.
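As a rough illustration of the multi-modal fusion idea described in the abstract, the sketch below builds graphs from structural and functional connectivity matrices, extracts a few global network properties, and concatenates them with clinical/genetic covariates before training a classifier. The matrix sizes, feature choices, threshold, and classifier are illustrative assumptions, not the authors' actual framework.

```python
# Minimal sketch of multi-modal brain-network fusion (illustrative assumptions,
# not the authors' implementation). Each subject is assumed to have a structural
# and a functional connectivity matrix plus a small clinical/genetic vector.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

def network_features(conn_matrix, threshold=0.2):
    """Turn a weighted connectivity matrix into a few global graph metrics."""
    adj = (np.abs(conn_matrix) >= threshold).astype(int)
    adj = np.maximum(adj, adj.T)          # symmetrize to get an undirected graph
    np.fill_diagonal(adj, 0)
    g = nx.from_numpy_array(adj)
    return np.array([
        nx.density(g),                            # overall connectedness
        nx.average_clustering(g),                 # local segregation
        np.mean([d for _, d in g.degree()]),      # mean node degree
    ])

def fuse_subject(structural, functional, clinical):
    """Early fusion: concatenate network properties from both modalities
    with clinical/genetic covariates into one feature vector."""
    return np.concatenate([network_features(structural),
                           network_features(functional),
                           clinical])

# Toy example: 20 synthetic subjects with 90-region connectomes and 5 covariates.
rng = np.random.default_rng(0)
X = np.stack([fuse_subject(rng.random((90, 90)), rng.random((90, 90)),
                           rng.random(5)) for _ in range(20)])
y = rng.integers(0, 2, size=20)   # synthetic labels: 0 = control, 1 = prodromal AD
clf = LogisticRegression(max_iter=1000).fit(X, y)
print("Training accuracy on toy data:", clf.score(X, y))
```

Early (feature-level) fusion is only one possible design; the same network features could instead feed modality-specific models whose outputs are combined later.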
File | Size | Format
---|---|---
paper42DEBD.pdf (open access; published version, Creative Commons license) | 2.51 MB | Adobe PDF