
Objective Assessment of the Finger Tapping Task in Parkinson's Disease and Control Subjects using Azure Kinect and Machine Learning

C. Ferraris
2023

Abstract

Parkinson's disease (PD) is characterised by a progressive worsening of motor function. In particular, limited hand dexterity strongly correlates with PD diagnosis and staging. Objective detection of alterations in hand motor skills would enable, for example, prompt identification of the disease and its symptoms, and the definition of adequate medical treatments. Among the clinical assessment tasks used to diagnose and stage PD from hand impairment, the Finger Tapping (FT) task is a well-established tool. This preliminary study exploits a single RGB-Depth camera (Azure Kinect) and Google MediaPipe Hands to track and assess the Finger Tapping task. The system includes several stages. First, hand movements are tracked from FT video recordings and used to extract a series of clinically relevant features. Then, the most significant features are selected and used to train and test several Machine Learning (ML) models to distinguish subjects with PD from healthy controls. To test the proposed system, 35 PD subjects and 60 healthy volunteers were recruited. The best-performing ML model achieved 94.4% accuracy and a 98.4% F1 score in a Leave-One-Subject-Out validation. Moreover, distinct clusters with respect to spatial and temporal variability in the FT trials were identified among PD subjects. This result suggests that the proposed system could be exploited to perform an even finer identification of subgroups within the PD population.
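The pipeline summarised above (fingertip tracking, extraction of clinically relevant FT features, subject-wise classification) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature names and definitions are assumptions, and a synthetic oscillating trajectory stands in for real MediaPipe Hands landmark output.

```python
import numpy as np

def ft_features(thumb_tip, index_tip, fps=30.0):
    """Illustrative Finger Tapping (FT) features from 3-D fingertip
    trajectories of shape (n_frames, 3). Feature names and definitions
    are assumptions for this sketch, not the paper's exact feature set."""
    # Tap aperture: Euclidean thumb-index distance per frame.
    aperture = np.linalg.norm(thumb_tip - index_tip, axis=1)
    # A frame counts as "closed" when the aperture drops below its mean.
    closed = aperture < aperture.mean()
    # Each open-to-closed transition is counted as one tap.
    n_taps = int(np.sum(np.diff(closed.astype(int)) == 1) + closed[0])
    duration_s = len(aperture) / fps
    return {
        "amplitude_range": float(aperture.max() - aperture.min()),
        "tap_rate_hz": n_taps / duration_s,
        # Coefficient of variation: a crude proxy for spatial variability.
        "amplitude_cv": float(aperture.std() / aperture.mean()),
    }

# Synthetic 3-second trial at 30 fps: the index fingertip oscillates at
# 2 Hz along one axis while the thumb stays fixed (a stand-in for real
# tracked landmarks).
t = np.arange(90) / 30.0
thumb = np.zeros((90, 3))
index = np.zeros((90, 3))
index[:, 2] = 0.05 * (1.0 - np.cos(2.0 * np.pi * 2.0 * t))  # metres

features = ft_features(thumb, index, fps=30.0)
print(features)
```

In practice the two trajectories would come from the MediaPipe Hands landmarks for the thumb tip (index 4) and index fingertip (index 8), and feature vectors of this kind would feed the ML models evaluated with Leave-One-Subject-Out validation.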
Istituto di Elettronica e di Ingegneria dell'Informazione e delle Telecomunicazioni - IEIIT
979-8-3503-1224-9
Parkinson's Disease
Finger Tapping
Pervasive Health
Telemedicine
Machine Learning
Azure Kinect
Mediapipe
Files in this product:
No files are associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/463615