Hand tracking for clinical applications: Validation of the Google MediaPipe Hand (GMH) and the depth-enhanced GMH-D frameworks
Gianluca Amprimo; Giuseppe Pettiti; Claudia Ferraris
2024
Abstract
Accurate 3D tracking of hand and finger movements poses significant challenges in computer vision. The potential applications span multiple domains, including human–computer interaction, virtual reality, industry, and medicine. While gesture recognition has achieved remarkable accuracy, quantifying fine movements remains a hurdle, particularly in clinical applications, where the assessment of hand dysfunctions and rehabilitation training outcomes necessitates precise measurements. Several novel and lightweight deep-learning-based frameworks have emerged to address this issue; however, their ability to measure finger movements accurately and reliably requires validation against well-established gold-standard systems. In this paper, the aim is to validate the hand-tracking framework implemented by Google MediaPipe Hand (GMH) and an enhanced version, GMH-D, which exploits the depth estimation of an RGB-Depth camera to achieve more accurate tracking of 3D movements. Three dynamic exercises commonly administered by clinicians to assess hand dysfunctions are considered, namely hand opening–closing, single-finger tapping, and multiple-finger tapping. Results demonstrate high temporal and spectral consistency of both frameworks with the gold standard. However, the enhanced GMH-D framework exhibits superior accuracy in spatial measurements compared to the baseline GMH, for both slow and fast movements. Overall, our study contributes to the advancement of hand-tracking technology and to establishing a validation procedure as good practice for proving the efficacy of deep-learning-based hand tracking. Moreover, it shows that GMH-D is a reliable framework for assessing 3D hand movements in clinical applications.
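The abstract summarises the GMH-D idea: keep GMH's landmark detection but replace its wrist-relative depth with the metric depth measured by an RGB-D camera. The sketch below illustrates this with the public MediaPipe Hands Python API; the depth-sampling step and the `depth_m` input (a depth frame in metres, assumed pixel-aligned with the colour frame) are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of the depth-enhancement idea behind GMH-D, under the
# assumptions stated above: GMH supplies 2D landmarks, an RGB-D camera
# supplies metric z.
import cv2
import mediapipe as mp
import numpy as np

mp_hands = mp.solutions.hands


def gmhd_landmarks(frame_bgr, depth_m, hands):
    """Return the 21 hand landmarks as (u_px, v_px, z_m), or None if no hand.

    frame_bgr: colour frame (H x W x 3, BGR, as delivered by OpenCV).
    depth_m:   depth frame in metres, assumed pixel-aligned with frame_bgr.
    hands:     an initialised mp_hands.Hands instance.
    """
    h, w = frame_bgr.shape[:2]
    results = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not results.multi_hand_landmarks:
        return None
    out = []
    for lm in results.multi_hand_landmarks[0].landmark:
        # GMH returns normalised x, y and a z that is only relative to the
        # wrist; project x, y to pixels and take metric z from the depth map.
        u = int(np.clip(lm.x * w, 0, w - 1))
        v = int(np.clip(lm.y * h, 0, h - 1))
        out.append((u, v, float(depth_m[v, u])))
    return np.array(out)


if __name__ == "__main__":
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        # frame_bgr and depth_m would come from an RGB-D camera SDK
        # (e.g. pyrealsense2 or the Azure Kinect SDK); dummy frames shown here.
        frame_bgr = np.zeros((480, 640, 3), dtype=np.uint8)
        depth_m = np.ones((480, 640), dtype=np.float32)
        print(gmhd_landmarks(frame_bgr, depth_m, hands))
```

A real implementation would also need to handle invalid depth pixels (zeros from occlusions or sensor noise), for example by sampling a small neighbourhood around each landmark instead of a single pixel.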
| File | Description | Type | Licence | Size | Format |
|---|---|---|---|---|---|
| 1-s2.0-S1746809424005664-main_finale.pdf (open access) | Hand tracking for clinical applications: Validation of the Google MediaPipe Hand (GMH) and the depth-enhanced GMH-D frameworks | Editorial Version (PDF) | Creative Commons | 3.5 MB | Adobe PDF |
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.