View-independent Hand Posture Recognition from Single Depth Images using PCA and Flusser moments

L. Gallo
2012

Abstract

This paper presents a view-independent hand posture recognition system that recognizes a limited set of predefined postures from single, low-resolution depth images, in real time, on standard hardware, and in unconstrained environments. The system consists of three modules: hand segmentation and pose compensation, feature extraction and processing, and hand posture recognition. We use principal component analysis to estimate the hand orientation in space, and Flusser moment invariants as image features for visual recognition. The implementation details, classification accuracy, and performance measures of the recognition system are reported and discussed. The experimental results show that the system can recognize the postures of two hands at full frame rate, with an average total latency below 5 ms.
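The abstract names two concrete techniques: principal component analysis to estimate the hand's orientation from the segmented depth data, and Flusser moment invariants as rotation-invariant image features. The following Python sketch is not the paper's implementation; it is a minimal illustration of both steps under stated assumptions (a segmented hand available as an Nx3 point cloud and as a binary silhouette image), and every function name in it is hypothetical.

```python
# Minimal sketch of the two techniques named in the abstract, NOT the
# paper's code: PCA for hand orientation and Flusser moment invariants
# as rotation-invariant features. All names here are illustrative.
import numpy as np

def estimate_orientation(points):
    """PCA over an Nx3 hand point cloud; returns the principal axes as
    columns of a 3x3 matrix, sorted by decreasing variance. The first
    axis approximates the hand's main direction in space."""
    centered = points - points.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    order = np.argsort(eigvals)[::-1]  # largest variance first
    return eigvecs[:, order]

def complex_moment(img, p, q):
    """Central complex moment c_pq of a binary/grayscale image; the
    shift to the centroid makes it translation invariant."""
    ys, xs = np.nonzero(img)
    w = img[ys, xs].astype(float)
    m00 = w.sum()
    cx, cy = (xs * w).sum() / m00, (ys * w).sum() / m00
    z = (xs - cx) + 1j * (ys - cy)
    return (w * z**p * np.conj(z)**q).sum()

def flusser_invariants(img):
    """Six rotation invariants built from complex moments, following
    Flusser's construction; the scale normalization by m00 is a common
    add-on and an assumption here, not taken from the paper."""
    keys = [(1, 1), (2, 1), (1, 2), (2, 0), (3, 0)]
    m00 = complex_moment(img, 0, 0).real
    # Normalize c_pq by m00^((p+q)/2 + 1) for scale invariance.
    n = {k: complex_moment(img, *k) / m00 ** ((k[0] + k[1]) / 2 + 1)
         for k in keys}
    return np.array([
        n[1, 1].real,                   # psi1 = c11
        (n[2, 1] * n[1, 2]).real,       # psi2 = c21 c12
        (n[2, 0] * n[1, 2] ** 2).real,  # psi3 = Re(c20 c12^2)
        (n[2, 0] * n[1, 2] ** 2).imag,  # psi4 = Im(c20 c12^2)
        (n[3, 0] * n[1, 2] ** 3).real,  # psi5 = Re(c30 c12^3)
        (n[3, 0] * n[1, 2] ** 3).imag,  # psi6 = Im(c30 c12^3)
    ])
```

Because these invariants are insensitive to in-plane rotation, translation, and (after the m00 normalization) scale, a classifier trained on them can plausibly recognize the same posture across viewpoints once a PCA-based step has compensated the hand's orientation, which matches the pipeline the abstract describes.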
Istituto di Calcolo e Reti ad Alte Prestazioni - ICAR
ISBN: 978-1-4673-5152-2

Keywords: feature extraction; gesture recognition; image classification; pose estimation; principal component analysis

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/175925
Citations: Scopus 9