A Kinect NUI for 3D Medical Visualization

Luigi Gallo; Giuseppe De Pietro
2012

Abstract

This proposal introduces a natural user interface (NUI) that allows users to rotate (with 3 degrees of freedom), point to, and crop 3D reconstructions of anatomical parts using a Kinect as the only input device. The NUI is built upon a view-independent hand pose recognition module that recognizes a limited set of predefined postures from single, low-resolution depth images. We use principal component analysis to estimate the hand orientation in space, Flusser moment invariants as image features for visual recognition, and a multiclass Support Vector Machine to classify the extracted features into a limited set of predefined static hand postures.
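The abstract outlines a three-stage pipeline: PCA on the segmented hand points for orientation, Flusser moment invariants as rotation-invariant shape features, and a multiclass SVM for posture classification. The sketch below is a minimal, hypothetical illustration of that pipeline in Python (using NumPy and scikit-learn); the function names, the 64x64 silhouette size, the four posture classes, and the placeholder training data are assumptions for illustration only, not the authors' implementation.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC


def hand_orientation(points_3d):
    """Estimate hand orientation: the first principal component of the
    segmented hand's 3D points gives the dominant axis of the hand."""
    pca = PCA(n_components=3)
    pca.fit(points_3d)
    return pca.components_[0]          # unit vector along the main hand axis


def complex_moment(silhouette, p, q):
    """Scale-normalised central complex moment c_pq of a binary silhouette."""
    ys, xs = np.nonzero(silhouette)
    x0, y0 = xs.mean(), ys.mean()
    z = (xs - x0) + 1j * (ys - y0)
    c_pq = np.sum(z ** p * np.conj(z) ** q)
    return c_pq / (len(xs) ** ((p + q) / 2 + 1))


def flusser_features(silhouette):
    """Rotation-invariant Flusser descriptors (up to order 3) built from
    complex moments of the hand silhouette."""
    c11 = complex_moment(silhouette, 1, 1)
    c20 = complex_moment(silhouette, 2, 0)
    c21 = complex_moment(silhouette, 2, 1)
    c12 = complex_moment(silhouette, 1, 2)
    c30 = complex_moment(silhouette, 3, 0)
    return np.array([
        c11.real,                      # psi_1
        (c21 * c12).real,              # psi_2
        (c20 * c12 ** 2).real,         # psi_3
        (c20 * c12 ** 2).imag,         # psi_4
        (c30 * c12 ** 3).real,         # psi_5
        (c30 * c12 ** 3).imag,         # psi_6
    ])


# --- illustrative training and classification with placeholder data ---
rng = np.random.default_rng(0)
train_silhouettes = rng.random((40, 64, 64)) > 0.5   # stand-ins for segmented hand masks
train_labels = rng.integers(0, 4, size=40)           # e.g. 4 predefined static postures
X = np.array([flusser_features(s) for s in train_silhouettes])

classifier = SVC(kernel="rbf")                       # multiclass (one-vs-one) SVM
classifier.fit(X, train_labels)

test_silhouette = rng.random((64, 64)) > 0.5
posture = classifier.predict(flusser_features(test_silhouette).reshape(1, -1))
print("predicted posture id:", posture[0])
```

In practice the silhouettes would come from depth-image segmentation of the hand, and the 3D points fed to PCA from the corresponding Kinect point cloud; the random arrays above only keep the example self-contained.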
Istituto di Calcolo e Reti ad Alte Prestazioni - ICAR

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/215046