A Kinect NUI for 3D Medical Visualization
Luigi Gallo, Giuseppe De Pietro
2012
Abstract
This proposal introduces a natural user interface (NUI) that allows users to rotate (with 3 degrees of freedom), point to, and crop 3D reconstructions of anatomical parts using a Kinect as the only input device. The NUI is built upon a view-independent hand pose recognition module that recognizes a limited set of pre-defined postures from single, low-resolution depth images. We use principal component analysis to estimate the hand orientation in space, Flusser moment invariants as image features for visual recognition, and a multiclass Support Vector Machine to classify the extracted features into a limited set of predefined static hand postures.
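As a rough illustration of the orientation-estimation step mentioned in the abstract, the dominant axis of the segmented hand's 3D point cloud can be recovered with principal component analysis. This is a minimal NumPy sketch, not the authors' implementation; the function name, the synthetic point cloud, and all variable names are illustrative assumptions:

```python
import numpy as np

def estimate_hand_orientation(points):
    """Estimate the dominant axis of a 3D hand point cloud via PCA.

    points: (N, 3) array of 3D points back-projected from the depth image.
    Returns a unit vector along the principal axis (the hand's main direction).
    """
    centered = points - points.mean(axis=0)        # remove the centroid
    cov = np.cov(centered, rowvar=False)           # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues in ascending order
    return eigvecs[:, -1]                          # axis of largest variance

# toy point cloud, elongated along the x axis (stands in for a hand blob)
rng = np.random.default_rng(0)
pts = rng.normal(size=(500, 3)) * np.array([10.0, 1.0, 1.0])
axis = estimate_hand_orientation(pts)
```

In a full pipeline along the lines the abstract describes, this axis would be used to normalize the hand view before computing Flusser moment invariants on the depth silhouette and feeding them to a multiclass SVM.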


