Development of a natural interface to control a mobile platform
G. Cicirelli; T. D'Orazio
2014
Abstract
In this paper we present a gesture recognition system for natural human-robot interaction. Service robots are expected to be used in many households in the near future, provided that natural interfaces are developed, especially for interaction with elderly or impaired people. The wide availability of inexpensive depth sensors has provided new opportunities for real-time gesture recognition systems that avoid the severe limitations posed by complex backgrounds and lighting conditions in images acquired by RGB sensors. In this paper the Kinect depth camera and the OpenNI framework were used to obtain real-time tracking of the human skeleton. Several gestures were performed by different persons; then robust and significant features were extracted and fed to a set of neural network classifiers trained to recognize the different gestures. The recognized gestures were associated with different robot commands and sent via a socket to the robot controller. The problems concerning the real-time implementation of the gesture recognition system were considered, and real-time tests with a mobile robot confirmed the robustness of the method for the realization of human-robot interfaces.
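As a rough illustration of the final step described in the abstract (recognized gestures mapped to robot commands and delivered to the controller over a socket), the minimal Python sketch below shows one possible client. The gesture labels, command strings, host address, port, and message framing are placeholders for illustration only; they are not taken from the paper.

```python
import socket

# Hypothetical mapping from recognized gesture labels to robot commands.
# The actual gesture set and command protocol used in the paper are not
# specified here; these names are illustrative placeholders.
GESTURE_TO_COMMAND = {
    "wave_right": "TURN_RIGHT",
    "wave_left": "TURN_LEFT",
    "raise_both_arms": "STOP",
    "point_forward": "GO_FORWARD",
}


def send_command(gesture, host="192.168.0.10", port=9000):
    """Look up the command for a recognized gesture and send it to the
    robot controller over a TCP socket (address and framing are assumed)."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is None:
        return  # gesture not mapped to any command: do nothing
    with socket.create_connection((host, port), timeout=2.0) as sock:
        # Send the command as a newline-terminated ASCII string.
        sock.sendall((command + "\n").encode("ascii"))


if __name__ == "__main__":
    # Example: forward the result of the gesture classifier to the robot.
    send_command("point_forward")
```

In such a scheme the classifier stage only needs to emit a label; decoupling it from the robot controller through a socket lets the recognition pipeline and the mobile platform run on different machines.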


