Development of a natural interface to control a mobile platform

G Cicirelli;T D'Orazio
2014

Abstract

In this paper we present a gesture recognition system for natural human-robot interaction. Service robots are expected to be used in many households in the near future, provided that natural interfaces are developed, especially for interaction with elderly or impaired people. The wide availability of inexpensive depth sensors has created new opportunities for real-time gesture recognition systems that avoid the severe limitations posed by complex backgrounds and lighting conditions in images acquired by RGB sensors. In this paper the Kinect depth camera and the OpenNI framework were used to obtain real-time tracking of the human skeleton. Several gestures were performed by different people; then, robust and significant features were extracted and fed to a set of neural network classifiers trained to recognize the different gestures. The recognized gestures were associated with different robot commands and sent via a socket to the robot controller. The problems concerning the real-time implementation of the gesture recognition system were addressed, and real-time tests with a mobile robot confirmed the robustness of the method for the realization of human-robot interfaces.
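The abstract notes that recognized gestures are mapped to robot commands and delivered to the robot controller through a socket. The sketch below illustrates only that final dispatch step; the gesture labels, command strings, host, and port are hypothetical placeholders, not taken from the paper.

```python
import socket

# Hypothetical mapping from recognized gesture labels to robot commands;
# the actual gesture vocabulary and command protocol of the paper are not specified here.
GESTURE_TO_COMMAND = {
    "raise_right_arm": "TURN_RIGHT",
    "raise_left_arm": "TURN_LEFT",
    "both_arms_up": "STOP",
    "push_forward": "GO_FORWARD",
}

def send_gesture_command(gesture, host="127.0.0.1", port=5005):
    """Translate a recognized gesture into a command and send it to the robot controller."""
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is None:
        return None  # gesture not recognized or not mapped: issue no command
    # Assumes the robot controller listens on a TCP socket and accepts
    # newline-terminated ASCII command strings.
    with socket.create_connection((host, port), timeout=1.0) as sock:
        sock.sendall((command + "\n").encode("ascii"))
    return command

if __name__ == "__main__":
    # Example: the classifiers have just labelled the current frame sequence as "push_forward".
    send_gesture_command("push_forward")
```

A plain text-over-TCP link keeps the recognition module decoupled from the robot controller, in line with the socket-based hand-off described in the abstract.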
Istituto di Studi sui Sistemi Intelligenti per l'Automazione - ISSIA - Sede Bari
Istituto di Sistemi e Tecnologie Industriali Intelligenti per il Manifatturiero Avanzato - STIIMA (ex ITIA)
gesture recognition
human robot interface


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/261248