An Event-Based Conversational System for the Nao Robot
Cosi, Piero; Sommavilla, Giacomo; Tesser, Fabio
2011
Abstract
Conversational systems play an important role in scenarios without a keyboard, e.g., talking to a robot. Communication in human-robot interaction (HRI) ultimately involves a combination of verbal and non-verbal inputs and outputs. HRI systems must process verbal and non-verbal observations and execute verbal and non-verbal actions in parallel in order to interpret and produce synchronized behaviours. Developing such systems involves integrating potentially many components and ensuring complex interaction and synchronization among them. Most work in spoken dialogue system development uses pipeline architectures. Some exceptions are [1, 17], which execute system components in parallel (weakly-coupled or tightly-coupled architectures). The latter are more promising for building adaptive systems, which is one of the goals of contemporary research systems. In this paper we present an event-based approach for integrating a conversational HRI system. This approach has been instantiated using the Urbi middleware [6] on a Nao robot, used as a testbed for investigating child-robot interaction in the ALIZ-E project. We focus on the implementation of two scenarios: an imitation game of arm movements and a quiz game.
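The contrast the abstract draws between pipeline architectures and event-based integration can be illustrated with a minimal sketch. The class and component names below are hypothetical and do not correspond to the paper's Urbi-based implementation; the sketch only shows the general idea that, on an event bus, several components (e.g., a dialogue manager and a gesture controller) can react to the same observation concurrently, rather than one after another as in a pipeline.

```python
import queue
import threading
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus: components react to named events."""
    def __init__(self):
        self._handlers = defaultdict(list)
        self._lock = threading.Lock()

    def subscribe(self, event, handler):
        with self._lock:
            self._handlers[event].append(handler)

    def publish(self, event, payload):
        with self._lock:
            handlers = list(self._handlers[event])
        # Each handler runs in its own thread, so verbal and non-verbal
        # components can react to the same observation in parallel.
        for h in handlers:
            threading.Thread(target=h, args=(payload,)).start()

def run_demo():
    bus = EventBus()
    results = queue.Queue()
    # Two hypothetical components subscribe to the same event: a dialogue
    # manager (verbal response) and a gesture controller (non-verbal response).
    bus.subscribe("utterance", lambda text: results.put("dialogue: " + text))
    bus.subscribe("utterance", lambda text: results.put("gesture: " + text))
    # A recognized answer in the quiz-game scenario triggers both at once.
    bus.publish("utterance", "red")
    return sorted(results.get(timeout=1) for _ in range(2))

print(run_demo())
```

In a pipeline, the gesture controller would only run after the dialogue manager finished; here both are notified as soon as the event fires, which is the property the abstract highlights as useful for synchronized verbal and non-verbal behaviour.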