
Towards an assistive social robot interacting with human patient to establish a mutual affective support

Ignazio Infantino
2019

Abstract

The paper describes an architecture for an assistive robot acting in a domestic environment, aiming to establish a robust affective and emotional relationship with the patient during rehabilitation at home. The robot aims to support the patient in the therapy, to monitor the patient's health state, and to give affective support that increases the human's motivation over a period of two or three weeks. The affective relationship arises from natural-language verbal interaction, from the acquisition of data and vital parameters by environmental and wearable sensors, and from robust human perception using the robot's perceptive capabilities. An important issue is that the robot's activity should be understandable by the human: the proposed architecture enables the robot to express its (emotional) state, its plans, and its interpretation of perceptual data. The implicit goal is to obtain a human emotional involvement that causes the patient to "take care of" the artificial assistant, trying to satisfy the robot's expectations and motivation. The paper describes the proposed layered architecture (including the modules responsible for event and context detection, planning, complex verbal interaction, and artificial motivation, together with use cases, design patterns, and control policies), discusses modeling use cases, and reports preliminary experimentation performed by simulation.
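As a rough illustration of the layered organization sketched in the abstract (perceptual events feeding context detection, artificial motivation, planning, and verbal expression of the robot's state), the following minimal Python sketch shows how such modules might be wired together. All class names, event labels, and update rules are illustrative assumptions, not the paper's actual components or API.

```python
# Hypothetical sketch of the layered architecture described in the abstract:
# perception events flow into context detection, motivation, and planning,
# while the robot verbally expresses its internal (emotional) state and plans.
# All names and thresholds below are assumptions for illustration only.

from dataclasses import dataclass, field
from typing import List


@dataclass
class Event:
    """A perceptual or sensor event (e.g., a vital parameter or detected activity)."""
    source: str          # e.g., "wearable", "environment", "camera"
    label: str           # e.g., "heart_rate_high", "exercise_completed"
    value: float = 0.0


@dataclass
class Context:
    """A detected situation aggregating recent events."""
    name: str
    events: List[Event] = field(default_factory=list)


class MotivationModule:
    """Tracks an artificial motivation level that the patient's behavior can influence."""
    def __init__(self) -> None:
        self.level = 0.5  # neutral starting motivation

    def update(self, context: Context) -> None:
        if any(e.label == "exercise_completed" for e in context.events):
            self.level = min(1.0, self.level + 0.1)
        elif any(e.label == "exercise_skipped" for e in context.events):
            self.level = max(0.0, self.level - 0.1)


class Planner:
    """Chooses the next assistive action from the current context and motivation."""
    def next_action(self, context: Context, motivation: float) -> str:
        if motivation < 0.4:
            # invite the patient to "take care of" the robot, as in the abstract
            return "ask_for_encouragement"
        if context.name == "therapy_session":
            return "guide_exercise"
        return "monitor_and_chat"


class VerbalInterface:
    """Expresses the robot's state and plans so its activity stays understandable."""
    def express(self, action: str, motivation: float) -> str:
        mood = "motivated" if motivation >= 0.5 else "a bit down"
        return f"[robot] I feel {mood}; next I plan to {action}."


if __name__ == "__main__":
    events = [Event("wearable", "heart_rate_high", 110.0),
              Event("camera", "exercise_completed")]
    context = Context("therapy_session", events)

    motivation = MotivationModule()
    motivation.update(context)

    planner = Planner()
    action = planner.next_action(context, motivation.level)

    print(VerbalInterface().express(action, motivation.level))
```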
Istituto di Calcolo e Reti ad Alte Prestazioni - ICAR
assistive robotics
social robotics
affective computing
cognitive architecture

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/367901