Semantic-Based Annotation and Inference to support Activity Detection / DI SUMMA, Maria. - (2014), pp. 1-50.

Semantic-Based Annotation and Inference to support Activity Detection

Maria di Summa
2014

Abstract

Innovative analysis methods applied to data extracted from off-the-shelf peripherals can provide useful results in activity recognition without requiring large computational resources. In this work, a framework is studied and developed for automated posture and gesture recognition, exploiting depth data provided by a commercial tracking device. The detection problem is handled as a semantic-based resource discovery task. A general data model and a corresponding ontology provide the formal underpinning for automatic posture and gesture annotation via standard Semantic Web languages. Hence, a logic-based matchmaking exploiting non-standard inference services makes it possible to: (i) detect postures by comparing on the fly the retrieved annotations with standard posture descriptions stored as individuals in a dedicated Knowledge Base; (ii) compare subsequent postures in order to describe and recognize gestures; (iii) compare subsequent gestures in order to describe and recognize actions. The framework has been implemented in a prototype tool and experimental tests have been carried out on a reference dataset. Preliminary results indicate the feasibility and usefulness of the proposed approach.
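
The record reports no implementation details, so the following is only a minimal, self-contained sketch of the kind of pipeline the abstract outlines: frame-level annotations (here, plain sets of symbolic descriptors) are matched against reference posture descriptions held in a small knowledge base, and the recognized postures are then chained into a gesture description. All names (Posture, KB, the feature strings) are hypothetical, and simple set containment stands in for the Description Logic matchmaking and non-standard inference services mentioned in the abstract.

from dataclasses import dataclass

@dataclass(frozen=True)
class Posture:
    name: str
    features: frozenset  # symbolic descriptors derived from depth data

# Hypothetical knowledge base of reference posture descriptions.
KB = [
    Posture("Standing",   frozenset({"torso:upright", "legs:extended"})),
    Posture("ArmsRaised", frozenset({"torso:upright", "legs:extended",
                                     "left_arm:raised", "right_arm:raised"})),
    Posture("Sitting",    frozenset({"torso:upright", "legs:bent", "hips:lowered"})),
]

def match_posture(observed: frozenset):
    """Pick the reference posture whose description is best covered by the
    observed annotation; ties go to the more specific description (a rough
    analogue of preferring the most specific matching concept)."""
    def rank(p: Posture):
        coverage = len(p.features & observed) / len(p.features)
        return (coverage, len(p.features))
    best = max(KB, key=rank)
    return best, rank(best)[0]

def describe_gesture(posture_names: list) -> str:
    """Toy gesture description: the ordered sequence of recognized postures."""
    return " -> ".join(posture_names)

if __name__ == "__main__":
    # Frame-level annotations as they might be produced from depth data.
    frames = [
        frozenset({"torso:upright", "legs:extended"}),
        frozenset({"torso:upright", "legs:extended",
                   "left_arm:raised", "right_arm:raised"}),
    ]
    recognized = []
    for observed in frames:
        posture, score = match_posture(observed)
        recognized.append(posture.name)
        print(f"matched {posture.name} (coverage {score:.2f})")
    print("gesture:", describe_gesture(recognized))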
Istituto di Sistemi e Tecnologie Industriali Intelligenti per il Manifatturiero Avanzato - STIIMA (ex ITIA)
Action recognition; resource discovery; semantic-based matchmaking; ubiquitous computing
Prof. Ing. Eugenio DI SCIASCIO


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/324752