
Chloe@University: an indoor, mobile mixed reality guidance system

Repetto M.; Barsocchi P.
2007

Abstract

With the advent of ubiquitous and pervasive computing environments, one of the most promising applications is guidance systems. In this paper, we propose a mobile mixed reality guide system for indoor environments, Chloe@University. A mobile computing device (Sony's Ultra Mobile PC) is hidden inside a jacket, and the user selects a destination inside a building through voice commands. A 3D virtual assistant then appears in the see-through HMD and guides him/her to the destination; the user simply follows the virtual guide. Chloe@University also suggests the most suitable virtual character (e.g. human guide, dog, cat) based on user preferences and profiles. Depending on the user profile, different security levels and content authorizations are foreseen. For indoor location tracking, WiFi, RFID, and sensor-based methods are integrated in the system for maximum flexibility. Moreover, smart and transparent wireless connectivity provides the user terminal with fast and seamless transitions among Access Points (APs). Different AR navigation approaches have been studied: [Olwal 2006], [Elmqvist et al.] and [Newman et al.] work indoors, while [Bell et al. 2002] and [Reitmayr and Drummond 2006] are employed outdoors. Accurate tracking and registration remains an open issue; recently it has been tackled not by any single method but through the aggregation of tracking and localization methods, mostly in handheld AR. A truly wearable, HMD-based mobile AR navigation aid for both indoors and outdoors with rich 3D content remains an open problem and a very active field of multi-disciplinary research.
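The abstract states that WiFi, RFID, and sensor-based tracking methods are integrated for maximum flexibility. A minimal sketch of one common way such heterogeneous position sources can be combined is confidence-weighted averaging; this is an illustration only, not the paper's actual algorithm, and all names and weights below are hypothetical.

```python
# Illustrative sketch (not from the paper): fusing indoor position
# estimates from multiple sources (e.g. WiFi, RFID, inertial sensors)
# by confidence-weighted averaging. All names/values are hypothetical.

def fuse_positions(estimates):
    """Combine (x, y, weight) estimates into one weighted position.

    estimates: list of (x, y, weight) tuples, weight > 0.
    Returns the confidence-weighted mean (x, y) in metres.
    """
    total = sum(w for _, _, w in estimates)
    if total <= 0:
        raise ValueError("need at least one estimate with positive weight")
    x = sum(px * w for px, _, w in estimates) / total
    y = sum(py * w for _, py, w in estimates) / total
    return x, y

# Example: WiFi fingerprinting is coarse (low weight), an RFID tag read
# near a doorway is precise (high weight), dead reckoning is in between.
wifi = (10.0, 4.0, 1.0)
rfid = (12.0, 5.0, 4.0)
imu = (11.0, 4.5, 2.0)
print(fuse_positions([wifi, rfid, imu]))
```

A real deployment would derive the weights from each method's error model (e.g. RSSI variance, tag read range) rather than fixed constants.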
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
978-1-59593-863-3
Real-time systems
Mixed reality
Virtual Human
Localization
Sensor Networks
Files in this product:
prod_120601-doc_130795.pdf — Publisher's Version (PDF), Adobe PDF, 97.64 kB — restricted to authorized users

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/85928