
Smartphone-based augmented reality for end-user creation of home automations

Manca M.; Paternò F.; Santoro C.
2022

Abstract

In the last few years, several end-user tools have been designed to help people who are not professional developers program their smart environments. However, such tools are often based on structured visual editors that provide abstract representations of the available connected sensors and objects, which can be problematic for end users and do little to encourage their participation. This work aims to make the end-user experience of creating everyday automations involving various types of connected sensors and objects more engaging by replacing extensive, static, structured, and comprehensive abstract visual tools with narrower, more relevant, context-sensitive, dynamic, augmented reality-based representations. We present a solution for this purpose that mobile users can exploit through their smartphones. End users can use the smartphone camera to frame the relevant sensor or object through the developed prototype, retrieve the automations currently associated with it, edit their definitions, create new ones, and monitor the automations involving the entire surrounding environment. We also report a first user test of the developed prototype, deployed in a home equipped with connected sensors and objects, which yielded positive feedback.
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
Keywords

End-user development
Everyday automation
Smart home
Files in this record:

prod_462129-doc_180395.pdf
Description: Smartphone-based augmented reality for end-user creation of home automations
Type: Editorial Version (PDF)
Access: authorized users only
License: NOT PUBLIC - private/restricted access
Size: 2.31 MB, Adobe PDF

prod_462129-doc_180394.pdf
Description: Postprint - Smartphone-based augmented reality for end-user creation of home automations
Type: Post-print document
Access: Open Access from 09/01/2023
License: No license declared (not attributable to products after 2023)
Size: 1.35 MB, Adobe PDF

prod_462129-doc_180393.pdf
Description: Preprint - Smartphone-based augmented reality for end-user creation of home automations
Type: Pre-print document
Access: open access
License: No license declared (not attributable to products after 2023)
Size: 1.19 MB, Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/440713
Citations
  • PMC: not available
  • Scopus: 23
  • Web of Science: not available