
Advancing emotion recognition with Virtual Reality: A multimodal approach using physiological signals and machine learning

Paglialonga A.
2025

Abstract

Introduction: Emotion recognition systems have traditionally relied on basic visual elicitation. Virtual reality (VR) offers an immersive alternative that more closely resembles real-world emotional experiences.

Objective: To develop and evaluate custom-built VR scenarios designed to evoke sadness, relaxation, happiness, and fear, and to use physiological signals together with machine learning techniques to predict and classify emotional states.

Methods: Physiological signals (electrocardiogram, blood volume pulse, galvanic skin response, and respiration) were acquired from 36 participants during the VR experiences. Machine learning models, including Logistic Regression with Square Method feature selection, were applied in a subject-independent approach to discern the four emotional states.

Results: Features extracted from the physiological signals revealed significant differences among the emotional states. The machine learning models achieved accuracies of 80%, 85%, and 70% for arousal, valence, and 4-class emotion classification, respectively. Explainable AI techniques provided insight into the models' decision-making and the relevance of specific physiological features, with galvanic skin response peaks emerging as the most significant feature for both the valence and arousal dimensions.

Conclusion: This study demonstrates the efficacy of VR in eliciting genuine emotions and the potential of physiological signals for emotion recognition, with important implications for affective computing and psychological research. The non-invasive approach, robust subject-independent generalizability, and compatibility with wearable technology make this methodology well suited to practical applications in mental health and user experience evaluation.
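The subject-independent evaluation described in the abstract can be sketched as a leave-one-subject-out cross-validation over a Logistic Regression pipeline. This is a minimal illustration, not the authors' code: the synthetic data, feature names, and the chi-square selector are assumptions (the abstract's "Square Method" feature selection may refer to a different scheme), using scikit-learn conventions.

```python
# Hedged sketch: subject-independent 4-class emotion classification with
# Logistic Regression and (assumed) chi-square feature selection.
import numpy as np
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler

rng = np.random.default_rng(0)
n_subjects, trials_per_subject, n_features = 36, 4, 10

# Placeholder feature matrix; in the study, features would come from
# ECG, BVP, GSR (e.g. peak counts), and respiration signals.
X = rng.random((n_subjects * trials_per_subject, n_features))
y = np.tile(np.arange(4), n_subjects)  # sadness, relaxation, happiness, fear
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

# MinMaxScaler keeps features non-negative, as chi2 requires.
clf = make_pipeline(
    MinMaxScaler(),
    SelectKBest(chi2, k=5),
    LogisticRegression(max_iter=1000),
)

# Leave-one-subject-out: each fold tests on a participant whose data
# never appears in training, matching the subject-independent setup.
scores = cross_val_score(clf, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(f"mean accuracy over {len(scores)} held-out subjects: {scores.mean():.2f}")
```

On random placeholder data the accuracy hovers near chance (0.25); the 70% 4-class figure reported in the abstract reflects the real physiological features.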
Istituto di Elettronica e di Ingegneria dell'Informazione e delle Telecomunicazioni - IEIIT
Keywords: Emotion recognition; Explainable AI; Machine learning; Physiological signals; Virtual Reality
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/551883
Warning: the displayed data have not been validated by the institution.

Citations
  • PMC: ND
  • Scopus: 2
  • Web of Science: ND