
Development of a Measurement Procedure for Emotional States Detection Based on Single-Channel Ear-EEG: A Proof-of-Concept Study

Gargiulo, Ludovica
2026

Abstract

Real-time emotion monitoring is increasingly relevant in healthcare, automotive, and workplace applications, where adaptive systems can enhance user experience and well-being. This study investigates the feasibility of classifying emotions along the valence–arousal dimensions of the Circumplex Model of Affect using EEG signals acquired from a single mastoid channel positioned near the ear. Twenty-four participants viewed emotion-eliciting videos and self-reported their affective states using the Self-Assessment Manikin. EEG data were recorded with an OpenBCI Cyton board, and both spectral and temporal features (including power in multiple frequency bands and entropy-based complexity measures) were extracted from the single ear channel. A dual analytical framework was adopted: classical statistical analyses (ANOVA, Mann–Whitney U) and artificial neural networks combined with explainable AI methods (Gradient × Input, Integrated Gradients) were used to identify features associated with valence and arousal. Results confirmed the physiological validity of single-channel ear-EEG and showed that absolute β- and γ-band power, spectral ratios, and entropy-based metrics consistently contributed to emotion classification. Overall, the findings demonstrate that reliable and interpretable affective information can be extracted from minimal EEG configurations, supporting their potential for wearable, real-world emotion monitoring. Nonetheless, practical considerations—such as long-term comfort, stability, and wearability of ear-EEG devices—remain important challenges and motivate future research on sustained use in naturalistic environments.
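To make the abstract's feature-extraction pipeline concrete, the following is a minimal sketch of absolute band-power computation from a single EEG channel via Welch's PSD estimate. It is not the authors' implementation: the band edges, the 250 Hz sampling rate (the OpenBCI Cyton default), and the `band_powers` helper are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

# Assumed parameters (not taken from the paper): Cyton's default 250 Hz
# sampling rate and conventional EEG band edges in Hz.
FS = 250
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_powers(signal, fs=FS):
    """Absolute power per band: Welch PSD, then integration over each band."""
    freqs, psd = welch(signal, fs=fs, nperseg=min(len(signal), fs * 2))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = np.trapz(psd[mask], freqs[mask])
    return powers

# Synthetic example: a 10 s signal dominated by a 20 Hz (beta-band) component.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
x = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.standard_normal(t.size)
p = band_powers(x)
beta_alpha_ratio = p["beta"] / p["alpha"]  # one possible spectral ratio
```

Spectral ratios such as `beta_alpha_ratio` are one way to build the ratio-type features the abstract mentions; the study's actual feature set and entropy measures are described in the full text.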
Istituto di Sistemi e Tecnologie Industriali Intelligenti per il Manifatturiero Avanzato - STIIMA (ex ITIA)
Keywords: EEG; ear-EEG; wearable EEG; emotion recognition; single-channel; physiological measurement; signal processing
Files in this record:
sensors-26-00385-v2.pdf — open access; Type: Publisher's version (PDF); License: Creative Commons; Size: 3.12 MB; Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/562957