Distinguishing Violinists and Pianists Based on Their Brain Signals

Coro G.
2019

Abstract

Many neuropsychology studies have shown that expert musicians who began learning music in childhood exhibit structural brain differences with respect to non-musicians, indicating that early music learning affects brain development. Moreover, musicians' neuronal activity differs depending on the instrument played and on their expertise. These differences can be analysed by processing electroencephalographic (EEG) signals with Artificial Intelligence models. This paper explores the feasibility of building an automatic model that distinguishes violinists from pianists based solely on their brain signals. To this aim, EEG signals of violinists and pianists were recorded while they played classical music pieces, and an Artificial Neural Network was trained on a cloud computing platform to build a binary classifier of segments of these signals. Our model achieves its best classification performance on 20-second EEG segments, although this performance depends on the involved musicians' expertise. In addition, the brain signals of a cellist are shown to be more similar to the violinists' signals than to the pianists' signals. In summary, this paper demonstrates that distinctive information is present in the brain signals of the two types of musicians, and that this information can be detected even by an automatic model working with basic EEG equipment.
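The abstract describes a pipeline in which EEG recordings are cut into fixed-length segments (best results on 20-second segments) that are then fed to a neural-network classifier. The sketch below illustrates only the segmentation and a crude spectral-feature step; the sampling rate, channel count, and feature choice are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

# Assumed parameters (not from the paper): a basic consumer EEG headset
# sampled at 128 Hz with 4 channels; the paper reports best results on
# 20-second segments, which is the only value taken from the abstract.
FS = 128          # samples per second (assumption)
SEGMENT_S = 20    # segment length in seconds
N_CHANNELS = 4    # channel count (assumption)

def segment_eeg(signal, fs=FS, seg_s=SEGMENT_S):
    """Split a (channels, samples) EEG recording into non-overlapping
    segments of seg_s seconds; returns shape (n_segs, channels, seg_len)."""
    seg_len = fs * seg_s
    n_segs = signal.shape[1] // seg_len
    trimmed = signal[:, :n_segs * seg_len]
    return trimmed.reshape(signal.shape[0], n_segs, seg_len).swapaxes(0, 1)

def band_power_features(segments):
    """Mean spectral power per channel as a simple per-segment feature
    vector; a real model would use richer features or raw segments."""
    spectra = np.abs(np.fft.rfft(segments, axis=-1)) ** 2
    return spectra.mean(axis=-1)   # shape (n_segs, channels)

# Example: 65 seconds of synthetic 4-channel EEG yields three
# complete 20-second segments (the trailing 5 seconds are dropped).
recording = np.random.randn(N_CHANNELS, FS * 65)
segments = segment_eeg(recording)
features = band_power_features(segments)
```

Feature vectors like these, labelled "violinist" or "pianist", would then be used to train a binary classifier such as the Artificial Neural Network mentioned in the abstract.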
Istituto di Scienza e Tecnologie dell'Informazione "Alessandro Faedo" - ISTI
978-3-030-30487-4
Artificial neural networks
Brain signals
Music
Files in this product:
  • prod_406911-doc_142441.pdf — Description: Paper; Type: Editorial Version (PDF); Size: 3.9 MB; Format: Adobe PDF; restricted access (authorized users only)
  • prod_406911-doc_142922.pdf — Description: Distinguishing Violinists and Pianists Based on Their Brain Signals; Type: Editorial Version (PDF); Size: 3.63 MB; Format: Adobe PDF; open access

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/361871
Citations
  • Scopus 5