
Multi-task learning in deep neural networks at EVALITA 2018

Cimino A; Dell'Orletta F
2018

Abstract

In this paper we describe the system used for our participation in the ABSITA, GxG, HaSpeeDe and IronITA shared tasks of the EVALITA 2018 conference. We developed a classifier that can be configured to use Bidirectional Long Short-Term Memory networks (Bi-LSTMs) or linear Support Vector Machines as learning algorithms. When using Bi-LSTMs, we tested a multi-task learning approach, which optimizes the network parameters by exploiting all the annotated dataset labels simultaneously, and a multi-classifier voting approach based on a k-fold technique. In addition, we developed generic and specific word embedding lexicons to further improve classification performance. When evaluated on the official test sets, our system ranked first in almost all subtasks of each shared task, showing the effectiveness of our approach.
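To illustrate the multi-task idea mentioned in the abstract, below is a minimal, hypothetical sketch (not the authors' actual system or released code) of a shared Bi-LSTM encoder with one output head per task, trained jointly so that all label sets update the same shared parameters. Task names, label types, vocabulary size, and hyperparameters are assumptions made only for this example.

```python
# Hypothetical sketch of a multi-task Bi-LSTM classifier (Keras).
# Not the authors' implementation; task names and sizes are assumed.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20_000   # assumed vocabulary size
EMBED_DIM = 128       # assumed embedding size (the paper uses word-embedding lexicons)
MAX_LEN = 50          # assumed maximum sequence length

# Shared layers: embedding + Bi-LSTM encoder reused by every task.
tokens = layers.Input(shape=(MAX_LEN,), dtype="int32", name="tokens")
embedded = layers.Embedding(VOCAB_SIZE, EMBED_DIM, mask_zero=True)(tokens)
encoded = layers.Bidirectional(layers.LSTM(64))(embedded)

# Task-specific heads, e.g. hate-speech and irony detection
# (binary labels are an assumption for this sketch).
hate_out = layers.Dense(1, activation="sigmoid", name="hate")(encoded)
irony_out = layers.Dense(1, activation="sigmoid", name="irony")(encoded)

model = Model(inputs=tokens, outputs=[hate_out, irony_out])
model.compile(optimizer="adam",
              loss={"hate": "binary_crossentropy", "irony": "binary_crossentropy"})

# Joint training on toy random data: both label sets update the shared encoder.
x = np.random.randint(1, VOCAB_SIZE, size=(256, MAX_LEN))
y_hate = np.random.randint(0, 2, size=(256, 1))
y_irony = np.random.randint(0, 2, size=(256, 1))
model.fit(x, {"hate": y_hate, "irony": y_irony}, epochs=1, batch_size=32)
```

For the k-fold voting scheme the abstract refers to, one would train a model of this kind on each fold of the training data and combine the per-fold predictions on the test set by majority vote; that ensembling step is omitted from the sketch above.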
DC Field  Value  Language
dc.authority.anceserie CEUR WORKSHOP PROCEEDINGS -
dc.authority.anceserie CEUR Workshop Proceedings -
dc.authority.people Cimino A it
dc.authority.people De Mattei L it
dc.authority.people Dell'Orletta F it
dc.collection.id.s 71c7200a-7c5f-4e83-8d57-d3d2ba88f40d *
dc.collection.name 04.01 Contribution in conference proceedings *
dc.contributor.appartenenza Istituto di linguistica computazionale "Antonio Zampolli" - ILC *
dc.contributor.appartenenza.mi 918 *
dc.date.accessioned 2024/02/21 02:44:11 -
dc.date.available 2024/02/21 02:44:11 -
dc.date.issued 2018 -
dc.description.abstracteng In this paper we describe the system used for the participation to the ABSITA, GxG, HaSpeeDe and IronITA shared tasks of the EVALITA 2018 conference. We developed a classifier that can be configured to use Bidirectional Long Short Term Memories and linear Support Vector Machines as learning algorithms. When using Bi-LSTMs we tested a multitask learning approach which learns the optimized parameters of the network exploiting simultaneously all the annotated dataset labels and a multiclassifier voting approach based on a k-fold technique. In addition, we developed generic and specific word embedding lexicons to further improve classification performances. When evaluated on the official test sets, our system ranked 1st in almost all subtasks for each shared task, showing the effectiveness of our approach. -
dc.description.affiliations Istituto di Linguistica Computazionale "Antonio Zampolli" (ILC), CNR, Pisa; Dipartimento di Informatica, Università di Pisa, Italy -
dc.description.allpeople Cimino A.; De Mattei L.; Dell'Orletta F. -
dc.description.allpeopleoriginal Cimino A.; De Mattei L.; Dell'Orletta F. -
dc.description.fulltext none en
dc.description.numberofauthors 2 -
dc.identifier.scopus 2-s2.0-85058664441 -
dc.identifier.uri https://hdl.handle.net/20.500.14243/392545 -
dc.identifier.url http://www.scopus.com/record/display.url?eid=2-s2.0-85058664441&origin=inward -
dc.language.iso eng -
dc.relation.conferencedate 12-13/12/2018 -
dc.relation.conferencename EVALITA '18, Evaluation of NLP and Speech Tools for Italian -
dc.relation.conferenceplace Torino -
dc.relation.volume 2263 -
dc.subject.keywords Multi-task Learning -
dc.subject.keywords Deep Neural Networks -
dc.subject.singlekeyword Multi-task Learning *
dc.subject.singlekeyword Deep Neural Networks *
dc.title Multi-task learning in deep neural networks at EVALITA 2018 en
dc.type.driver info:eu-repo/semantics/conferenceObject -
dc.type.full 04 Conference contribution::04.01 Contribution in conference proceedings it
dc.type.miur 273 -
dc.type.referee Yes, but type not specified -
dc.ugov.descaux1 434876 -
iris.orcid.lastModifiedDate 2024/03/16 09:43:26 *
iris.orcid.lastModifiedMillisecond 1710578606417 *
iris.scopus.extIssued 2018 -
iris.scopus.extTitle Multi-task learning in deep neural networks at EVALITA 2018 -
iris.scopus.metadataErrorDescription 400 Bad Request: Document ID is not valid -
iris.scopus.metadataErrorType APPLICATION -
iris.scopus.metadataStatus ERROR -
iris.sitodocente.maxattempts 1 -
scopus.authority.anceserie CEUR WORKSHOP PROCEEDINGS###1613-0073 *
scopus.category 1700 *
scopus.contributor.affiliation ItaliaNLP Lab -
scopus.contributor.affiliation Università di Pisa -
scopus.contributor.affiliation ItaliaNLP Lab -
scopus.contributor.afid 60008941 -
scopus.contributor.afid 60028868 -
scopus.contributor.afid 60008941 -
scopus.contributor.auid 57002803800 -
scopus.contributor.auid 57204921228 -
scopus.contributor.auid 57540567000 -
scopus.contributor.country Italy -
scopus.contributor.country Italy -
scopus.contributor.country Italy -
scopus.contributor.dptid 114087935 -
scopus.contributor.dptid 109696702 -
scopus.contributor.dptid 114087935 -
scopus.contributor.name Andrea -
scopus.contributor.name Lorenzo -
scopus.contributor.name Felice -
scopus.contributor.subaffiliation Istituto di Linguistica Computazionale “Antonio Zampolli” (ILC–CNR); -
scopus.contributor.subaffiliation Dipartimento di Informatica; -
scopus.contributor.subaffiliation Istituto di Linguistica Computazionale “Antonio Zampolli” (ILC–CNR); -
scopus.contributor.surname Cimino -
scopus.contributor.surname De Mattei -
scopus.contributor.surname Dell’Orletta -
scopus.date.issued 2018 *
scopus.description.abstract In this paper we describe the system used for the participation to the ABSITA, GxG, HaSpeeDe and IronITA shared tasks of the EVALITA 2018 conference. We developed a classifier that can be configured to use Bidirectional Long Short Term Memories and linear Support Vector Machines as learning algorithms. When using Bi-LSTMs we tested a multitask learning approach which learns the optimized parameters of the network exploiting simultaneously all the annotated dataset labels and a multiclassifier voting approach based on a k-fold technique. In addition, we developed generic and specific word embedding lexicons to further improve classification performances. When evaluated on the official test sets, our system ranked 1st in almost all subtasks for each shared task, showing the effectiveness of our approach. *
scopus.description.allpeopleoriginal Cimino A.; De Mattei L.; Dell'Orletta F. *
scopus.differences scopus.relation.conferencename *
scopus.differences scopus.authority.anceserie *
scopus.differences scopus.publisher.name *
scopus.differences scopus.relation.conferencedate *
scopus.differences scopus.relation.conferenceplace *
scopus.document.type cp *
scopus.document.types cp *
scopus.funding.funders 100007065 - Nvidia; *
scopus.identifier.pui 625516033 *
scopus.identifier.scopus 2-s2.0-85058664441 *
scopus.journal.sourceid 21100218356 *
scopus.language.iso eng *
scopus.publisher.name CEUR-WS *
scopus.relation.conferencedate 2018 *
scopus.relation.conferencename 6th Evaluation Campaign of Natural Language Processing and Speech Tools for Italian. Final Workshop, EVALITA 2018 *
scopus.relation.conferenceplace ita *
scopus.relation.volume 2263 *
scopus.title Multi-task learning in deep neural networks at EVALITA 2018 *
scopus.titleeng Multi-task learning in deep neural networks at EVALITA 2018 *
Appears in collections: 04.01 Contribution in conference proceedings
Files in this product:
There are no files associated with this product.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/392545
Citations
  • PMC: not available
  • Scopus: 4
  • Web of Science: not available