Modeling Word Learning and Processing with Recurrent Neural Networks

Marzi C
2020

Abstract

The paper focuses on what two different types of Recurrent Neural Networks, namely a recurrent Long Short-Term Memory and a recurrent variant of self-organizing memories, a Temporal Self-Organizing Map, can tell us about speakers' learning and processing of a set of fully inflected verb forms selected from the top-frequency paradigms of Italian and German. Both architectures, due to their re-entrant layer of temporal connectivity, can develop a strong sensitivity to sequential patterns that are highly attested in the training data. The main goal is to evaluate the learning and processing dynamics of verb inflection data in the two neural networks by focusing on the effects of morphological structure on word production and word recognition, as well as on word generalization for untrained verb forms. For both models, results show that production and recognition, as well as generalization, are facilitated for verb forms in regular paradigms. However, the two models are differently influenced by structural effects, with the Temporal Self-Organizing Map more prone to adaptively finding a balance between the processing issues of learnability and generalization, on the one side, and discriminability, on the other.
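As a rough illustration of the re-entrant layer of temporal connectivity that the abstract attributes to both architectures, the toy sketch below shows an Elman-style recurrent step in pure Python. It processes a word one character at a time, feeding the hidden state back in at every step. This is a hypothetical, minimal sketch for exposition only: the function names (`recurrent_step`, `encode_word`), the random weight initialization, and the hidden size are all made up here, and the code implements neither the paper's LSTM nor its Temporal Self-Organizing Map.

```python
import math
import random

def one_hot(index, size):
    """Encode one character as a one-hot input vector."""
    v = [0.0] * size
    v[index] = 1.0
    return v

def recurrent_step(x, h_prev, W_in, W_rec):
    """One step of h_t = tanh(W_in @ x_t + W_rec @ h_{t-1}).

    The previous hidden state h_prev re-enters the computation at every
    time step; this recurrent loop is what lets the network accumulate
    a memory of the character sequence seen so far.
    """
    h = []
    for i in range(len(h_prev)):
        s = sum(W_in[i][j] * x[j] for j in range(len(x)))
        s += sum(W_rec[i][j] * h_prev[j] for j in range(len(h_prev)))
        h.append(math.tanh(s))
    return h

def encode_word(word, alphabet, hidden_size=8, seed=0):
    """Serially process a word, character by character, and return the
    final hidden state (here with fixed random, untrained weights)."""
    rng = random.Random(seed)
    n = len(alphabet)
    W_in = [[rng.uniform(-0.5, 0.5) for _ in range(n)] for _ in range(hidden_size)]
    W_rec = [[rng.uniform(-0.5, 0.5) for _ in range(hidden_size)] for _ in range(hidden_size)]
    h = [0.0] * hidden_size
    for ch in word:
        h = recurrent_step(one_hot(alphabet.index(ch), n), h, W_in, W_rec)
    return h
```

In a trained model of this general kind, the weights would be adjusted so that the hidden state supports predicting the next character (production) or discriminating the word (recognition); two inflected forms sharing a stem, e.g. Italian "credo" and "credi", traverse identical states until their endings diverge, which is one intuition behind the paradigm-structure effects the paper investigates.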
Istituto di linguistica computazionale "Antonio Zampolli" - ILC
word-learning
serial word processing
recurrent neural networks
long short-term memories
temporal self-organizing memories
Files in this product:

File: prod_424281-doc_151274.pdf (open access)
Description: Modeling Word Learning and Processing with Recurrent Neural Networks_Marzi_2020
Type: Publisher's version (PDF)
License: Public domain
Size: 574.46 kB
Format: Adobe PDF
Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/409499
Citations
  • PubMed Central: not available
  • Scopus: 2
  • Web of Science (ISI): 1