Discriminative word learning is sensitive to inflectional entropy

Ferro M (co-first author); Marzi C (co-first author); Pirrelli V (co-first author)
2018

Abstract

Psycholinguistic evidence based on inflectional and derivational word families has emphasised the combined role of Paradigm Entropy and Inflectional Entropy in human word processing. Although the way frequency distributions affect behavioural evidence is clear in broad outline, we still miss a clear algorithmic model of how such a complex interaction takes place and why. The main challenge is to understand how the local interaction of learning and processing principles in morphology can result in global effects that require knowledge of the overall distribution of stems and affixes in word families. We show that principles of discriminative learning can shed light on this issue. We simulate learning of verb inflection with a discriminative recurrent network of specialised processing units, whose level of temporal connectivity reflects the frequency distribution of input symbols in context. We analyse the temporal dynamics with which connection weights are adjusted during discriminative learning, to show that self-organised connections are optimally functional to word processing when the distribution of inflected forms in a paradigm (Paradigm Entropy) and the distribution of their inflectional affixes across paradigms (Inflectional Entropy) diverge minimally.
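The two quantities the abstract contrasts can be made concrete with a small numeric sketch. Below, Paradigm Entropy is the Shannon entropy of the frequency distribution of a verb's inflected forms, Inflectional Entropy is the entropy of the corresponding affixes pooled across all paradigms, and their mismatch is measured as relative entropy (Kullback-Leibler divergence). All counts and affix labels are hypothetical toy data, not figures from the paper.

```python
import math

# Hypothetical token counts: inflected forms of one toy verb paradigm,
# and the same affixes summed across all paradigms in a toy lexicon.
paradigm_counts = {"-o": 40, "-i": 25, "-a": 20, "-iamo": 10, "-ate": 3, "-ano": 2}
affix_counts = {"-o": 400, "-i": 300, "-a": 150, "-iamo": 80, "-ate": 40, "-ano": 30}

def distribution(counts):
    """Normalise raw counts into a probability distribution."""
    total = sum(counts.values())
    return {k: v / total for k, v in counts.items()}

def entropy(p):
    """Shannon entropy in bits."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[k] > 0 wherever p[k] > 0."""
    return sum(p[k] * math.log2(p[k] / q[k]) for k in p if p[k] > 0)

p = distribution(paradigm_counts)  # within-paradigm distribution (Paradigm Entropy)
q = distribution(affix_counts)     # cross-paradigm affix distribution (Inflectional Entropy)

print(f"Paradigm Entropy:         {entropy(p):.3f} bits")
print(f"Inflectional Entropy:     {entropy(q):.3f} bits")
print(f"Relative entropy D(p||q): {kl_divergence(p, q):.3f} bits")
```

On the paper's account, processing is easiest when D(p||q) is small, i.e. when a paradigm's internal frequency profile mirrors the affix distribution of the lexicon at large.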
Istituto di linguistica computazionale "Antonio Zampolli" - ILC
discriminative learning
word processing
recurrent neural networks
relative entropy

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/356242
Citations
  • PMC: ND
  • Scopus: 2
  • Web of Science: 4