Discriminative word learning is sensitive to inflectional entropy
Ferro M (co-first author)
Marzi C (co-first author)
Pirrelli V (co-first author)
2018
Abstract
Psycholinguistic evidence based on inflectional and derivational word families has emphasised the combined role of Paradigm Entropy and Inflectional Entropy in human word processing. Although the way frequency distributions affect behavioural evidence is clear in broad outline, we still miss a clear algorithmic model of how such a complex interaction takes place and why. The main challenge is to understand how the local interaction of learning and processing principles in morphology can result in global effects that require knowledge of the overall distribution of stems and affixes in word families. We show that principles of discriminative learning can shed light on this issue. We simulate learning of verb inflection with a discriminative recurrent network of specialised processing units, whose level of temporal connectivity reflects the frequency distribution of input symbols in context. We analyse the temporal dynamics with which connection weights are adjusted during discriminative learning, to show that self-organised connections are optimally functional to word processing when the distribution of inflected forms in a paradigm (Paradigm Entropy) and the distribution of their inflectional affixes across paradigms (Inflectional Entropy) diverge minimally.
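The two measures named in the abstract are Shannon entropies over frequency distributions: Paradigm Entropy over the inflected forms within one paradigm, Inflectional Entropy over the inflectional affixes across paradigms. A minimal sketch of how they are computed (all frequency values below are invented for illustration; the actual distributions come from the training corpus described in the paper):

```python
from math import log2

def shannon_entropy(freqs):
    """Shannon entropy (in bits) of a raw frequency distribution."""
    total = sum(freqs)
    probs = [f / total for f in freqs if f > 0]
    return -sum(p * log2(p) for p in probs)

# Hypothetical token frequencies of the inflected forms of one verb paradigm
# (e.g. walk, walks, walked, walking) -> Paradigm Entropy
paradigm_entropy = shannon_entropy([120, 45, 30, 5])

# Hypothetical frequencies of the same inflectional affixes summed across
# all paradigms in the lexicon -> Inflectional Entropy
inflectional_entropy = shannon_entropy([100, 80, 60, 40])

print(round(paradigm_entropy, 3), round(inflectional_entropy, 3))
```

On this view, the paper's claim that processing is optimal when the two distributions "diverge minimally" can be read as the within-paradigm form distribution approximating the lexicon-wide affix distribution, which a relative-entropy (KL divergence) comparison of the two probability vectors would quantify.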