Storage vs. Processing in Models of Word Inflection. A Neuro-computational Hebbian Perspective

Pirrelli, Vito
2017

Abstract

The advent of connectionism in the 1980s popularised the idea that the lexical processor consists of a network of parallel processing units that fire selectively in response to sensory stimuli. In the light of these assumptions, the most important contribution of connectionism to the theoretical debate on lexical modelling at the time was the outright rejection of the widely accepted idea that word recognition and production require a dichotomous choice between storage and processing. However, in spite of the prima facie psycho-computational allure of this view of the lexicon, early connectionist models also embraced a number of unsatisfactory assumptions about word learning and processing. More recently, a growing number of approaches to inflection in both Psycholinguistics and Theoretical Linguistics have developed the view that surface word relations represent a fundamental domain of morphological competence. On this view, learning the morphology of a language amounts to acquiring relations between fully stored lexical forms, which are concurrently available in the speaker's mental lexicon and jointly facilitate the processing of morphologically related forms through patterns of emergent self-organisation. Such a view presupposes an integrative language architecture in which storage and processing, far from being insulated and poorly interacting modules, are the short-term and long-term dynamics of a single underlying process of adaptive specialisation of synaptic connections. The view is supported by recent evidence on the neuro-anatomical bases of short-term and long-term memory, and crucially hinges on Hebbian principles of synaptic plasticity, which are, in turn, in keeping with mathematical models of discriminative learning. I contend that integrative computer models of Hebbian language learning represent an exciting way forward for current neuro-computational research on word processing, and a persistently fertile legacy of the connectionist revolution.
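
The link the abstract draws between Hebbian plasticity and discriminative learning can be made concrete with a minimal sketch, not taken from the paper itself: a plain Hebbian outer-product update placed next to a Rescorla-Wagner style (delta-rule) discriminative update over cue-outcome associations. All names, dimensions and parameters below are illustrative assumptions, not the author's model.

import numpy as np

# Minimal sketch (not from the paper): Hebbian co-activation learning
# versus error-driven (Rescorla-Wagner / delta-rule) discriminative
# learning over binary cue and outcome vectors.

rng = np.random.default_rng(0)
n_cues, n_outcomes = 8, 4            # e.g. form features -> lexical outcomes
W_hebb = np.zeros((n_cues, n_outcomes))
W_rw = np.zeros((n_cues, n_outcomes))
eta = 0.1                            # learning rate (illustrative value)

for _ in range(1000):
    x = rng.integers(0, 2, n_cues).astype(float)        # active input cues
    y = rng.integers(0, 2, n_outcomes).astype(float)    # observed outcomes

    # Hebbian plasticity: strengthen connections between co-active units.
    W_hebb += eta * np.outer(x, y)

    # Discriminative learning: the update is driven by prediction error,
    # so cues compete and weights converge instead of growing with raw counts.
    error = y - x @ W_rw
    W_rw += eta * np.outer(x, error)

Under repeated presentations, the Hebbian weights track raw co-occurrence frequencies, while the error-driven weights converge towards each cue's conditional predictiveness of the outcomes; this is the formal sense in which Hebbian association can be said to be in keeping with mathematical models of discriminative learning.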
Istituto di linguistica computazionale "Antonio Zampolli" - ILC
Keywords: Hebbian Learning; Recurrent Neural Networks; Word Inflection
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/327041