A Hybrid Adaptive-Resonance-Theory/Error-Back-Propagation (ART/EBP) architecture for supervised learning
V. Rampa
1991
Abstract
One of the most interesting engineering applications of neural networks is the design of recognition architectures for classifying pattern vectors. The architectures are characterized by their topology and, if the pattern vector space partitioning is not predetermined, by the learning algorithm. The complexity of a neural architecture depends on the partitioning of the pattern vector space into class regions and on the learning algorithm. However, the hypothesis of stationary probabilistic characteristics of the input vectors and of the classification correspondence is sometimes over-simplifying. Moreover, each learning algorithm has its specific advantages and limitations, so it is useful to investigate hybrid architectures that exploit the advantages of the individual algorithms while avoiding their limitations. As we are interested in limiting the architecture complexity, we investigate algorithms with a low number of nodes and/or interconnections, which have the additional benefit of a reduced cost of the learning phase. Architectures based on Adaptive Resonance Theory (ART) rely on unsupervised learning and address the stability/plasticity problem: they allow the number of recognition nodes to be augmented when the probabilistic characteristics change, with no loss of the previously acquired recognition capabilities, so that the learning and recognition phases can be purposely intermingled.
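The ART mechanism referred to in the abstract — recruiting a new recognition node whenever no existing node matches the input closely enough, so that earlier categories are never overwritten — can be illustrated with a simplified ART-1-style routine. This is an illustrative sketch, not code from the paper; the function name, the `vigilance` parameter, and the fast-learning update rule are standard simplifications of ART-1 for binary patterns:

```python
import numpy as np

def art1_learn(patterns, vigilance=0.7):
    """Simplified ART-1 clustering of binary patterns.

    Recognition nodes (binary prototypes) are created on demand:
    an input is assigned to an existing node only if it passes the
    vigilance test, otherwise a new node is added. Existing
    prototypes are only ever intersected with matching inputs,
    so previously learned categories are never destroyed
    (stability), while new ones can always be added (plasticity).
    """
    prototypes = []   # one binary prototype per recognition node
    assignments = []  # node index chosen for each input pattern
    for x in patterns:
        x = np.asarray(x, dtype=bool)
        # Choice step: rank existing nodes by overlap with the input.
        order = sorted(range(len(prototypes)),
                       key=lambda j: -int(np.sum(prototypes[j] & x)))
        winner = None
        for j in order:
            # Vigilance test: fraction of the input covered by node j.
            match = np.sum(prototypes[j] & x) / max(int(np.sum(x)), 1)
            if match >= vigilance:
                # Resonance: fast-learning update (intersection).
                prototypes[j] = prototypes[j] & x
                winner = j
                break
        if winner is None:
            # No node matched: recruit a new recognition node.
            prototypes.append(x.copy())
            winner = len(prototypes) - 1
        assignments.append(winner)
    return prototypes, assignments
```

With vigilance near 1 the network creates many narrow categories; lowering it yields fewer, broader ones — which is how the node count (and hence architecture complexity) is traded off against discrimination.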


