DYNAMICS OF NEURAL NETWORKS WITH NONMONOTONE ACTIVATION FUNCTION

MARANGI C; PASQUARIELLO G
1993

Abstract

We analyse an artificial neural network which deviates from biological behaviour in two respects. First, the activation of a generic neuron is not described by a monotone increasing output function. This means that, while artificial neurons modelled on biological behaviour are active when the sum of the postsynaptic potentials exceeds a given threshold and quiescent otherwise, in our model the state of the neuron depends non-monotonically on its argument: the activation function is not an increasing function, but has a more complicated behaviour which reduces to the usual (step or sigmoid) function for particular values of the parameters describing its shape. Second, we adopt as the learning rule of the network a modification of the Hebb rule, namely an iterative algorithm (the Edinburgh algorithm) that constructs a synaptic matrix with a given set of stored memories and given retrieval properties, i.e. given domains of attraction of the stored patterns. The non-monotonicity of the output function yields, for some values of the parameters, a larger storage capacity than in conventional models, whereas the choice of the learning rule makes it possible to control the domains of attraction of the memories.
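The abstract gives no explicit formulas, so the sketch below is only a minimal Python illustration of the two ingredients it describes. The piecewise nonmonotone activation (reversing sign above a threshold h) and the margin-based iterative learning rule (a Gardner-style formulation of the kind associated with the Edinburgh group) are assumptions chosen for concreteness, not the paper's exact definitions; in particular the parameter names h and kappa are hypothetical.

```python
import numpy as np

def nonmonotone_sign(u, h=2.0):
    """Piecewise nonmonotone activation (hypothetical form): behaves like
    sign(u) for |u| <= h and reverses sign above h; as h -> infinity it
    reduces to the usual step function mentioned in the abstract."""
    s = np.sign(u)
    return np.where(np.abs(u) <= h, s, -s)

def train_margin_rule(patterns, kappa=0.5, max_sweeps=1000):
    """Gardner-style iterative rule with stability margin kappa (an assumed
    formulation of the 'Edinburgh algorithm'): apply Hebb-like corrections
    until every stored pattern has an aligned local field of at least kappa
    on every neuron, which controls the domains of attraction."""
    p, n = patterns.shape
    J = np.zeros((n, n))
    for _ in range(max_sweeps):
        stable = True
        for mu in range(p):
            xi = patterns[mu]
            fields = J @ xi                       # local fields h_i
            norms = np.linalg.norm(J, axis=1)
            norms[norms == 0.0] = 1.0             # avoid 0/0 on first sweep
            gamma = xi * fields / norms           # stabilities gamma_i
            weak = gamma < kappa                  # rows below the margin
            if weak.any():
                stable = False
                J[weak] += np.outer(xi[weak], xi) / n   # Hebb-like correction
        np.fill_diagonal(J, 0.0)                  # no self-couplings
        if stable:
            break
    return J

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, p = 100, 10
    patterns = rng.choice([-1.0, 1.0], size=(p, n))
    J = train_margin_rule(patterns, kappa=0.5)

    # Retrieval: corrupt a stored pattern, then iterate S <- f(J S).
    S = patterns[0].copy()
    S[rng.choice(n, size=10, replace=False)] *= -1.0
    for _ in range(20):
        S = nonmonotone_sign(J @ S, h=2.0)
    print("overlap with stored pattern:", np.dot(S, patterns[0]) / n)
```

In this formulation the margin kappa plays the role the abstract assigns to the learning rule: raising it enlarges the domains of attraction of the stored patterns, at the cost of a smaller admissible storage load.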
Keywords: ATTRACTION; MODELS
Files in this item:
There are no files associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/198170
Citations
  • PMC: ND
  • Scopus: ND
  • ISI: 7