DYNAMICS OF NEURAL NETWORKS WITH NONMONOTONE ACTIVATION FUNCTION
Marangi C.; Pasquariello G.
1993
Abstract
We analyse an artificial neural network which deviates from biological behaviour in two respects. First, the activation process of a generic neuron is not described by a monotonically increasing output function. This means that, while artificial neurons modelled on biological behaviour are active when the sum of the postsynaptic potentials exceeds a given threshold and quiescent otherwise, in our model the state of the neuron depends non-monotonically on its argument: the activation function is not an increasing function, but has a more complicated behaviour which reduces to the usual (step or sigmoid) function for particular values of the parameters describing its shape. Second, as the learning rule of the network we adopt a modification of the Hebb rule, namely an iterative algorithm (the Edinburgh algorithm) that constructs a synaptic matrix with a given set of stored memories and given retrieval properties, i.e. given domains of attraction of the stored patterns. The non-monotonicity of the output function results, for some values of the parameters, in a larger storage capacity than in conventional models, whereas the choice of the learning rule makes it possible to control the domains of attraction of the memories.
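To make the two ingredients concrete, the following is a minimal NumPy sketch, not the paper's code: nonmonotone(u, theta) is one illustrative parametrization of a non-monotonic activation that reduces to the usual step function sign(u) as theta grows, and edinburgh_learn implements a Gardner-style perceptron rule with stability margin kappa, which is how the Edinburgh algorithm is usually described; the paper's exact functional forms and parameter choices may differ.

```python
import numpy as np

def nonmonotone(u, theta):
    # One illustrative non-monotonic activation (an assumption, not the
    # paper's exact form): sign-like for |u| < theta, reversed beyond it.
    # As theta -> infinity this reduces to the usual step function sign(u).
    return np.where(np.abs(u) < theta, np.sign(u), -np.sign(u))

def edinburgh_learn(patterns, kappa=0.5, max_sweeps=1000):
    # Gardner-style perceptron learning with stability margin kappa
    # (a sketch of what is commonly called the Edinburgh algorithm):
    # apply Hebbian corrections until every stored pattern is a fixed
    # point whose normalized per-neuron stability is at least kappa.
    p, N = patterns.shape
    J = np.zeros((N, N))
    for _ in range(max_sweeps):
        updated = False
        for mu in range(p):
            xi = patterns[mu]
            h = J @ xi                                  # local fields
            norms = np.linalg.norm(J, axis=1)
            norms[norms == 0] = 1.0                     # guard the first sweep
            weak = xi * h / norms < kappa               # rows below the margin
            if weak.any():
                updated = True
                J[weak] += np.outer(xi[weak], xi) / N   # Hebbian correction
        np.fill_diagonal(J, 0.0)                        # no self-couplings
        if not updated:
            break                                       # all margins satisfied
    return J

# Usage: store random patterns, then retrieve one from a noisy cue with
# synchronous dynamics driven by the non-monotonic activation.
rng = np.random.default_rng(0)
N, p = 200, 40
patterns = rng.choice([-1.0, 1.0], size=(p, N))
J = edinburgh_learn(patterns)
s = patterns[0] * rng.choice([1.0, -1.0], size=N, p=[0.85, 0.15])  # noisy cue
for _ in range(20):
    s = nonmonotone(J @ s, theta=2.0)
print("overlap with stored pattern:", s @ patterns[0] / N)
```

In this sketch a large theta recovers the conventional sign dynamics, while kappa sets the margin that shapes the domains of attraction; the abstract's claim is that suitable finite values of the shape parameters yield a larger storage capacity than the monotone case.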