Global optimization of functions with the Interval Genetic Algorithm
M. Muselli
1992
Abstract
A new evolutionary method for the global optimization of functions of continuous variables is proposed. The algorithm can be viewed as an efficient parallelization of the simulated annealing technique, although a suitable interval coding reveals a close analogy between real-coded genetic algorithms and the proposed method, called the interval genetic algorithm. Well-defined genetic operators yield a considerable improvement in reliability and efficiency over conventional simulated annealing, even on a sequential computer. Results of simulations on Rosenbrock valleys and on cost functions with flat areas or fine-grained local minima are reported. Furthermore, tests on classical problems in the field of neural networks are presented; they illustrate a possible practical application of the interval genetic algorithm.
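As a point of reference for the class of methods the abstract describes, the following is a minimal sketch of a generic real-coded genetic algorithm minimizing the two-dimensional Rosenbrock valley, with each variable confined to a fixed interval. It is not the interval genetic algorithm of the paper: the search box, population size, and the choice of blend crossover, Gaussian mutation, and tournament selection are illustrative assumptions, and confining each coordinate to a box is only a loose echo of the interval coding mentioned above.

```python
import random

def rosenbrock(x):
    """Classic Rosenbrock valley: global minimum 0 at (1, 1)."""
    return (1.0 - x[0]) ** 2 + 100.0 * (x[1] - x[0] ** 2) ** 2

# Illustrative assumptions: search box, population size, operator settings.
BOUNDS = [(-2.048, 2.048), (-2.048, 2.048)]
POP_SIZE = 50
GENERATIONS = 200
MUTATION_SIGMA = 0.1

def clip(x):
    """Keep every coordinate inside its interval."""
    return [min(max(v, lo), hi) for v, (lo, hi) in zip(x, BOUNDS)]

def random_individual():
    """Sample a point uniformly inside the search box."""
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    """Blend (arithmetic) crossover of two parent points."""
    w = random.random()
    return clip([w * ai + (1.0 - w) * bi for ai, bi in zip(a, b)])

def mutate(x):
    """Gaussian perturbation, clipped back into the intervals."""
    return clip([v + random.gauss(0.0, MUTATION_SIGMA) for v in x])

def tournament(pop, k=3):
    """Return the lowest-cost individual among k random picks."""
    return min(random.sample(pop, k), key=rosenbrock)

def run():
    pop = [random_individual() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Build the next generation from selected, recombined, mutated parents.
        pop = [mutate(crossover(tournament(pop), tournament(pop)))
               for _ in range(POP_SIZE)]
    best = min(pop, key=rosenbrock)
    print("best point:", best, "cost:", rosenbrock(best))

if __name__ == "__main__":
    run()
```

The sketch uses only the Python standard library and omits refinements such as elitism or adaptive step sizes; it is meant only to make concrete the kind of population-based search that the abstract contrasts with conventional simulated annealing.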


