A multistart gradient-based algorithm with surrogate model for global optimization

Daniele Peri
2012

Abstract

Gradient-based optimization algorithms are probably the most efficient option for the solution of a local optimization problem. These methods are intrinsically limited to the search for a local optimum of the objective function: if a global optimum is sought, the application of local optimization algorithms can still be successful, provided the algorithm is initialized from a large number of different points in the design space (multistart algorithms). In return, the cost of the exploration increases linearly with the number of starting points. Consequently, multistart local optimization is rarely adopted, mainly for two reasons: i) the large computational cost and ii) the absence of any guarantee about the success of the search (in fact, there is no general indication of the minimum number of starting points needed to guarantee a successful global search). In this paper, techniques for reducing the computational cost of the full process, together with techniques for maximizing the efficiency of a parallel multistart search, are described and tested. Surrogate models are used extensively in order to drastically reduce the computational effort in practical applications, where the cost of a single objective-function evaluation is high. De-clustering techniques are also adopted in order to make the best use of the different local searches.
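The basic multistart idea summarized above can be illustrated with a minimal sketch: launch a local gradient-based optimizer from several randomly chosen points in the design space and keep the best local optimum found. The Python snippet below shows only this skeleton; it is not the paper's algorithm (the surrogate model and the de-clustering of starting points are omitted), and the test function, function names, and parameters are assumptions introduced purely for illustration.

```python
# Illustrative multistart gradient-based search (sketch only; names and
# parameters are assumptions, not the method described in the paper).
import numpy as np
from scipy.optimize import minimize


def rastrigin(x):
    """Standard multimodal test function, standing in for an expensive objective."""
    x = np.asarray(x)
    return 10.0 * x.size + np.sum(x**2 - 10.0 * np.cos(2.0 * np.pi * x))


def multistart_search(objective, bounds, n_starts=20, seed=0):
    """Run a local quasi-Newton optimizer from n_starts random points
    inside the given bounds and return the best local optimum found."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    starts = rng.uniform(lo, hi, size=(n_starts, lo.size))
    best = None
    for x0 in starts:
        res = minimize(objective, x0, method="L-BFGS-B",
                       bounds=list(zip(lo, hi)))
        if best is None or res.fun < best.fun:
            best = res
    return best


if __name__ == "__main__":
    bounds = [(-5.12, 5.12)] * 2
    result = multistart_search(rastrigin, bounds, n_starts=30)
    print("best value:", result.fun, "at", result.x)
```

In a practical setting of the kind the abstract describes, the expensive objective would be replaced (at least partially) by a surrogate model, and the local searches would typically be distributed across parallel processes rather than run in a serial loop as above.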
Istituto di iNgegneria del Mare - INM (ex INSEAN)
Keywords: Global optimization; gradient-based methods; trust region methods; surrogate models.


Use this identifier to cite or link to this document: https://hdl.handle.net/20.500.14243/218742