A Stochastic Adaptive Radial Basis Function Algorithm for Costly Black-Box Optimization



Zhe Zhou1 · Fu-Sheng Bai1

Received: 19 March 2017 / Revised: 20 March 2018 / Accepted: 11 April 2018 © Operations Research Society of China, Periodicals Agency of Shanghai University, Science Press, and Springer-Verlag GmbH Germany, part of Springer Nature 2018

Abstract  In this paper, we present a stochastic adaptive algorithm using radial basis function models for the global optimization of costly black-box functions. The exploration radii in local searches are generated adaptively. Each iteration point is selected from a set of randomly generated trial points according to certain criteria. A restarting strategy is adopted to build a restarting version of the algorithm. The performance of the presented algorithm and of its restarting version is tested on 13 standard numerical examples. The numerical results suggest that the algorithm and its restarting version are very effective.

Keywords  Global optimization · Costly black-box optimization · Radial basis function · Stochastic algorithm

Mathematics Subject Classification  90C26 · 90C56 · 90C59

1 Introduction

In this paper, we consider the following global optimization problem:

(P)    min f(x)
       s.t. −∞ < a_i ≤ x_i ≤ b_i < ∞,  i = 1, ···, d,    (1.1)
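To make the setting concrete, the following is a minimal sketch of a box-constrained black-box problem of the form (1.1). The objective used here is an illustrative stand-in: in practice f is an expensive simulation that returns only function values, with no derivative information.

```python
import numpy as np

# Hypothetical costly black-box objective of the form (1.1); in practice
# f would be an expensive simulation returning only function values.
def f(x):
    return float(np.sum(x**2) + np.sin(5.0 * x[0]))

# Box constraints a <= x <= b defining the feasible set D.
a = np.array([-2.0, -2.0])
b = np.array([2.0, 2.0])

# Check feasibility of a candidate point, then pay for one evaluation.
x = np.array([0.5, -1.0])
assert np.all((a <= x) & (x <= b))
y = f(x)  # one costly evaluation of the black box
```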

Fu-Sheng Bai [email protected]
Zhe Zhou [email protected]

1  School of Mathematical Sciences, Chongqing Normal University, Chongqing 401331, China



where f(x) ∈ R is a deterministic black-box function that is computationally expensive to evaluate, a_i and b_i, i = 1, ···, d, are the lower and upper variable bounds, respectively, and d denotes the problem dimension. We assume that the derivatives of the objective function f are unavailable, which is common in many practical applications. Let a = (a_1, a_2, ···, a_d)^T, b = (b_1, b_2, ···, b_d)^T, and x = (x_1, x_2, ···, x_d)^T; then D := {x ∈ R^d : a ≤ x ≤ b} is the search space, or feasible set, of problem (P). Moreover, we assume that f is continuous on [a, b]. Then problem (P) is guaranteed to have a global minimizer over the compact set D.

One family of methods for solving costly black-box optimization problems is surrogate model-based methods. A surrogate model is generally a cheap approximation of the costly objective function. Different types of surrogate models have been developed, including low-order polynomial regressions [1–3], kriging [4–6], radial basis functions (RBFs) [7,8], and neural networks. Jones et al. [6] developed the efficient global optimization (EGO) method based on kriging interpolation, a well-known global optimization method in which the next iterate is sampled by maximizing the expected improvement in the objective function over the current best solution. Gutmann [9] developed an RBF method in which the next iterate is sampled by minimizing a bumpiness function. Later on, variants of Gutmann's RBF method were developed; see, e.g., [10,11]. Optimization methods based on RBF surrogate models have been developed to deal with const
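As background for the RBF surrogates discussed above, the following is a minimal sketch (not the authors' algorithm) of fitting a cubic RBF interpolant with a linear polynomial tail to a small set of expensive function evaluations; the function names and point set are illustrative assumptions.

```python
import numpy as np

def fit_cubic_rbf(X, fvals):
    """Fit s(x) = sum_i lam_i * ||x - x_i||^3 + c_0 + c^T x
    through the sampled points X (n-by-d) with values fvals (length n)."""
    n, d = X.shape
    # Pairwise distances between sample points and the cubic kernel matrix.
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = r ** 3
    # Linear polynomial tail [1, x_1, ..., x_d].
    P = np.hstack([np.ones((n, 1)), X])
    # Saddle-point system enforcing interpolation plus orthogonality
    # of the RBF coefficients to the polynomial space.
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([fvals, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    lam, c = coef[:n], coef[n:]

    def s(x):
        x = np.atleast_2d(x)
        rr = np.linalg.norm(x[:, None, :] - X[None, :, :], axis=2)
        return rr ** 3 @ lam + c[0] + x @ c[1:]

    return s
```

The surrogate s is cheap to evaluate, so an optimizer can probe it many times (e.g., at randomly generated trial points) while the true f is evaluated only at the selected iterates.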