Expected improvement for expensive optimization: a review
Dawei Zhan · Huanlai Xing

Received: 15 May 2019 / Accepted: 29 June 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract

The expected improvement (EI) algorithm is one of the most popular methods for expensive optimization problems. Over the past twenty years, the EI criterion has been extended to handle a wide range of expensive optimization problems. This paper gives a comprehensive review of the EI extensions designed for parallel optimization, multiobjective optimization, constrained optimization, noisy optimization, multi-fidelity optimization, and high-dimensional optimization. The main challenges of extending the EI approach to these complex optimization problems are pointed out, and the ideas proposed in the literature to tackle them are highlighted. For each reviewed algorithm, the surrogate modeling method, the computation of the infill criterion, and the internal optimization of the infill criterion are carefully studied and compared. In addition, the monotonicity properties of the multiobjective EI criteria and the constrained EI criteria are analyzed in detail. Through this review, we give an organized summary of the EI developments over the past twenty years and show a clear picture of how the EI approach has advanced. At the end of the paper, several interesting open problems and future research topics concerning EI are given.

Keywords: Expected improvement · Parallel computing · Constrained optimization · Multiobjective optimization · Noisy optimization · Multi-fidelity optimization
Dawei Zhan ([email protected]) · Huanlai Xing ([email protected])
School of Information Science and Technology, Southwest Jiaotong University, Chengdu 611756, China

Journal of Global Optimization

1 Introduction

The objectives and constraints of many optimization problems are calculated using computationally expensive simulations or costly physical experiments [135,150,155]. Such problems are often referred to as expensive optimization problems. The number of expensive evaluations is typically restricted to a few hundred or a few thousand. Since the derivatives of computer simulations or physical experiments are difficult and expensive to obtain, derivative-based optimization algorithms are often impractical. Evolutionary algorithms and swarm intelligence algorithms, on the other hand, require a large number of evaluations to produce reliable results; as a result, they are not suitable for expensive optimization problems either. A practical way to tackle expensive optimization problems is to use cheap-to-evaluate surrogate models to approximate the computationally expensive functions [39,131]. In these approaches, an initial surrogate model is first built from a set of previously evaluated samples. In each iteration, an infill sampling criterion (also called an acquisition function) is used to select a candidate point, which is evaluated with the original expensive function and then used to update the surrogate model.
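Under a Gaussian process surrogate, the EI infill criterion discussed in this review has a well-known closed form in terms of the predictive mean and standard deviation at a candidate point. The sketch below illustrates that formula for a minimization problem; the function name and the small guard against zero predictive variance are our own choices for illustration, not part of any specific algorithm reviewed here.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Closed-form EI for minimization.

    mu    : surrogate predictive mean at the candidate point
    sigma : surrogate predictive standard deviation at the point
    f_min : best (lowest) objective value observed so far
    """
    sigma = np.maximum(sigma, 1e-12)  # guard against zero variance
    z = (f_min - mu) / sigma
    # EI = (f_min - mu) * Phi(z) + sigma * phi(z)
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# A point predicted well below the current best has higher EI
# than one predicted at the current best:
ei_promising = expected_improvement(-1.0, 1.0, 0.0)
ei_neutral = expected_improvement(0.0, 1.0, 0.0)
```

EI balances exploitation (low predicted mean) and exploration (high predictive uncertainty): it is always non-negative, and grows with both a lower `mu` and a larger `sigma`, which is why it is a natural infill criterion for the surrogate-update loop described above.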