A multi-fidelity Bayesian optimization approach based on the expected further improvement
RESEARCH PAPER
Leshi Shu 1,2 · Ping Jiang 1 · Yan Wang 2

Received: 7 June 2020 / Revised: 21 September 2020 / Accepted: 26 October 2020

© Springer-Verlag GmbH Germany, part of Springer Nature 2020
Abstract

Sampling efficiency is important for simulation-based design optimization. While Bayesian optimization (BO) has been successfully applied to engineering problems, the cost associated with large-scale simulations has not been fully addressed. Extending standard BO approaches to multi-fidelity optimization allows the information from low-fidelity models to be used to further reduce the optimization cost. In this work, a multi-fidelity Bayesian optimization approach is proposed, in which hierarchical Kriging is used to construct the multi-fidelity metamodel. The proposed approach quantifies the effect of high-fidelity (HF) and low-fidelity (LF) samples in multi-fidelity optimization based on a new concept of expected further improvement. A novel acquisition function is proposed to determine both the location and the fidelity level of the next sample simultaneously, balancing the value of the information provided by the new sample against the associated sampling cost. The proposed approach is compared with several state-of-the-art methods for multi-fidelity global optimization on numerical examples and an engineering case. The results show that the proposed approach obtains globally optimal solutions at reduced computational cost.

Keywords: Bayesian optimization · Efficient global optimization · Multi-fidelity optimization · Hierarchical Kriging model · Sequential sampling · Constrained optimization
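To make the cost-aware acquisition idea from the abstract concrete, the sketch below selects both a sample location and a fidelity level by maximizing improvement per unit sampling cost. This is an illustrative simplification, not the paper's expected-further-improvement formula: the function names, the data layout, and the simple EI-per-cost ratio are assumptions introduced here, and the posterior mean/standard deviation at each fidelity would in practice come from the hierarchical Kriging model.

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """Standard EI for minimization, given the posterior mean/std at a point."""
    sigma = np.maximum(sigma, 1e-12)  # guard against zero predictive variance
    z = (f_min - mu) / sigma
    return (f_min - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def select_location_and_fidelity(candidates, f_min):
    """Pick the (x, fidelity) pair maximizing improvement per unit cost.

    `candidates` maps a fidelity label to a list of (x, mu, sigma, cost)
    tuples, where mu/sigma are that fidelity's posterior prediction at x.
    (Hypothetical interface for illustration only.)
    """
    best, best_score = None, -np.inf
    for level, points in candidates.items():
        for x, mu, sigma, cost in points:
            score = expected_improvement(mu, sigma, f_min) / cost
            if score > best_score:
                best, best_score = (x, level), score
    return best
```

With this ratio, a cheap LF evaluation is preferred over an HF one whenever the predicted improvements are comparable, which captures the cost/value trade-off the abstract describes.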
Responsible Editor: Nestor V. Queipo

* Corresponding author: Yan Wang, [email protected]

1 The State Key Laboratory of Digital Manufacturing Equipment and Technology, School of Mechanical Science and Engineering, Huazhong University of Science and Technology, Wuhan 430074, People's Republic of China

2 Woodruff School of Mechanical Engineering, Georgia Institute of Technology, 813 Ferst Drive NW, Atlanta, GA 30332-0405, USA

1 Introduction

Bayesian optimization (BO) is a metamodel-based global optimization approach in which the search is assisted by iteratively constructing and updating a metamodel, and sequential sampling is guided by an acquisition function that incorporates uncertainty (Ghoreishi and Allaire 2019; Tran et al. 2019b). The metamodel improves search efficiency, while the acquisition-guided sequential sampling reduces the overall number of samples required. This sequential sampling strategy is particularly helpful when high-cost simulations or physical experiments are involved. Different acquisition functions have been developed to balance exploration and exploitation, such as expected improvement (EI), probability of improvement, and the lower confidence bound. BO with the EI acquisition function is also called efficient global optimization (EGO) by some researchers. Similar to other
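The exploration/exploitation balance that EI provides can be seen directly from its closed form, EI(x) = (f_min − μ(x))Φ(z) + σ(x)φ(z) with z = (f_min − μ(x))/σ(x) for minimization. The minimal sketch below (the function name and test points are ours, and the posterior mean/std would normally come from a Kriging model) shows that EI grows both when the predicted mean improves and when the predictive uncertainty grows:

```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_min):
    """EI for minimization: E[max(f_min - Y, 0)] with Y ~ N(mu, sigma^2)."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    ei = np.zeros_like(sigma)
    m = sigma > 0  # EI is zero where the prediction is certain and mu >= f_min
    z = (f_min - mu[m]) / sigma[m]
    ei[m] = (f_min - mu[m]) * norm.cdf(z) + sigma[m] * norm.pdf(z)
    return ei

# Two candidates with the same predicted mean: EI favors the more
# uncertain one (exploration).
ei_explore = expected_improvement([0.9, 0.9], [0.1, 0.5], f_min=1.0)

# Two candidates with the same uncertainty: EI favors the lower
# predicted mean (exploitation).
ei_exploit = expected_improvement([0.5, 0.9], [0.2, 0.2], f_min=1.0)
```

In a full EGO loop, the next sample is placed at the maximizer of EI over the design space, the simulation is run there, and the metamodel is refit before the next iteration.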