METHODOLOGIES AND APPLICATION
Hybrid gradient simulated annealing algorithm for finding the global optimal of a nonlinear unconstrained optimization problem

M. EL-Alem1 · A. Aboutahoun1,2 · S. Mahdi3
© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Communicated by V. Loia.

M. EL-Alem (corresponding author)
[email protected]; [email protected]
A. Aboutahoun
[email protected]
S. Mahdi
[email protected]

1 Department of Mathematics and Computer Science, Faculty of Science, Alexandria University, Alexandria, Egypt
2 Applied Mathematics and Information Science Department, Zewail City of Science and Technology, 6th of October City, Giza, Egypt
3 Educational Research and Development Center, Sanaa, Yemen
Abstract
A new hybrid gradient simulated annealing algorithm is introduced. The algorithm is designed to find the global minimizer of a nonlinear function of many variables. The function is assumed to be smooth. The algorithm uses the gradient method together with a line search to ensure convergence from a remote starting point. It is hybridized with a simulated annealing algorithm to ensure convergence to the global minimizer. The performance of the algorithm is demonstrated through extensive numerical experiments on some well-known test problems. Comparisons of the performance of the suggested algorithm with other meta-heuristic methods are reported. They validate the effectiveness of our approach and show that the suggested algorithm is promising and merits implementation in practice.

Keywords Nonlinear function · Unconstrained minimization · Hybrid algorithm · Global optima · Gradient method · Line search · Meta-heuristics · Simulated annealing · Numerical comparisons · Test problems
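The paper's algorithm is not reproduced in this excerpt, so the following Python sketch only illustrates, under stated assumptions, the general idea the abstract describes: a gradient descent phase with a backtracking (Armijo) line search, interleaved with a simulated annealing perturbation step accepted by a Metropolis rule under a geometric cooling schedule. The function name hybrid_gradient_sa and all parameter values are illustrative choices, not the authors' settings.

import numpy as np

def hybrid_gradient_sa(f, grad, x0, T0=1.0, cooling=0.95, n_outer=200,
                       step_scale=0.5, seed=0):
    # Illustrative hybrid loop (assumed structure, not the paper's exact method):
    # alternate a short gradient/line-search descent phase with a simulated
    # annealing acceptance test on a random perturbation of the current point.
    rng = np.random.default_rng(seed)
    x, T = np.asarray(x0, dtype=float), T0
    best_x, best_f = x.copy(), f(x)
    for _ in range(n_outer):
        # Local phase: a few steepest-descent steps with Armijo backtracking.
        for _ in range(10):
            g = grad(x)
            if np.linalg.norm(g) < 1e-8:
                break
            t = 1.0
            while t > 1e-12 and f(x - t * g) > f(x) - 1e-4 * t * (g @ g):
                t *= 0.5  # backtrack until sufficient decrease holds
            x = x - t * g
        if f(x) < best_f:
            best_x, best_f = x.copy(), f(x)
        # Global phase: random perturbation accepted by the Metropolis rule.
        y = x + step_scale * rng.standard_normal(x.size)
        delta = f(y) - f(x)
        if delta < 0 or rng.random() < np.exp(-delta / T):
            x = y
        T *= cooling  # geometric cooling schedule
    return best_x, best_f

As a usage example, calling hybrid_gradient_sa on a well-known multimodal test function such as Rastrigin's function (with its analytic gradient) from a random starting point returns the best point visited: the descent phase refines each basin it enters, while the annealing step allows escapes from local minimizers.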
1 Introduction

In this paper, the following optimization problem is considered:

$$\operatorname{g\,min}_{x \in \mathbb{R}^{n}} f(x), \qquad f : \mathbb{R}^{n} \to \mathbb{R}, \tag{1}$$
where "g min" means finding the global minimizer. We assume that f(x) is continuously differentiable. Recently, there has been great progress in developing optimization algorithms designed to find the global minimizer of a continuous function. One of the most successful meta-heuristic algorithms is the simulated annealing algorithm. As a matter of fact, the numerical results show that the simulated annealing algorithm is very efficient and effective for finding the global minimizer. See, for example, Ayumi et al. (2016), Certa et al. (2015), Chakraborti and Sanyal (2015), Gonzales et al. (2015), Guodong et al. (2015), Poorjafari et al. (2016), Rere et al. (2014, 2016), Samora et al. (2016), Wang et al. (2013), Xu et al. (2015), Yarmohamadi and Mirhosseini (2015). The gradient method, on the other hand, is an inexpensive algorithm for finding a local minimizer of a continuously differentiable function. It was proved that the gradient method converges locally to a local minimizer (Armijo 1966). If a line search is added to a local method as a globalization strategy, the resulting algorithm is globally convergent to a local minimizer (Bertsekas 1999; Bonnans et al. 2006; Dennis Jr and Schnabel 1996; Fletcher 2013; H
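For concreteness, the line-search globalization mentioned above is commonly realized by backtracking until an Armijo-type sufficient decrease condition holds; the form below, with a constant $c \in (0,1)$, is the textbook version and is given only as an illustration rather than as the exact rule used in the proposed algorithm:

$$x_{k+1} = x_k - t_k \nabla f(x_k), \qquad f\bigl(x_k - t_k \nabla f(x_k)\bigr) \le f(x_k) - c\, t_k \,\|\nabla f(x_k)\|^{2},$$

where the step size $t_k$ is obtained by repeatedly halving an initial trial step until the inequality is satisfied (Armijo 1966). Under standard assumptions, every limit point of the resulting iterates is a stationary point of $f$, which is the sense in which such a globalized gradient method converges from a remote starting point.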