A Discussion on Variational Analysis in Derivative-Free Optimization
Warren Hare1

Received: 23 March 2020 / Accepted: 13 September 2020 / Published online: 25 September 2020
© Springer Nature B.V. 2020
Abstract

Variational Analysis studies mathematical objects under small variations. With regard to optimization, these objects are typified by representations of first-order or second-order information (gradients, subgradients, Hessians, etc.). On the other hand, Derivative-Free Optimization studies algorithms for continuous optimization that do not use first-order information. As such, researchers might conclude that Variational Analysis plays a limited role in Derivative-Free Optimization research. In this paper we argue the contrary by showing that many successful DFO algorithms rely heavily on tools and results from Variational Analysis.

Keywords: Derivative-Free Optimization · Variational Analysis · Direct-search methods · Model-based methods · Order-N accuracy

Mathematics Subject Classification (2010): Primary 49-01 · 90C52; Secondary 49J52 · 65K05
Hare's research is partially supported by the Natural Sciences and Engineering Research Council of Canada (NSERC), Discovery Grant #2018-03865.

Warren Hare, [email protected]
1 Department of Mathematics, University of British Columbia, Kelowna, British Columbia, Canada

1 Introduction

Let us begin by defining Variational Analysis as the study of how mathematical objects (functions, sets, functionals, etc.) behave under small variations. Variational Analysis is the foundation of single-variable Calculus, where small variations are used to define derivatives and integrals. Modern research in Variational Analysis has developed concepts of directional derivatives, subdifferentials, normal and tangent cones, coderivatives, etc. All of these have numerous real-world applications, while simultaneously being fundamentally interesting in a purely mathematical sense.

One major use of Variational Analysis is the field of Mathematical Optimization. At the most basic level, Fermat's theorem links the derivative (gradient) of a function to its minimizers. More advanced concepts, like directional derivatives and subdifferentials, allow the idea of critical points to be extended to nonsmooth functions.

Derivative-Free Optimization (DFO) is the mathematical study of algorithms for continuous optimization that do not use first-order information [8]. While DFO arguably dates back to the 1960s (see, for example, [42, 51, 70]), it was not until 1998 that the term "Derivative-Free Optimization" was first formally used [26]. Since then, DFO has grown enormously. Indeed, since any optimization problem whose objective function is computed via simulation software can be approached by DFO methods, and since simulation software is becoming more common, the value of and interest in DFO are similarly increasing.

By definition, DFO studies algorithms that do not use derivatives, gradients, directional derivatives, subgradients, normal cones, tangent cones, etc. As such, it might seem that Variational Analysis plays a limited role in DFO research.
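To make the first-order conditions alluded to above concrete, the following is a brief sketch of the standard statements (these are classical results, not specific to this paper):

```latex
% Fermat's theorem (smooth case): if x^* is a local minimizer of a
% differentiable function f, then the gradient vanishes there:
\nabla f(x^*) = 0.

% Nonsmooth extension: replacing the gradient with a subdifferential
% \partial f (e.g., the convex or limiting subdifferential), a point
% x^* is declared critical when the subdifferential contains zero:
0 \in \partial f(x^*).
```

For example, for f(x) = |x| the gradient does not exist at the minimizer x* = 0, but the convex subdifferential ∂f(0) = [-1, 1] contains 0, so the nonsmooth condition correctly identifies it as a critical point.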