A Novel Sampling Technique for Multistart-Based Methods



ORIGINAL RESEARCH

Ioannis G. Tsoulos · Evangelos Karvounis · Alexandros Tzallas

Received: 4 September 2020 / Accepted: 5 November 2020
© Springer Nature Singapore Pte Ltd 2020

Abstract
The problem of locating the global minimum of a function is a challenging one with applications in many fields. A common approach to this problem is the so-called multistart method, which is the basis of many modern optimization methods. This article proposes a new sampling technique for multistart-based methods that utilizes an artificial neural network as an approximator of the original objective function. The proposed sampling technique is tested against uniform sampling on a wide set of well-known benchmark optimization problems from the relevant literature, and the results are reported.

Keywords: Global optimization · Sampling techniques · Neural networks

Introduction

A problem that arises frequently is to estimate the global minimum of a multi-dimensional function f(x). The problem is formulated as:

$$x^{*} = \arg\min_{x \in S} f(x), \tag{1}$$

where $S \subset \mathbb{R}^{n}$ is defined by:

$$S = \left[a_1, b_1\right] \otimes \left[a_2, b_2\right] \otimes \cdots \otimes \left[a_n, b_n\right]. \tag{2}$$
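For concreteness, the following Python sketch shows one way to represent the box S of Eq. (2) and to draw points uniformly from it; uniform sampling is the baseline scheme that the proposed technique is later compared against. The Rastrigin test objective and all function names here are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rastrigin(x: np.ndarray) -> float:
    """A common benchmark objective f(x); any objective over S can be substituted."""
    return float(10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def uniform_sample(a: np.ndarray, b: np.ndarray, count: int) -> np.ndarray:
    """Draw `count` points uniformly from S = [a1,b1] x ... x [an,bn]."""
    rng = np.random.default_rng()
    return rng.uniform(low=a, high=b, size=(count, len(a)))

if __name__ == "__main__":
    a = np.full(2, -5.12)   # lower bounds a_i
    b = np.full(2,  5.12)   # upper bounds b_i
    samples = uniform_sample(a, b, 50)
    values = [rastrigin(x) for x in samples]
    print("best sampled value:", min(values))
```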

The estimation of the global minimum finds application in many areas such as physics [1, 2], chemistry [3, 4], economics [5], etc. Over the past years, many methods have been proposed in the relevant literature to tackle the global optimization problem, especially stochastic optimization methods, which are easier to implement and do not require a priori knowledge of the objective function. Examples of stochastic methods are Random Line Search [6], Adaptive Random Search [7], Competitive Evolution [8], Controlled Random Search [9], Simulated Annealing [10], Genetic Algorithms [11], Differential Evolution [12], Tabu Search, the PSO method [13], etc.

This article proposes a new sampling technique for multistart-based methods [14], aimed at enhancing the speed and efficiency of the underlying method. The proposed technique utilizes an artificial neural network [15] as an approximator of the original objective function, and the multistart procedure draws samples using only the output of the neural network. Neural networks are parametric tools that can be used for classification and data-fitting problems. They have been applied with success in many areas such as pattern recognition [16], signal processing [17], astronomy [18], and the solution of ordinary and partial differential equations [19]. A minimal sketch of this sampling idea is shown after the section outline below.

The rest of this article is organized as follows: "Literature Review" provides a literature review of multistart-based methods, "Method Description" gives the description of the proposed method, "Experiments" presents the benchmark functions as well as the experimental results, and finally "Conclusions" draws some conclusions.
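The sketch below illustrates one way such a neural-network-based sampler could work: a small network is fitted to a set of evaluated points, and uniform candidates are then accepted or rejected based on the network's prediction alone, so that only promising start points reach the expensive objective. The acceptance-by-threshold rule, the network size, and all names here are assumptions made for illustration; the exact algorithm is given in "Method Description".

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_surrogate(f, a, b, n_train=200, rng=None):
    """Fit a small neural network that approximates f on the box [a, b]."""
    rng = rng or np.random.default_rng(0)
    X = rng.uniform(a, b, size=(n_train, len(a)))
    y = np.array([f(x) for x in X])
    net = MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0)
    net.fit(X, y)
    return net, y

def neural_sampler(net, a, b, threshold, n_samples, rng=None):
    """Draw uniform candidates and keep those the network predicts to be
    promising (predicted value below `threshold`); only the network output
    is evaluated here, never the original objective."""
    rng = rng or np.random.default_rng(1)
    accepted = []
    # In practice a cap on the number of attempts would also be enforced.
    while len(accepted) < n_samples:
        x = rng.uniform(a, b, size=(len(a),))
        if net.predict(x.reshape(1, -1))[0] <= threshold:
            accepted.append(x)
    return np.array(accepted)
```

In a multistart loop, the points returned by `neural_sampler` would replace uniformly drawn start points, with the threshold taken, for example, as the mean objective value of the training sample.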

Literature Review

The multistart method starts a local search optimizer such as BFGS [20] from