arGA: Adaptive Resolution Micro-genetic Algorithm with Tabu Search to Solve MINLP Problems Using GPU

Abstract In this chapter, we propose a new approach to solving the most general form of global optimization problems, namely, non-convex Mixed-Integer Nonlinear Programming (MINLP) and non-convex Nonlinear Programming (NLP) problems. The target is to solve MINLP/NLP problems from different research domains in fewer fitness evaluations than other state-of-the-art stochastic algorithms in the area. The algorithm is GPU compatible and shows remarkable speedups on nVidia's CUDA-compatible GPUs. MINLP problems involve both discrete and continuous variables together with several active nonlinear equality and inequality constraints, making them extremely difficult to solve. The proposed algorithm is named the adaptive resolution genetic algorithm (arGA) because it adaptively controls the size and resolution of the search space. Using entropy measures, the algorithm adaptively controls the intensity of the genetic search in a given sub-solution space, i.e., promising regions are searched more intensively than others. The algorithm is equipped with an asynchronous adaptive local search operator to further improve performance. Niching is incorporated using a technique inspired by Tabu search: the algorithm reduces the chance of convergence to local optima by maintaining a list of already visited optima and penalizing their neighborhoods. The proposed technique was able to find the best-known solutions to extremely difficult MINLP/NLP problems in fewer fitness evaluations and in a competitive amount of time. The results section discusses the performance of the algorithm and the effect of the different operators using a variety of MINLP/NLP problems from different domains. The GPU implementation shows a speedup of up to 42× for single precision and 20× for double precision on the nVidia C2050 GPU architecture.

A. Munawar · M. Wahib
Graduate School of Information Science & Technology, Hokkaido University, Sapporo, Japan
e-mail: [email protected]; [email protected]

M. Munetomo (✉) · K. Akama
Information Initiative Center, Hokkaido University, Sapporo, Japan
e-mail: [email protected]; [email protected]

S. Tsutsui and P. Collet (eds.), Massively Parallel Evolutionary Computation on GPGPUs, Natural Computing Series, DOI 10.1007/978-3-642-37959-8_5, © Springer-Verlag Berlin Heidelberg 2013

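The Tabu-inspired niching described in the abstract (a list of already visited optima whose neighborhoods are penalized) can be illustrated with a short sketch. The following Python snippet is our own minimal, CPU-only illustration; the class name TabuNiching, the fixed Euclidean radius, and the additive penalty value are assumptions made for illustration and are not the chapter's actual GPU implementation.

    import math

    class TabuNiching:
        """Keep a list of already visited optima and penalize their neighborhoods.
        Simplified, CPU-only sketch of the idea; not the chapter's GPU code."""

        def __init__(self, radius=0.1, penalty=1.0e6):
            self.visited = []       # previously found optima (decision vectors)
            self.radius = radius    # assumed neighborhood size around each visited optimum
            self.penalty = penalty  # assumed fitness penalty applied inside a neighborhood

        def remember(self, optimum):
            # Store a newly found optimum in the tabu list.
            self.visited.append(list(optimum))

        def penalized_fitness(self, candidate, fitness):
            # Add a large penalty if the candidate lies within the radius of any
            # previously visited optimum, discouraging re-convergence to it.
            for opt in self.visited:
                dist = math.sqrt(sum((c - o) ** 2 for c, o in zip(candidate, opt)))
                if dist < self.radius:
                    return fitness + self.penalty
            return fitness

In a minimization setting, the GA would record each local optimum via remember and evaluate individuals through penalized_fitness, steering the search away from neighborhoods that have already been explored.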

1 Introduction

Mixed-Integer Nonlinear Programming (MINLP) problems are the most general form of single-objective global optimization problems. They contain both continuous and integer decision variables and involve nonlinear objective functions and constraints, placing no limit on the complexity of the problems; a general formulation is sketched at the end of this section. MINLPs are difficult to solve because [2]:

1. They involve both discrete (integer) and continuous (real) variables.
2. They involve active equality and inequality constraints.
3. The objective function and constraints are nonlinear, generating potential non-convexities.

Many real-world constrained optimization
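For reference, the general non-convex MINLP addressed in this chapter can be written in the following standard form (the notation below is ours, not quoted from the chapter):

\begin{aligned}
\min_{x \in \mathbb{R}^{n},\; y \in \mathbb{Z}^{m}} \quad & f(x, y) \\
\text{subject to} \quad & g_{j}(x, y) \le 0, \qquad j = 1, \dots, p, \\
& h_{k}(x, y) = 0, \qquad k = 1, \dots, q, \\
& x^{L} \le x \le x^{U}, \qquad y^{L} \le y \le y^{U},
\end{aligned}

where the objective f, the inequality constraints g_j, and the equality constraints h_k may all be nonlinear and non-convex. Fixing the integer variables y reduces the problem to a (possibly non-convex) NLP.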