On the analysis of hyper-parameter space for a genetic programming system with iterated F-Race



METHODOLOGIES AND APPLICATION

Leonardo Trujillo¹ · Ernesto Álvarez González¹ · Edgar Galván² · Juan J. Tapia³ · Antonin Ponsich⁴

© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Abstract

Evolutionary algorithms (EAs) have been with us for several decades and remain highly popular, since they have proved competitive in the face of challenging problem features such as deceptiveness and multiple local optima, among other characteristics. However, a working EA requires that multiple hyper-parameter values be defined, which is a drawback for many practitioners. In the case of genetic programming (GP), an EA for the evolution of models and programs, hyper-parameter optimization has been extensively studied only recently. This work builds on recent findings and explores the hyper-parameter space of a specific GP system, called neat-GP, that controls model size. This is conducted using two large sets of symbolic regression benchmark problems to evaluate system performance, while hyper-parameter optimization is carried out using three variants of the iterated F-Race algorithm, applied here to GP for the first time. Several findings are drawn from the automatic parametrizations produced by the optimization process. Automatic parametrizations do not outperform the manual configuration in many cases, and overall the differences in testing error are not substantial. Moreover, finding parametrizations that produce highly accurate models that are also compact is not trivial, at least when the hyper-parameter optimization process (F-Race) is guided only by predictive error. This work is intended to foster more research on, and scrutiny of, hyper-parameters in EAs in general and GP in particular.

Keywords: Hyper-parameter optimization · Iterated F-Race · Genetic programming

1 Introduction

[...] mating rates, number of generations and others.¹ In all metaheuristic search techniques, the proper setting of hyper-parameter values can be a difficult and tedious endeavor (Birattari 2009). This is also true for EAs, which often lack a theoretical foundation from which their optimal parametrization could be derived analytically, with few exceptions (Hansen and Ostermeier 2001). This work focuses on hyper-parameter tuning or optimization (Birattari 2009; Neumüller et al. 2012), also referred to as meta-optimization, which is performed offline, in contrast to hyper-parameter control, where an EA, or

¹ Hyper-parameters are also referred to simply as parameters, but the distinction between parameters and hyper-parameters is important, particularly when the EA is performing a learning process, searching for models that might also include their own parameters.
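The offline tuning setting described above can be illustrated with a racing procedure in the spirit of F-Race. The sketch below is illustrative only: `frace`, `evaluate`, and `rank_gap` are hypothetical names introduced here, and the actual F-Race eliminates candidates using the Friedman test plus post hoc comparisons, which this sketch simplifies to a mean-rank gap against the current leader.

```python
import statistics

def frace(candidates, evaluate, instances, min_survivors=2, rank_gap=1.0):
    """Simplified racing loop in the spirit of F-Race (a sketch, not the
    actual irace implementation). candidates: list of hyper-parameter
    settings; evaluate(config, instance): error to minimize. A candidate
    is dropped when its mean rank trails the leader's by more than
    rank_gap; the real algorithm uses the Friedman test instead."""
    alive = list(range(len(candidates)))
    results = [[] for _ in candidates]
    for inst in instances:
        # Evaluate every surviving candidate on the next problem instance.
        for i in alive:
            results[i].append(evaluate(candidates[i], inst))
        if len(alive) <= min_survivors:
            continue  # nothing left to eliminate
        n_seen = len(results[alive[0]])
        # Mean rank of each survivor over the instances seen so far.
        mean_rank = {i: 0.0 for i in alive}
        for t in range(n_seen):
            for r, i in enumerate(sorted(alive, key=lambda j: results[j][t])):
                mean_rank[i] += (r + 1) / n_seen
        alive.sort(key=mean_rank.get)
        best = mean_rank[alive[0]]
        keep = [i for i in alive if mean_rank[i] - best <= rank_gap]
        alive = keep if len(keep) >= min_survivors else alive[:min_survivors]
    # Among the survivors, return the setting with the lowest mean error.
    winner = min(alive, key=lambda i: statistics.mean(results[i]))
    return candidates[winner]

# Toy example: tune a single "mutation rate" against a known optimum of 0.3.
best = frace([0.1, 0.3, 0.8],
             lambda c, inst: (c - 0.3) ** 2 + 0.001 * inst,
             range(6))
```

Iterated variants, such as those studied in this paper, wrap this race in an outer loop that resamples new candidate configurations around the survivors of the previous race.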

Communicated by V. Loia.

1. Tecnológico Nacional de México/IT de Tijuana, Tijuana, BC, Mexico (Leonardo Trujillo · Ernesto Álvarez González)

2. Department of Computer Science, Maynooth University, Maynooth, Ireland

3. Instituto Politécnico Nacional - CITEDI, Av. Instituto Politécnico Nacional No. 131