A Genetic Designed Beta Basis Function Neural Network for Approximating Multi-Variables Functions




Chaouki Aouiti*, Adel M. Alimi†, Aref Maalej*

*LASEM: Laboratory of Electromechanical Systems, University of Sfax, ENIS, Department of Mechanical Engineering, BP W - 3038, Sfax, Tunisia. e-mail: [email protected]; [email protected]
†REGIM: Research Group on Intelligent Machines, University of Sfax, ENIS, Department of Electrical Engineering, BP W - 3038, Sfax, Tunisia. e-mail: [email protected]

Abstract. We propose in this paper a new genetic algorithm for Beta basis function neural networks (BBFNN). The properties of this genetic algorithm are the representation used and the ability to obtain the optimal structure of the BBFNN for approximating a multi-variable function. Each network is coded as a matrix for which the number of rows is equal to the number of parameters in the function. The genetic algorithm operators change the number of neurons in the hidden layer. Applications to functions of one and two variables are considered to demonstrate the performance of the BBFNN and of its genetic-algorithm-based design.
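The excerpt does not include the encoding in code, but the idea from the abstract can be sketched as follows. This is a minimal Python illustration, assuming a one-dimensional Beta function with parameters (x0, x1, p, q) per hidden neuron plus an output weight; the function `beta`, the class `BetaChromosome`, and this exact parameter set are illustrative assumptions, not the authors' formulation. With one column per hidden neuron, adding or deleting a column is precisely a change of the hidden-layer size, which is what the genetic operators manipulate:

```python
import numpy as np

def beta(x, x0, x1, p, q):
    """One-dimensional Beta basis function, in the form commonly used in
    the BBFNN literature (an assumption; this paper's exact definition is
    not shown in the excerpt). Support is (x0, x1), with centre
    xc = (p*x0 + q*x1) / (p + q)."""
    xc = (p * x0 + q * x1) / (p + q)
    out = np.zeros_like(x, dtype=float)
    inside = (x > x0) & (x < x1)
    xi = x[inside]
    out[inside] = ((xi - x0) / (xc - x0)) ** p * ((x1 - xi) / (x1 - xc)) ** q
    return out

class BetaChromosome:
    """Matrix encoding sketched from the abstract: one row per parameter
    (here x0, x1, p, q plus an output weight w), one column per hidden
    neuron, so structural operators only change the column count."""

    def __init__(self, genes):
        self.genes = np.asarray(genes, dtype=float)  # shape (5, n_neurons)

    def add_neuron(self, column):
        # Structural GA operator: grow the hidden layer by one neuron.
        self.genes = np.column_stack([self.genes, column])

    def remove_neuron(self, j):
        # Structural GA operator: delete hidden neuron j.
        self.genes = np.delete(self.genes, j, axis=1)

    def predict(self, x):
        # Network output: weighted sum of the Beta activations.
        x0, x1, p, q, w = self.genes
        return sum(wi * beta(x, a, b, pi, qi)
                   for a, b, pi, qi, wi in zip(x0, x1, p, q, w))

# Usage: a 2-neuron network on [0, 1], then grown to 3 neurons.
net = BetaChromosome([[0.0, 0.4],    # x0
                      [0.6, 1.0],    # x1
                      [2.0, 2.0],    # p
                      [2.0, 2.0],    # q
                      [1.0, -0.5]])  # w
net.add_neuron([0.2, 0.8, 3.0, 1.0, 0.7])
y = net.predict(np.linspace(0.0, 1.0, 5))
```

In this reading, a structural mutation is a call to `add_neuron` or `remove_neuron` with a randomly drawn column, leaving the row semantics (one row per parameter) untouched.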

1 Introduction

GAs are useful for dealing with large, complex problems that have many local optima. GAs are therefore used to perform various tasks in the design of a neural network, such as connection weight training and architecture design. The first step in the implementation of a GA is the choice of the representation. If the goal is the evolution of connection weights, two methods are used. In the first representation, used in canonical GAs, binary strings encode candidate solutions [4]. The second method is the real-number representation, i.e., one real number per connection weight [5], [6]. GAs are used for the evolution of architectures for several reasons: the surface on which each point represents an architecture is infinitely large; the surface is non-differentiable, since changes in the number of nodes or connections are discrete; the surface is deceptive, since similar architectures may have quite different performance; and the surface is multimodal, since different architectures may have similar performance [7]. When GAs are used for the evolution of architectures, we must decide how much information about the architecture is encoded in the chromosome. In the first method, the chromosome specifies all the connection weights and nodes of the architecture; this method is called direct encoding. In the second method, called indirect encoding, only the most important parameters of the architecture are encoded. For the BBFNN, we used in [3] a real-number representation. In this paper, we present a new method and show that a GA can be used to approximate a function of several variables. The rest of the paper is organized as follows: Section 2 describes the BBFNN; Section 3 presents our GA for the design of the BBFNN; experimental results and discussion are given in Section 4.
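The two weight representations mentioned above can be contrasted in a few lines of Python. This is a minimal sketch: the 16-bit width, the weight range [-5, 5], and the helper names are assumptions for illustration, not details taken from [4], [5], or [6]:

```python
import numpy as np

BITS = 16            # assumed bit width per weight
LO, HI = -5.0, 5.0   # assumed weight range

def encode_binary(weights):
    """Canonical-GA style: each real weight quantised to a BITS-bit string."""
    scaled = np.round((np.asarray(weights) - LO) / (HI - LO) * (2**BITS - 1))
    return "".join(format(int(s), f"0{BITS}b") for s in scaled)

def decode_binary(bitstring):
    chunks = [bitstring[i:i + BITS] for i in range(0, len(bitstring), BITS)]
    return [LO + int(c, 2) / (2**BITS - 1) * (HI - LO) for c in chunks]

# Real-number representation: the chromosome simply *is* the weight vector.
w = [0.37, -1.25, 2.0]
bits = encode_binary(w)      # binary-string chromosome of length 3 * BITS
print(decode_binary(bits))   # recovers w up to quantisation error
```

The binary string must be decoded before every fitness evaluation and loses precision to quantisation, which is one reason the real-number representation of [5] and [6] is often preferred for weight evolution.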

2 Beta Basis Function Neural Network
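The body of Section 2 is cut off in this excerpt. For orientation only, the BBFNN literature commonly handles a multi-variable input by taking a product of one-dimensional Beta functions, one per input dimension; the following formulation is an assumption consistent with that literature, not a quotation from this paper:

```latex
% Assumed multi-dimensional Beta basis function (product of 1-D Beta
% functions over the d input dimensions) and the network output with
% N hidden neurons.
\[
\beta_i(\mathbf{x}) \;=\; \prod_{j=1}^{d}
  \beta\!\bigl(x^{j};\, x_{0,i}^{j},\, x_{1,i}^{j},\, p_i^{j},\, q_i^{j}\bigr),
\qquad
y(\mathbf{x}) \;=\; \sum_{i=1}^{N} w_i\,\beta_i(\mathbf{x}).
\]
```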