Learning Parameters in Deep Belief Networks Through Firefly Algorithm
1 Department of Computing, São Paulo State University, São Paulo, Brazil
[email protected], [email protected], [email protected]
2 Department of Computing, Federal University of São Carlos, São Carlos, Brazil
[email protected], [email protected]
3 School of Science and Technology, Middlesex University, London, UK
[email protected]
Abstract. Restricted Boltzmann Machines (RBMs) are among the most widely pursued techniques in the context of deep learning-based applications. Their usage enables various parallel implementations, which have become pivotal in today's large-scale applications. In this paper, we propose to address the main shortcoming of such models, i.e. how to properly fine-tune their parameters, by means of the Firefly Algorithm, and we also consider Deep Belief Networks, a stacked version of RBMs. Additionally, we take into account Harmony Search, Improved Harmony Search and the well-known Particle Swarm Optimization for comparison purposes. The results show that the Firefly Algorithm is well suited to the context addressed in this paper, since it obtained the best results on all datasets.
Keywords: Deep Belief Networks · Deep learning · Firefly algorithm

1 Introduction
Even today, there are still open computer vision problems concerning how to create and produce good representations of the real world, such as machine learning systems that can detect and further classify objects [4]. These techniques have been paramount in recent years, since a growing number of applications require intelligent decision-making processes. Pattern recognition techniques related to deep learning have drawn considerable interest in recent years [3], since their outstanding results have set a benchmark for several applications, such as speech, face and emotion recognition, among others.

Roughly speaking, deep learning algorithms are shaped by means of several layers of a predefined set of operations. Restricted Boltzmann Machines (RBMs), for instance, have attracted considerable attention in recent years due to their simplicity, high level of parallelism and strong representation ability [7]. RBMs can be interpreted as stochastic neural networks, being mainly used for image reconstruction and collaborative filtering through unsupervised learning [2]. Later on, Hinton et al. [8] realized that one can obtain more complex representations by stacking a few RBMs on top of each other, thus leading to the so-called Deep Belief Networks (DBNs). One of the main shortcomings of DBNs concerns the proper calibration of their parameters. The task of fine-tuning parameters in machine learning aims at finding suitable values for those parameters in order to maximize some fitness function.
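To make this fine-tuning setup concrete, the sketch below shows how a basic Firefly Algorithm can search a box-constrained space of RBM/DBN hyperparameters by minimizing a surrogate objective. It is a minimal sketch, not the authors' implementation: the function names, the hyperparameter choice (learning rate, momentum, weight decay) and the placeholder objective are illustrative assumptions.

```python
import numpy as np

# Hypothetical stand-in for the real objective: train an RBM/DBN with the
# candidate hyperparameters and return its reconstruction error on held-out
# data (lower is better). A synthetic bowl-shaped function is used here only
# so the sketch runs end to end.
def rbm_reconstruction_error(params):
    target = np.array([0.1, 0.5, 0.0002])  # arbitrary "good" setting
    return float(np.sum((params - target) ** 2))


def firefly_search(objective, bounds, n_fireflies=15, n_iter=50,
                   alpha=0.2, beta0=1.0, gamma=1.0, seed=0):
    """Minimize `objective` over the box `bounds` with a basic Firefly Algorithm."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)

    # Random initial swarm and its fitness (brightness corresponds to lower error).
    swarm = rng.uniform(lo, hi, size=(n_fireflies, dim))
    fit = np.array([objective(x) for x in swarm])

    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if fit[j] < fit[i]:  # firefly j is brighter, so i moves toward it
                    r2 = np.sum((swarm[i] - swarm[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)  # attractiveness decays with distance
                    step = alpha * (rng.random(dim) - 0.5) * (hi - lo)
                    swarm[i] += beta * (swarm[j] - swarm[i]) + step
                    swarm[i] = np.clip(swarm[i], lo, hi)
                    fit[i] = objective(swarm[i])

    best = int(np.argmin(fit))
    return swarm[best], fit[best]


if __name__ == "__main__":
    # Assumed search space (an illustration, not the paper's exact ranges):
    # learning rate in [0, 1], momentum in [0, 1], weight decay in [0, 0.001].
    bounds = np.array([[0.0, 1.0], [0.0, 1.0], [0.0, 0.001]])
    best_params, best_err = firefly_search(rbm_reconstruction_error, bounds)
    print("best hyperparameters:", best_params, "surrogate error:", best_err)
```

In practice, evaluating a candidate typically means training the RBM (or each layer of the DBN) with the sampled hyperparameters and scoring its reconstruction error, which is the costly step the meta-heuristic wraps around.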