WOLIF: An efficiently tuned classifier that learns to classify non-linear temporal patterns without hidden layers
Irshed Hussain¹ · Dalton Meitei Thounaojam¹

¹ National Institute of Technology Silchar (Computer Science and Engineering), Silchar, 788010, Assam, India
Correspondence: Irshed Hussain, [email protected]

Accepted: 7 September 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract

We present in this paper WOLIF, a computationally efficient and biologically plausible classifier that uses an error function obtained from a Leaky Integrate-and-Fire (LIF) spiking neuron and tuned by the Grey Wolf Optimizer (GWO). Unlike the traditional artificial neuron, a spiking neuron can classify non-linear temporal patterns without hidden layer(s), which makes a Spiking Neural Network (SNN) computationally efficient: there is no additional cost of adding hidden layer(s), and the network is also biologically plausible and energy efficient. Since supervised learning rules for SNNs are still in their infancy, we introduce the WOLIF classifier and its supervised learning rule based on the GWO algorithm. WOLIF uses a single LIF neuron, thereby using fewer network parameters, and homo-synaptic static long-term synaptic weights (both excitatory and inhibitory). WOLIF also reduces the total simulation time, which improves computational efficiency. It is benchmarked on seven datasets drawn from the UCI machine learning repository and achieves better results, in terms of both accuracy and computational cost, than state-of-the-art methods.

Keywords: LIF neuron · Hidden layer · GWO · Temporal pattern · Non-linear · Static long-term plasticity
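The abstract's central mechanism is the LIF neuron: input current is leakily integrated on a membrane potential, and a spike is emitted whenever the potential crosses a threshold, so the timing of spikes can encode non-linear temporal structure. The following minimal sketch (our illustration, not the authors' code) shows this dynamic under Euler integration; all constants (tau_m, v_th, etc.) are illustrative assumptions, not values from the paper:

```python
# Minimal sketch of a Leaky Integrate-and-Fire (LIF) neuron.
# All constants below are illustrative assumptions, not the paper's settings.
import numpy as np

def lif_spike_times(input_current, dt=1.0, tau_m=10.0,
                    v_rest=0.0, v_reset=0.0, v_th=1.0):
    """Return spike times of a LIF neuron driven by input_current."""
    v = v_rest
    spikes = []
    for step, i_t in enumerate(input_current):
        # Leaky integration: dv/dt = (-(v - v_rest) + i_t) / tau_m
        v += dt * (-(v - v_rest) + i_t) / tau_m
        if v >= v_th:            # threshold crossing -> emit spike
            spikes.append(step * dt)
            v = v_reset          # hard reset after firing
    return spikes

# Example: a constant supra-threshold current yields a regular spike train.
print(lif_spike_times(np.full(100, 1.5)))
```

In a classifier such as WOLIF, the output would be derived from spike times like these; the sketch shows only the membrane dynamics.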
1 Introduction

The sigmoidal neural network, referred to as the second generation of Artificial Neural Network (ANN) [1], is enjoying a great time in the field of computational intelligence, especially in the domain of classification, owing to its capability of handling non-linear data and its well-known learning algorithm, back-propagation. Moreover, with the flow of time, other algorithms based on metaheuristic approaches have been developed in the field of ANN to improve the optimisation of synaptic weights, such as training an ANN with Asexual Reproduction Optimization (ARO) [2], where ARO is first applied to perform a global search for optimum synaptic weights and back-propagation is then applied to reduce the overall training error, and training an ANN with the Butterfly Optimization Algorithm (BOA) [3], where BOA is applied to speed up the convergence rate and to reduce the risk of stagnating in a local minimum.
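These metaheuristic trainers, like the GWO on which WOLIF's learning rule is built, share one pattern: candidate weight vectors are moved through the search space toward lower error without using gradients. A minimal sketch of the canonical GWO update illustrates the pattern; the error function, bounds, and hyper-parameters below are placeholders, not the paper's settings:

```python
# Minimal sketch of the Grey Wolf Optimizer minimising a generic error
# function. Hedged illustration of GWO-style weight tuning, not the
# authors' implementation; error_fn, bounds, and hyper-parameters are
# placeholders.
import numpy as np

def gwo_minimize(error_fn, dim, n_wolves=20, n_iters=100,
                 lb=-1.0, ub=1.0, seed=0):
    rng = np.random.default_rng(seed)
    wolves = rng.uniform(lb, ub, size=(n_wolves, dim))
    for t in range(n_iters):
        fitness = np.array([error_fn(w) for w in wolves])
        # The three best wolves (alpha, beta, delta) guide the pack.
        alpha, beta, delta = wolves[np.argsort(fitness)[:3]]
        a = 2.0 - 2.0 * t / n_iters          # linearly decays from 2 to 0
        for i in range(n_wolves):
            new_pos = np.zeros(dim)
            for leader in (alpha, beta, delta):
                r1, r2 = rng.random(dim), rng.random(dim)
                A = 2.0 * a * r1 - a         # exploration/exploitation factor
                C = 2.0 * r2
                D = np.abs(C * leader - wolves[i])
                new_pos += leader - A * D
            wolves[i] = np.clip(new_pos / 3.0, lb, ub)
    fitness = np.array([error_fn(w) for w in wolves])
    return wolves[np.argmin(fitness)]

# Example: recover a weight vector minimising a toy quadratic error.
print(gwo_minimize(lambda w: np.sum((w - 0.3) ** 2), dim=5))
```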
In addition, several other evolutionary algorithms have been used to train ANNs efficiently for classification problems, including some neuroevolution-based methods for specific robotic tasks; a review of various such algorithms is given in [4]. However, the traditional rate-coding concept of sharing information among neurons in an ANN through synapses has been shown to be implausible in neuroscience [5, 6], and because of this the ANN fails