Exploratory Data Analysis and Foreground Detection with the Growing Hierarchical Neural Forest
Esteban J. Palomo 1,2 · Ezequiel López-Rubio 1,2 · Francisco Ortega-Zamorano 1,2 · Rafaela Benítez-Rochel 1,2

Accepted: 24 September 2020
© Springer Science+Business Media, LLC, part of Springer Nature 2020
Abstract

In this paper, a new self-organizing artificial neural network called the growing hierarchical neural forest (GHNF) is proposed. The GHNF is a hierarchical model based on the growing neural forest, a tree-based model that learns a set of trees (a forest) instead of a general graph, so that the forest can grow in size. In this way, the GHNF addresses three important limitations of the self-organizing map: fixed size, fixed topology, and lack of a hierarchical representation of the input data. Hence, the GHNF is especially well suited to datasets containing clusters where each cluster has a hierarchical structure, since each tree of the GHNF forest can adapt to one of the clusters. Experimental results show the effectiveness of our proposal in terms of self-organization and clustering capabilities. In particular, it has been applied to text mining of tweets as a typical exploratory data analysis application, where a hierarchical representation of the concepts present in the tweets has been obtained. Moreover, it has been applied to foreground detection in video sequences, outperforming several methods specialized in foreground detection.

Keywords Self-organization · Clustering · Text mining · Image segmentation
Correspondence: Esteban J. Palomo, [email protected]

1 Department of Computer Languages and Computer Science, University of Málaga, Bulevar Louis Pasteur 35, 29071 Málaga, Spain

2 Biomedical Research Institute of Málaga (IBIMA), C/ Doctor Miguel Díaz Recio, 28, 29010 Málaga, Spain

1 Introduction

Artificial neural networks have been extensively used for exploratory data analysis applications such as clustering, data mining, document retrieval, and image segmentation. These kinds of artificial neural networks are different from convolutional neural networks (CNNs), which are commonly used for supervised learning tasks such as image classification or object detection [14,22,42]. Among these artificial neural networks, self-organizing neural networks have been successfully applied to a wide variety of unsupervised learning tasks [8,11,20,43], where a cooperation mechanism among neurons is included besides the Hebbian competition mechanism. The best known self-organizing neural model is the self-organizing map (SOM) [19], which maps high-dimensional data onto a two-dimensional representation space, usually a rectangular or hexagonal grid. The projection preserves the topology of the data, so that similar data items are mapped to nearby locations on the map. However, some inherent properties of this neural model may be too restrictive for specific applications. First, the size of the map (number of neurons) has to be established in advance. Second, the fixed SOM topology (rectangular or hexagonal grid) can be unsuited to the intrinsic structure of the input data.
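As an illustration of the competition and cooperation mechanisms just described, the following is a minimal SOM training sketch in NumPy. It is not the algorithm proposed in this paper or the code of any particular SOM library; the function name train_som, the grid dimensions, and the exponentially decaying learning-rate and neighborhood schedules are illustrative assumptions.

```python
# Minimal self-organizing map (SOM) training sketch (illustrative, not from the paper).
import numpy as np

def train_som(data, rows=10, cols=10, epochs=20, lr0=0.5, sigma0=3.0, seed=0):
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    # Prototype vector for each neuron on a rows x cols grid.
    weights = rng.random((rows, cols, n_features))
    # Grid coordinates of every neuron, used to measure map distances.
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij"), axis=-1)
    n_steps = epochs * len(data)
    t = 0
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Competition: find the best matching unit (BMU) by Euclidean distance.
            dists = np.linalg.norm(weights - x, axis=2)
            bmu = np.unravel_index(np.argmin(dists), dists.shape)
            # Annealing: learning rate and neighborhood radius decay over time.
            lr = lr0 * np.exp(-t / n_steps)
            sigma = sigma0 * np.exp(-t / n_steps)
            # Cooperation: Gaussian neighborhood centered on the BMU in grid space.
            grid_dist2 = np.sum((grid - np.array(bmu)) ** 2, axis=2)
            h = np.exp(-grid_dist2 / (2.0 * sigma ** 2))
            # Adaptation: move prototypes toward the sample, weighted by the neighborhood.
            weights += lr * h[..., None] * (x - weights)
            t += 1
    return weights
```

Calling train_som on a data matrix of shape (n_samples, n_features) returns a rows x cols grid of prototype vectors in which nearby neurons hold similar prototypes, which is the topology-preserving property mentioned above. Note that the grid shape (rows, cols) must be fixed before training, which is precisely the first limitation discussed in the text.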