A capsule-unified framework of deep neural networks for graphical programming
METHODOLOGIES AND APPLICATION
Yujian Li 1,2 · Chuanhui Shan 2 · Houjun Li 3 · Jun Ou 2
© Springer-Verlag GmbH Germany, part of Springer Nature 2020
Abstract

Recently, the growth of deep learning has produced a large number of deep neural networks, and describing these networks in a unified way is becoming an important issue. To distinguish them from capsule networks, we first formalize neuronal (plain) networks with a mathematical definition, give their representational graphs, and prove a generation theorem about the induced networks of these graphs. We then extend plain networks to capsule networks and set up a capsule-unified framework for deep learning, including a mathematical definition of capsules, an induced model for capsule networks, and a universal backpropagation algorithm for training them. Moreover, we present a set of standard graphical symbols for capsules, neurons, and connections so that the framework can be applied to graphical programming. Finally, we design and implement a demo platform that demonstrates the practicability of graphically programming deep neural networks in mouse-click drawing experiments.

Keywords Unified framework · Deep neural network · Connected directed acyclic graph · Generation theorem · Capsule network · Universal backpropagation · Graphical programming
Communicated by V. Loia.

Electronic supplementary material The online version of this article (https://doi.org/10.1007/s00500-020-05412-7) contains supplementary material, which is available to authorized users.

Corresponding author: Jun Ou, [email protected]

1 School of Artificial Intelligence, Guilin University of Electronic Technology, Guilin 541004, Guangxi, China
2 College of Computer Science, Faculty of Information Technology, Beijing University of Technology, Beijing 100124, China
3 School of Computer Science and Communication Engineering, Guangxi University of Science and Technology, Liuzhou 545006, Guangxi, China

1 Introduction

Artificial neural networks are mathematical models inspired by biological neural networks. As a new era of neural networks, deep learning has become a powerful technology for artificial intelligence. Since it was introduced by Hinton et al. [1], it has achieved great success in image classification, speech recognition, and natural language processing [2–5], with a dramatic influence on both academia and industry. Essentially, deep learning is a collection of methods for effectively training neural networks with deep structures. A neural network is usually regarded as a hierarchical system composed of many nonlinear computing units (also called neurons or nodes). For example, a single-neuron network is the MP model proposed by McCulloch and Pitts in 1943 [6], which can perform logical operations. Although the MP model is unable to learn, it began the age of neural networks. In 1949, Hebb first proposed the idea of learning in biological neural networks [7]. In 1958, Rosenblatt invented the perceptron.
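The MP model mentioned above can be stated concretely. A minimal Python sketch of a McCulloch–Pitts threshold unit is given below; the function name, weights, and thresholds are illustrative choices, not notation from this paper, and serve only to show how such a unit realizes logical operations with fixed (non-learned) parameters.

```python
# Sketch of a McCulloch-Pitts (MP) threshold unit (illustrative, not the
# paper's notation). Inputs and output are binary; the unit fires
# (outputs 1) when the weighted sum of its inputs reaches a threshold.

def mp_neuron(inputs, weights, threshold):
    """Return 1 if the weighted input sum meets the threshold, else 0."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# Logical AND: both inputs must be active to reach the threshold of 2.
AND = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 2)
# Logical OR: a single active input reaches the threshold of 1.
OR = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 1)
# Logical NOT: an inhibitory (negative) weight with threshold 0.
NOT = lambda x: mp_neuron([x], [-1], 0)

print(AND(1, 1), OR(0, 1), NOT(1))  # → 1 1 0
```

Because the weights and thresholds are fixed by hand, the unit computes but cannot learn, which is exactly the limitation of the MP model noted above.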