Some Aspects of Associative Memory Construction Based on a Hopfield Network
Yu. L. Karpov^a,*, L. E. Karpov^b,c,**, and Yu. G. Smetanin^d,e,***†

^a Luxoft Professional LLC, 1-i Volokolamskii proezd 10, Moscow, 123060 Russia
^b Ivannikov Institute for System Programming, Russian Academy of Sciences, ul. Solzhenitsyna 25, Moscow, 109004 Russia
^c Moscow State University, Moscow, 119991 Russia
^d Federal Research Center Computer Science and Control, Russian Academy of Sciences, ul. Vavilova 44/2, Moscow, 119333 Russia
^e Moscow Institute of Physics and Technology, Institutskii per. 9, Dolgoprudnyi, Moscow oblast, 141701 Russia

*e-mail: [email protected]
**e-mail: [email protected]
***e-mail: [email protected]

Received March 5, 2020; revised March 25, 2020; accepted April 11, 2020
After the publication of this paper, the inspirer of our work, Yurii Gennad'evich Smetanin, passed away untimely. He did not abandon his work until the very end, discussing plans for future research and experiments by telephone from the hospital. The bright memory of our friend, a true scientist and a very good person, will remain with us forever.

Abstract—An implementation of associative memory based on a Hopfield network is described. In the proposed approach, memory addresses are regarded as training vectors of the artificial neural network. The efficiency of memory search is directly associated with solving the overfitting problem. A method for dividing the training and input network vectors into parts, the processing of which requires a smaller number of neurons, is proposed. Results of a series of experiments conducted on Hopfield network models with different numbers of neurons, trained with different numbers of vectors and operated under different noise conditions, are presented.

DOI: 10.1134/S0361768820050023
1. INTRODUCTION

Software models of artificial neural networks are extremely interesting objects of research. On the one hand, these models are software systems whose design, development, and operation obey all the laws, rules, and standards that other software systems conform to. On the other hand, a specific feature of these models is that the structural organization of their numerous components is of the highest importance, perhaps more so than in any other class of systems. Since each individual component of an artificial neural network is itself quite a complex computing system, the large number of components in a network and the variety of methods for organizing their interaction make the problems of component selection and subsequent testing extremely important and difficult to solve.

† Deceased.
We have already drawn attention to the problem of organizing the structural testing of artificial neural networks [1–3]. We consider the Hopfield model, a specific model of a recurrent artificial neural network [4], i.e., a network of n neurons with a complete set of symmetric connections and the update rule

x_i(t + 1) = sgn( Σ_{j=1}^{n} w_ij x_j(t) ),

where n is the number of neurons and w_ij is the weight of the connection between neurons i and j, w
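As an illustration of the update rule above, the following is a minimal sketch of a Hopfield network in Python/NumPy. It is not the authors' implementation; the Hebbian training step and the convention sgn(0) = +1 are standard choices assumed here for concreteness, with states taken from {-1, +1}.

```python
import numpy as np

def train_hebbian(patterns):
    """Hebbian rule: w_ij = (1/n) * sum_over_patterns x_i * x_j, with w_ii = 0.

    `patterns` is an (m, n) array of m training vectors with entries in {-1, +1}.
    """
    n = patterns.shape[1]
    w = patterns.T @ patterns / n   # symmetric weight matrix, w_ij = w_ji
    np.fill_diagonal(w, 0.0)        # no self-connections
    return w

def update(w, x):
    """One synchronous step: x_i(t+1) = sgn(sum_j w_ij * x_j(t)).

    sgn(0) is mapped to +1 so that states stay in {-1, +1}.
    """
    s = w @ x
    return np.where(s >= 0, 1, -1)

if __name__ == "__main__":
    # Store one pattern and recall it from a noisy copy (one flipped bit).
    p = np.array([1, -1, 1, 1, -1, -1, 1, -1])
    w = train_hebbian(p.reshape(1, -1))
    noisy = p.copy()
    noisy[0] = -noisy[0]
    print(update(w, noisy))  # recovers p in a single step
```

In this toy run, a single update step suffices because only one stored pattern is present; with more stored patterns the update is iterated until the state stops changing, and, as the abstract notes, recall quality degrades as the number of training vectors approaches the network's capacity.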