Probabilistically segregated neural circuits and subcritical linguistics
RESEARCH ARTICLE
Yoram Baram¹

Received: 25 November 2019 / Revised: 27 April 2020 / Accepted: 19 May 2020

© Springer Nature B.V. 2020
Abstract

Early studies of cortical information codes and memory capacity have assumed large neural networks, which, subject to evenly probable binary (on/off) activity, were found to be endowed with large storage and retrieval capacities under the Hebbian paradigm. Here, we show that such networks are plagued with exceedingly high cross-network connectivity, yielding long code words, which are linguistically non-realistic and difficult to memorize and comprehend. Noting that the neural circuit activity code is jointly governed by somatic and synaptic activity states, termed neural circuit polarities, we show that, subject to subcritical polarity probability, random-graph-theoretic considerations imply small neural circuit segregation. Such circuits are shown to represent linguistically plausible cortical code words which, in turn, facilitate storage and retrieval of both circuit connectivity and firing-rate dynamics.

Keywords: Cortical linguistics · Neural circuit polarity · Random graphs · Subcritical cortical connectivity · Segregated neural circuits · Subcritical Hebbian capacity · Firing-rate dynamics
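The random-graph argument invoked in the abstract can be illustrated with a minimal sketch (not the author's model): in an Erdős–Rényi graph G(n, p) with edge probability p = c/n, a subcritical parameter c < 1 yields only small, segregated components, whereas c > 1 yields one giant component spanning a constant fraction of the nodes. The function name and parameters below are hypothetical, chosen for this illustration.

```python
# Illustrative sketch, assuming an Erdős–Rényi model G(n, p) with p = c/n:
# subcritical (c < 1) connectivity produces only small, segregated
# components, while supercritical (c > 1) connectivity produces one
# giant component. Uses only the Python standard library.
import random


def largest_component(n, c, seed=0):
    """Return the size of the largest connected component of G(n, c/n)."""
    rng = random.Random(seed)
    p = c / n
    adj = [[] for _ in range(n)]
    # Sample each of the n*(n-1)/2 possible edges independently.
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:
                adj[i].append(j)
                adj[j].append(i)
    # Depth-first search over all components, tracking the largest.
    seen = [False] * n
    best = 0
    for s in range(n):
        if seen[s]:
            continue
        stack, size = [s], 0
        seen[s] = True
        while stack:
            v = stack.pop()
            size += 1
            for w in adj[v]:
                if not seen[w]:
                    seen[w] = True
                    stack.append(w)
        best = max(best, size)
    return best


n = 1000
sub = largest_component(n, 0.5)   # subcritical: small, segregated circuits
sup = largest_component(n, 1.5)   # supercritical: one giant connected circuit
print(sub, sup)
```

The contrast between the two regimes mirrors the paper's claim: subcritical connection probability keeps circuits small (short "code words"), while supercritical probability merges nearly everything into a single highly connected network.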
Introduction

Any representation of information involves a language. Natural languages, such as English, consist of elementary alphabets, comprising letters, which connect into words. Words, being the smallest embodiment of meaning, are normally short, consisting of two to about ten letters. Numbering in the millions and normally grouped into sentences, paragraphs, sections, chapters, articles and whole books, words carry the entire burden of linguistic information. Without such structures, information would be difficult to comprehend or memorize. Computer languages have conceptually similar structures. Their lowest-level alphabet consists of (0, 1) bits. Computer words normally consist of 2–32 bits. Yet, computer programs, consisting of "commands", can be arbitrarily long. Both natural and artificial manifestations of information extend beyond the formal linguistic domain, employing vision, hearing, smell, touch and motion in the generation and memorization of information. Highly inspiring progress has been made in
Yoram Baram, [email protected]

¹ Computer Science Department, Technion – Israel Institute of Technology, 32000 Haifa, Israel
the formal conceptualization of cognitive functions, including behavioral decision making (Wei et al. 2017), communicational behaviour linkage to neural dynamics (Bonzon 2017), feeling of understanding (Mizraji and Lin 2017), multisensory learning (Rao 2018) and bilingual language control (Tong et al. 2019). Early studies of cortical information codes and memory capacity (McEliece et al. 1987; Amit et al. 1987; Baram and Sal’ee 1992) have pertained to a particular, largely simplified and unified binary (on/off) neural ne