ARTICLE, Communicated by Terrence Sejnowski
Computing with a Canonical Neural Circuits Model with Pool Normalization and Modulating Feedback
Abstract
Evidence suggests that the brain uses an operational set of canonical computations like normalization, input filtering, and response gain enhancement via reentrant feedback. Here, we propose a three-stage columnar architecture of cascaded model neurons to describe a core circuit combining signal pathways of feedforward and feedback processing and the inhibitory pooling of neurons to normalize the activity. We present an analytical investigation of such a circuit by first reducing its detail through the lumping of initial feedforward response filtering and reentrant modulating signal amplification. The resulting excitatory-inhibitory pair of neurons is analyzed in a 2D phase-space. The inhibitory pool activation is treated as a separate mechanism exhibiting different effects. We analyze subtractive as well as divisive (shunting) interaction to implement center-surround mechanisms that include normalization effects in the characteristics of real neurons. Different variants of a core model architecture are derived and analyzed, in particular individual excita-
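The subtractive versus divisive (shunting) pool interactions this abstract contrasts can be sketched as a minimal steady-state computation. The function below is an illustrative toy, not the paper's actual circuit equations; the parameters `k` and `sigma` and the function name are assumptions for this sketch.

```python
def normalized_response(drive, mode="divisive", k=0.5, sigma=0.1):
    """Steady-state responses of a unit population under pooled inhibition.

    drive : list of feedforward inputs, one per excitatory unit
    mode  : 'divisive' (shunting) or 'subtractive' pool interaction
    k, sigma : illustrative inhibition-strength / semi-saturation constants
    """
    pool = sum(drive)  # total input drives the shared inhibitory pool
    if mode == "divisive":
        # shunting inhibition divides each response by the pool activity,
        # bounding the population response while preserving response ratios
        return [d / (sigma + pool) for d in drive]
    # subtractive inhibition shifts every response down, rectified at zero
    return [max(d - k * pool, 0.0) for d in drive]

x = [1.0, 0.5, 0.25]
print(normalized_response(x, "divisive"))
print(normalized_response(x, "subtractive"))
```

The qualitative difference the abstract analyzes shows up directly: divisive inhibition rescales all responses, keeping their relative ratios, while subtractive inhibition can silence the weaker units entirely.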
Formal Aspects of Computing
Abstract
Two important issues in computational modelling in cognitive neuroscience are: first, how to formally describe neuronal networks (i.e. biologically plausible models of the central nervous system), and second, how to analyse complex models, in particular their dynamics and capacity to learn. We make progress towards these goals by presenting a communicating automata perspective on neuronal networks. Specifically, we describe neuronal networks and their biological mechanisms using Data-rich Communicating Automata, which extend classic automata theory with rich data types and communication. We use two case studies to illustrate our approach. In the first case study, we model a number of learning frameworks, which vary in respect of their biological detail, for instance the Backpropagation (BP) and the Generalized Recirculation (GeneRec) learning algorithms. We then use the SPIN model checker to investigate a number of behavioral properties of the neural learning algorithms. SPIN is a well-known model checker for reactive distributed systems, which has been successfully applied to many non-trivial problems. The verification results show that the biologically plausible GeneRec learning is less stable than BP learning. In the second case study, we present a large-scale (cognitive-level) neuronal network, which models an attentional spotlight mechanism in the visual system. A set of properties of this model was verified using Uppaal, a popular real-time model checker. The results show that the asynchronous processing supported by concurrency theory is not only a more biologically plausible way to model neural systems, but also
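The GeneRec algorithm the first case study refers to uses a two-phase, activation-difference weight update (O'Reilly's Generalized Recirculation rule). A minimal sketch of that update follows; the learning rate, list-of-lists weight layout, and function name are illustrative assumptions, not this paper's formalization.

```python
def generec_update(w, pre_minus, post_plus, post_minus, lr=0.1):
    """One GeneRec weight update: dw_ij = lr * pre_i(-) * (post_j(+) - post_j(-)).

    pre_minus  : presynaptic activities in the minus (expectation) phase
    post_plus  : postsynaptic activities in the plus (outcome) phase
    post_minus : postsynaptic activities in the minus phase
    w          : weight matrix as a list of rows, w[i][j] from unit i to j
    """
    for i, pre in enumerate(pre_minus):
        for j, (ap, am) in enumerate(zip(post_plus, post_minus)):
            # weights move in proportion to the plus/minus activation difference
            w[i][j] += lr * pre * (ap - am)
    return w

w = [[0.0, 0.0]]
generec_update(w, pre_minus=[1.0], post_plus=[0.7, 0.3], post_minus=[0.2, 0.4])
print(w)
```

Because the update depends on the difference between two settling phases rather than an explicitly propagated error signal, it is considered more biologically plausible than BP, which is exactly the trade-off (plausibility versus stability) the verification results above probe.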
A Unified Quantitative Model of Vision and Audition
Abstract
We have put forward a unified quantitative framework of vision and audition, based on existing data and theories. According to this model, the retina is a feedforward network that is self-adaptive to inputs during a specific period. Once fully grown, cells become specialized detectors based on the statistics of stimulus history. This model provides explanations for the perception mechanisms of colour, shape, depth and motion. Moreover, on this ground we put forward a bold conjecture that a single ear can detect a sound's direction. This is complementary to existing theories and provides better explanations for sound localization. Great strides have been made in vision and audition research (1-6). Nevertheless, there remain substantial gaps between physiological evidence and existing models (6-11). We wish to put forward a unified quantitative model on the cell scale. It provides explanations for possible biological perception mechanisms of shape (8), colour
unknown title
Abstract
Based on existing data, we wish to put forward a biological model of the motor system on the neuron scale. We then indicate its implications for statistics and learning. Specifically, a neuron's firing frequency and synaptic strength are, in essence, probability estimates, and the lateral inhibition also has statistical implications. From the standpoint of learning, dendritic competition through retrograde messengers is the foundation of the conditioned reflex and "grandmother cell" coding, and these are the kernel mechanisms of motor learning and sensory-motor integration, respectively. Finally, we compare the motor system with the sensory system. In short, we would like to bridge the gap between molecular evidence and computational models. Main Text: Great strides have been made in the research of motor learning (1-5). Until now, however, there still exists a gap between existing models and physiological data (6-8). Control models such as the internal forward models (1, 2) and optimal control models (3) are mainly on the module scale. Learning models such as CMAC are difficult to implement biologically (4, 5). Based on existing data and theories, we wish to put forward a quantitative motor learning model on the neuron scale. Inspired by the "self-organization" idea (9), we only make local rules about neurons and synapses, and the neural network emerges automatically. Moreover, both excitatory and inhibitory neurons share the same framework, differing only in details. Motor neurons in this paper include those in the cerebellum, DCN (deep cerebellar nuclei) and basal ganglia, but exclude pyramidal cells in the cerebral motor area. All c_i in this paper are constants, and they have different meanings in different paragraphs. Information about sensory memory can be found in our previous work (10). The motor neuron model is as follows (N):
2014
Abstract
Where's Waldo? How perceptual, cognitive, and emotional brain processes cooperate during learning to categorize and find desired objects in a cluttered scene
Hung-Cheng Chang, Stephen Grossberg and Yongqiang Cao
unknown title
2014
Abstract
Copyright and moral rights for this thesis are retained by the author. A copy can be downloaded for personal non-commercial research or
rstb.royalsocietypublishing.org
Abstract
Regulation of rhythm genesis by volume-limited, astroglia-like signals in neural
unknown title
Abstract
The coding mechanism of sensory memory on the neuron scale is one of the most important questions in neuroscience. We have put forward a quantitative neural network model, which is self-organized, self-similar, and self-adaptive, just like an ecosystem following Darwin's theory. According to this model, neural coding is a "multi-to-one" mapping from objects to neurons, and the whole cerebrum is a real-time statistical Turing machine with powerful representing and learning ability. This model can reconcile some important disputes, such as temporal coding versus rate-based coding, grandmother-cell versus population coding, and decay theory versus interference theory. It also provides explanations for some key questions such as memory consolidation, episodic memory, consciousness, and sentiment. Philosophical significance is indicated at the end. Main Text: Great strides have been made in neuroscience and cognitive science (1-8). Until now, however, the coding mechanism of sensory memory on the neuron scale is still unclear. A gap exists between molecular and whole-brain research (9). We wish to bridge this gap through a quantitative coding model. Inspired by the "self-organization" idea (10, 11), we only make local rules about neurons and synapses based on existing data and theories. Then the hierarchical neural