Results 1–10 of 191
Spatiotemporal correlations and visual signaling in a complete neuronal population, 2008

Correcting for the sampling bias problem in spike train information measures. J Neurophysiol
Cited by 60 (9 self)

A new look at state-space models for neural data. Journal of Computational Neuroscience, 2010
Cited by 53 (25 self)
Abstract: State-space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on approximations that are not always accurate. Here we review direct optimization methods that avoid these approximations but nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property being exploited is the bandedness of certain matrices. We close by discussing applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially varying firing rates.
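
The bandedness point in this abstract is easy to illustrate: for the simplest Gaussian random-walk model with Gaussian observations, MAP smoothing reduces to a single tridiagonal solve, costing O(T) rather than O(T^3). A minimal sketch under those assumptions (function names are ours, not the authors' code):

```python
def thomas_solve(lower, diag, upper, rhs):
    """Solve a tridiagonal linear system in O(T) (Thomas algorithm).
    lower[i], diag[i], upper[i] are the sub-, main- and super-diagonal
    entries of row i (lower[0] and upper[-1] are unused)."""
    n = len(diag)
    c = [0.0] * n          # modified superdiagonal
    d = [0.0] * n          # modified right-hand side
    c[0] = upper[0] / diag[0]
    d[0] = rhs[0] / diag[0]
    for i in range(1, n):
        m = diag[i] - lower[i] * c[i - 1]
        c[i] = upper[i] / m if i < n - 1 else 0.0
        d[i] = (rhs[i] - lower[i] * d[i - 1]) / m
    x = d[:]
    for i in range(n - 2, -1, -1):   # back-substitution
        x[i] -= c[i] * x[i + 1]
    return x

def map_smooth(y, obs_var=1.0, drift_var=1.0):
    """MAP state estimate for x_t = x_{t-1} + drift noise, y_t = x_t + obs
    noise. The posterior log-density is quadratic with a tridiagonal
    Hessian, so the whole smoother is one banded solve."""
    n = len(y)
    r, q = 1.0 / obs_var, 1.0 / drift_var
    diag = [r + q * ((i > 0) + (i < n - 1)) for i in range(n)]
    lower = [0.0] + [-q] * (n - 1)
    upper = [-q] * (n - 1) + [0.0]
    rhs = [r * v for v in y]
    return thomas_solve(lower, diag, upper, rhs)
```

For non-Gaussian (e.g. point-process) observations, each Newton step of the direct optimization has this same banded structure, which is what keeps the method O(T).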
Learning horizontal connections in a sparse coding model of natural images. In: Advances in Neural Information Processing Systems (NIPS), 2008
Cited by 33 (2 self)
Abstract: It has been shown that adapting a dictionary of basis functions to the statistics of natural images so as to maximize sparsity in the coefficients results in a set of dictionary elements whose spatial properties resemble those of V1 (primary visual cortex) receptive fields. However, the resulting sparse coefficients still exhibit pronounced statistical dependencies, thus violating the independence assumption of the sparse coding model. Here, we propose a model that attempts to capture the dependencies among the basis function coefficients by including a pairwise coupling term in the prior over the coefficient activity states. When adapted to the statistics of natural images, the coupling terms learn a combination of facilitatory and inhibitory interactions among neighboring basis functions. These learned interactions may offer an explanation for the function of horizontal connections in V1 in terms of a prior over natural images.
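
A pairwise coupling term over binary coefficient activity states, as described above, is straightforward to sample with Gibbs updates. The sketch below is a generic illustration of such a coupled prior (parameter names and values are ours, not the paper's):

```python
import math
import random

def gibbs_sweep(s, bias, W, rng):
    """One Gibbs sweep over binary activity states s_i in {0, 1} under
    P(s) proportional to exp(sum_i bias_i s_i + sum_{i<j} W_ij s_i s_j),
    with W symmetric and zero on the diagonal. Facilitatory couplings
    (W_ij > 0) make units co-activate; inhibitory ones (W_ij < 0)
    suppress co-activation."""
    n = len(s)
    for i in range(n):
        field = bias[i] + sum(W[i][j] * s[j] for j in range(n) if j != i)
        p_on = 1.0 / (1.0 + math.exp(-field))  # conditional P(s_i = 1 | rest)
        s[i] = 1 if rng.random() < p_on else 0
    return s
```

In the paper's setting the coupling matrix is learned from natural images; here it is simply a free parameter of the toy sampler.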
Constraint satisfaction problems and neural networks: statistical physics approach. J. Physiol. Paris, 2009
Cited by 27 (0 self)
Abstract: A new field of research is rapidly expanding at the crossroads of statistical physics, information theory, and combinatorial optimization. In particular, cutting-edge statistical physics concepts and methods allow one to solve very large constraint satisfaction problems such as random satisfiability, coloring, or error correction. Several aspects of these developments should be relevant to the understanding of functional complexity in neural networks. On the one hand, the message-passing procedures used in these new algorithms are based on local exchange of information and succeed in solving some of the hardest computational problems. On the other hand, some crucial inference problems in neurobiology, like those arising in multielectrode recordings, naturally translate into hard constraint satisfaction problems. This paper gives a non-technical introduction to this field, emphasizing the main ideas at work in message-passing strategies and their possible relevance to neural network modelling. It also introduces a new message-passing algorithm for inferring interactions between variables from correlation data, which could be useful in the analysis of multielectrode recording data.
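
The flavor of these message-passing procedures can be conveyed by the simplest case: on a chain (a tree), passing local messages forward and backward computes exact marginals of a pairwise Ising-type model in linear time, where naive enumeration is exponential. A minimal sketch (our own toy example, not the algorithm introduced in the paper):

```python
import math
from itertools import product

def chain_marginals_bp(h, J):
    """P(s) ∝ exp(sum_i h_i s_i + sum_i J_i s_i s_{i+1}), s_i in {-1, +1}.
    Returns P(s_i = +1) for each i via forward/backward message passing."""
    n = len(h)
    fwd = [{-1: 1.0, +1: 1.0} for _ in range(n)]   # messages from the left
    for i in range(1, n):
        for s in (-1, 1):
            fwd[i][s] = sum(fwd[i - 1][t] * math.exp(h[i - 1] * t + J[i - 1] * t * s)
                            for t in (-1, 1))
    bwd = [{-1: 1.0, +1: 1.0} for _ in range(n)]   # messages from the right
    for i in range(n - 2, -1, -1):
        for s in (-1, 1):
            bwd[i][s] = sum(bwd[i + 1][t] * math.exp(h[i + 1] * t + J[i] * s * t)
                            for t in (-1, 1))
    out = []
    for i in range(n):
        w = {s: fwd[i][s] * math.exp(h[i] * s) * bwd[i][s] for s in (-1, 1)}
        out.append(w[1] / (w[1] + w[-1]))
    return out

def chain_marginals_exact(h, J):
    """Brute-force marginals by summing over all 2^n states (check only)."""
    n = len(h)
    z, p1 = 0.0, [0.0] * n
    for s in product((-1, 1), repeat=n):
        w = math.exp(sum(h[i] * s[i] for i in range(n)) +
                     sum(J[i] * s[i] * s[i + 1] for i in range(n - 1)))
        z += w
        for i in range(n):
            if s[i] == 1:
                p1[i] += w
    return [p / z for p in p1]
```

On graphs with loops the same local updates are no longer exact, which is where the statistical-physics analysis discussed in the paper comes in.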
Correlations and population dynamics in cortical networks. Neural Comput, 2008
Cited by 26 (13 self)
Abstract: The function of cortical networks depends on the collective interplay between neurons and neuronal populations, which is reflected in the correlation of signals that can be recorded at different levels. To correctly interpret these observations it is important to understand the origin of neuronal correlations. Here we study how cells in large recurrent networks of excitatory and inhibitory neurons interact and how the associated correlations affect stationary states of idle network activity. We demonstrate that the structure of the connectivity matrix of such networks induces considerable correlations between synaptic currents as well as between subthreshold membrane potentials, provided Dale's principle is respected. If, in contrast, synaptic weights are randomly distributed, input correlations can vanish, even for densely connected networks. Although correlations are strongly attenuated when proceeding from membrane potentials to action potentials (spikes), the resulting weak correlations in the spike output can cause substantial fluctuations in the population activity, even in highly diluted networks. We show that simple mean-field models that take the structure of the coupling matrix into account can adequately describe the power spectra of the population activity. The consequences of Dale's principle for correlations and rate fluctuations are discussed in the light of recent experimental findings.
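
The role of Dale's principle described here can be caricatured with a shared-input toy model: if two cells receive the same presynaptic population and each presynaptic cell has a fixed sign for all its targets, the summed input currents are strongly correlated; if signs are instead drawn independently per target (violating Dale's principle), the correlation largely cancels. A rough sketch (our own simplification, not the paper's recurrent network model):

```python
import random

def pearson(x, y):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

def shared_input_correlation(dale, n_pre=200, n_samples=1000, seed=1):
    """Correlation of the summed synaptic currents into two neurons that
    share all n_pre presynaptic partners. dale=True: each presynaptic
    cell is excitatory or inhibitory for BOTH targets (signs shared,
    magnitudes independent); dale=False: signs drawn independently per
    target, so positive and negative contributions cancel on average."""
    rng = random.Random(seed)
    sign1 = [rng.choice((1.0, -1.0)) for _ in range(n_pre)]
    sign2 = sign1 if dale else [rng.choice((1.0, -1.0)) for _ in range(n_pre)]
    w1 = [s * abs(rng.gauss(0.0, 1.0)) for s in sign1]
    w2 = [s * abs(rng.gauss(0.0, 1.0)) for s in sign2]
    i1, i2 = [], []
    for _ in range(n_samples):
        x = [rng.gauss(0.0, 1.0) for _ in range(n_pre)]  # presynaptic activity
        i1.append(sum(w * a for w, a in zip(w1, x)))
        i2.append(sum(w * a for w, a in zip(w2, x)))
    return pearson(i1, i2)
```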
Approaches to Information-Theoretic Analysis of Neural Activity. Biol Theory, 2006
Cited by 23 (1 self)
Abstract: Understanding how neurons represent, process, and manipulate information is one of the main goals of neuroscience. These issues are fundamentally abstract, and information theory plays a key role in formalizing and addressing them. However, application of information theory to experimental data is fraught with many challenges. Meeting these challenges has led to a variety of innovative analytical techniques, with complementary domains of applicability, assumptions, and goals.
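
One of the best-known challenges alluded to above is the downward bias of plug-in entropy estimates from limited data; a classic first-order remedy is the Miller-Madow correction. A minimal sketch (the generic correction, not a method from this particular paper):

```python
import math
from collections import Counter

def entropy_plugin(samples):
    """Naive plug-in entropy estimate in bits; biased downward when the
    number of samples is small relative to the alphabet size."""
    n = len(samples)
    return -sum((c / n) * math.log2(c / n) for c in Counter(samples).values())

def entropy_miller_madow(samples):
    """Plug-in estimate plus the first-order Miller-Madow bias correction,
    (K - 1) / (2 N ln 2) bits, where K is the number of observed symbols
    and N the number of samples."""
    k = len(set(samples))
    n = len(samples)
    return entropy_plugin(samples) + (k - 1) / (2 * n * math.log(2))
```

More sophisticated estimators exist for the severely undersampled regime; this one is merely the simplest illustration of the bias problem and its correction.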
A small world of neuronal synchrony. Cereb. Cortex, 2008
Cited by 23 (3 self)
Abstract: A small-world network has been suggested to be an efficient solution for achieving both modular and global processing, a property highly desirable for brain computations. Here, we investigated functional networks of cortical neurons using correlation analysis to identify functional connectivity. To reconstruct the interaction network, we applied the Ising model based on the principle of maximum entropy. This allowed us to assess the interactions by measuring pairwise correlations and to assess the strength of coupling from the degree of synchrony. Visual responses were recorded in the visual cortex of anesthetized cats, simultaneously from up to 24 neurons. First, pairwise correlations captured most of the patterns in the population's activity and therefore provided a reliable basis for the reconstruction of the interaction networks. Second, and most importantly, the resulting networks had small-world properties: the average path lengths were as short as in simulated random networks, but the clustering coefficients were larger. Neurons differed considerably with respect to the number and strength of interactions, suggesting the existence of "hubs" in the network. Notably, there was no evidence for scale-free properties. These results suggest that cortical networks are optimized for the coexistence of local and global computations: feature detection and feature integration or binding.
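
The two quantities behind the small-world claim, average shortest path length and clustering coefficient, are simple to compute from an adjacency structure. A self-contained sketch (generic graph metrics, not the authors' analysis pipeline):

```python
from collections import deque

def clustering_coefficient(adj):
    """Mean local clustering coefficient of an undirected graph given as
    {node: set_of_neighbors}: for each node, the fraction of its neighbor
    pairs that are themselves connected, averaged over nodes of degree >= 2."""
    total, counted = 0.0, 0
    for v, nbrs in adj.items():
        k = len(nbrs)
        if k < 2:
            continue
        nb = sorted(nbrs)
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nb[j] in adj[nb[i]])
        total += 2.0 * links / (k * (k - 1))
        counted += 1
    return total / counted if counted else 0.0

def average_path_length(adj):
    """Mean shortest-path length over all ordered node pairs, via one BFS
    per source node; assumes the graph is connected."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    q.append(w)
        total += sum(d for node, d in dist.items() if node != s)
        pairs += len(dist) - 1
    return total / pairs
```

A small-world graph is one whose average path length is close to that of a degree-matched random graph while its clustering coefficient is much larger.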
Thalamic synchrony and the adaptive gating of information flow to cortex. Nat. Neurosci, 2010
Cited by 17 (3 self)
Abstract: Although it has long been posited that sensory adaptation serves to enhance information flow in sensory pathways, the neural basis remains elusive. Simultaneous single-unit recordings in the thalamus and cortex of anesthetized rats reveal that adaptation differentially influences thalamus and cortex in a manner that fundamentally changes the nature of the information conveyed about vibrissa motion. With an ideal observer of cortical activity, performance in detecting vibrissa deflections degrades with adaptation, while performance in discriminating between vibrissa deflections of different velocities is enhanced, a trend not observed in thalamus. Analysis of simultaneously recorded thalamic neurons does reveal, however, an analogous adaptive change in thalamic synchrony that mirrors the cortical response. An integrate-and-fire model using experimentally measured thalamic input reproduces the observed transformations. These results suggest a shift in coding strategy with adaptation that directly controls the information relayed to cortex, which could have implications for encoding the velocity signatures of textures.
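
The synchrony effect described here can be caricatured with a leaky integrate-and-fire unit: the same number of input spikes drives the cell far more effectively when they arrive in synchronous volleys than when they are spread evenly in time. A deliberately simplified, event-driven sketch (our own parameters, not the paper's experimentally fitted model):

```python
import math

def lif_spike_count(input_times, tau=20.0, v_thresh=1.0, w=0.15):
    """Event-driven leaky integrate-and-fire neuron. Each input spike at
    time t (ms) increments the membrane potential by w after the potential
    has decayed exponentially with time constant tau since the previous
    input; crossing v_thresh emits an output spike and resets v to 0."""
    v, t_last, spikes = 0.0, 0.0, 0
    for t in sorted(input_times):
        v *= math.exp(-(t - t_last) / tau)  # leak between input events
        t_last = t
        v += w
        if v >= v_thresh:
            spikes += 1
            v = 0.0
    return spikes
```

With these parameters a volley of ten coincident inputs crosses threshold, while the same fifty inputs spread at 10 ms intervals decay away between arrivals and never do, illustrating how thalamic synchrony can gate what reaches cortex.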