A new look at state-space models for neural data
 Journal of Computational Neuroscience, 2010
Cited by 53 (25 self)
State-space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space setting at all; instead, the key property we are exploiting involves the bandedness of certain matrices. We close by discussing some applications of this more general point of view, including Markov chain Monte Carlo methods for neural decoding and efficient estimation of spatially-varying firing rates.
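The bandedness point can be illustrated with a minimal sketch. Everything below is an illustrative assumption rather than the paper's actual model: an AR(1) latent state, Poisson spike counts through an exponential nonlinearity, and arbitrary parameter values. The payoff is that the log-posterior's Hessian is tridiagonal (banded), so each Newton step toward the MAP state path costs O(T) via a banded solver instead of O(T³) for a dense one.

```python
import numpy as np
from scipy.linalg import solveh_banded

rng = np.random.default_rng(0)
T, a, q = 200, 0.95, 0.1          # length, AR coefficient, innovation variance
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = a * x_true[t - 1] + np.sqrt(q) * rng.standard_normal()
y = rng.poisson(np.exp(x_true)).astype(float)   # spike counts

# Tridiagonal prior precision of the AR(1) path (x_0 ~ N(0, q) for simplicity)
diag = np.full(T, (1 + a**2) / q)
diag[-1] = 1 / q
off = np.full(T - 1, -a / q)

x = np.zeros(T)                    # MAP estimate of the latent path
for _ in range(50):                # Newton iterations on the log-posterior
    lam = np.exp(x)
    # gradient of the negative log-posterior: (prior precision) @ x
    # minus the Poisson observation term (y - lam)
    Px = diag * x
    Px[:-1] += off * x[1:]
    Px[1:] += off * x[:-1]
    grad = Px - (y - lam)
    # Hessian = prior precision + diag(lam); tridiagonal, stored in the
    # upper "ab" banded form that solveh_banded expects
    ab = np.zeros((2, T))
    ab[0, 1:] = off
    ab[1] = diag + lam
    step = solveh_banded(ab, grad)  # O(T) banded solve
    x -= step
    if np.max(np.abs(step)) < 1e-10:
        break
```

Because the log-posterior is strictly concave here, Newton's method converges to the global MAP path; the same banded structure is what makes the direct-optimization approach as cheap as the approximate recursive filters.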
Model-based decoding, information estimation, and change-point detection in multi-neuron spike trains
 Under review, Neural Computation, 2007
Cited by 38 (18 self)
Understanding how stimulus information is encoded in spike trains is a central problem in computational neuroscience. Decoding methods provide an important tool for addressing this problem, by allowing us to explicitly read out the information contained in spike responses. Here we introduce several decoding methods based on point-process neural encoding models (i.e. "forward" models that predict spike responses to novel stimuli). These models have concave log-likelihood functions, allowing for efficient fitting via maximum likelihood. Moreover, we may use the likelihood of the observed spike trains under the model to perform optimal decoding. We present: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus — the most probable stimulus to have generated the observed single- or multiple-spike train response, given some prior distribution over the stimulus; (2) a Gaussian approximation to the posterior distribution, which allows us to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the response; and (4) a framework for the detection of change-point times (e.g. the time at which the stimulus undergoes a change in mean or variance), by marginalizing over the posterior distribution of stimuli. We show several examples illustrating the performance of these estimators with simulated data.
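The MAP decoding idea in item (1) can be sketched in a few lines. The model below is a hypothetical stand-in, not the paper's: a single neuron with known encoding parameters `k`, `b` and rate exp(k·s_t + b), a standard-normal prior on the stimulus, and a generic convex solver in place of the paper's tractable algorithm. The key property carries over: the Poisson log-likelihood is concave in the stimulus, so the MAP estimate is a global optimum.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T, k, b = 100, 1.5, 0.2            # assumed (known) encoding parameters
s_true = rng.standard_normal(T)    # stimulus drawn from the N(0, 1) prior
y = rng.poisson(np.exp(k * s_true + b))   # observed spike counts

def neg_log_posterior(s):
    eta = k * s + b
    # Poisson log-likelihood up to a constant (y*eta - exp(eta)),
    # plus the Gaussian log-prior (-0.5 * s^2); negated for minimization
    return -(np.sum(y * eta - np.exp(eta)) - 0.5 * np.sum(s**2))

def grad(s):
    eta = k * s + b
    return -(k * (y - np.exp(eta)) - s)

# Concave log-posterior => this is a convex problem with a unique optimum
s_map = minimize(neg_log_posterior, np.zeros(T), jac=grad,
                 method="L-BFGS-B").x
```

Replacing the point estimate with a Gaussian (Laplace) approximation around `s_map`, as in item (2), amounts to inverting the Hessian of `neg_log_posterior` at the optimum.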
Efficient, adaptive estimation of two-dimensional firing rate surfaces via Gaussian process methods
 Network: Comput. Neural Syst., 2010
Cited by 4 (0 self)
Estimating two-dimensional firing rate maps is a common problem, arising in a number of contexts: the estimation of place fields in hippocampus, the analysis of temporally non-stationary tuning curves in sensory and motor areas, the estimation of firing rates following spike-triggered covariance analyses, etc. Here we introduce methods based on Gaussian-process nonparametric Bayesian techniques for estimating these two-dimensional rate maps. These techniques offer a number of advantages: the estimates may be computed efficiently, come equipped with natural error bars, adapt their smoothness automatically to the local density and informativeness of the observed data, and permit direct fitting of the model hyperparameters (e.g., the prior smoothness of the rate map) via maximum marginal likelihood. We illustrate the flexibility and performance of the new techniques on a variety of simulated and real data.
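The core Gaussian-process machinery can be sketched with plain GP regression on a toy 2-D surface. Everything here is an illustrative assumption (RBF kernel, Gaussian noise, a made-up rate surface); the paper's actual methods add point-process likelihoods, hyperparameter optimization, and fast solvers. The sketch does show the two properties the abstract emphasizes: a posterior mean that smooths adaptively, and a posterior variance that serves as natural error bars.

```python
import numpy as np

rng = np.random.default_rng(2)

def rbf(A, B, ell=0.3, sig2=1.0):
    """Squared-exponential kernel between two sets of 2-D points."""
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return sig2 * np.exp(-0.5 * d2 / ell**2)

X = rng.uniform(0, 1, size=(80, 2))               # observation locations
f = np.sin(4 * X[:, 0]) * np.cos(4 * X[:, 1])     # hypothetical rate surface
y = f + 0.1 * rng.standard_normal(80)             # noisy rate observations

noise = 0.1**2
K = rbf(X, X) + noise * np.eye(80)                # kernel + observation noise
alpha = np.linalg.solve(K, y)

# Posterior mean and variance of the latent surface on a 25x25 grid
g = np.linspace(0, 1, 25)
G = np.array([(u, v) for u in g for v in g])
Ks = rbf(G, X)
mean = Ks @ alpha                                  # posterior mean (625,)
var = 1.0 - np.sum(Ks * np.linalg.solve(K, Ks.T).T, axis=1)  # error bars
```

In the paper's setting the Gaussian likelihood would be replaced by a spiking likelihood and `ell`, `sig2`, and `noise` would be fit by maximizing the marginal likelihood rather than fixed by hand.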
Coding efficiency and detectability of rate fluctuations with non-Poisson neuronal firing
Statistical analysis of neural data: Continuous-space models (First 2/3)
, 2009
1 Autoregressive models and Kalman filter models are Gaussian Markov and hidden Markov models, respectively
1.1 Example: voltage smoothing and interpolation; inferring biophysical parameters
1.2 We may perform inference in the Kalman model either via the forward-backward ...
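The forward-backward route that item 1.2 trails off into can be sketched for a scalar linear-Gaussian model (the model and parameters below are illustrative assumptions, not the notes' examples): the forward pass is the Kalman filter, and the backward pass is the Rauch-Tung-Striebel smoother.

```python
import numpy as np

rng = np.random.default_rng(3)
T, a, q, r = 100, 0.9, 0.2, 0.5    # dynamics, state noise, observation noise
x = np.zeros(T)
for t in range(1, T):
    x[t] = a * x[t - 1] + np.sqrt(q) * rng.standard_normal()
y = x + np.sqrt(r) * rng.standard_normal(T)

m = np.zeros(T); P = np.zeros(T)   # filtered means / variances
mp = np.zeros(T); Pp = np.zeros(T) # one-step-ahead predictions
mp[0], Pp[0] = 0.0, q / (1 - a**2) # stationary prior on the initial state
for t in range(T):                 # forward pass: Kalman filter
    if t > 0:
        mp[t], Pp[t] = a * m[t - 1], a**2 * P[t - 1] + q
    k = Pp[t] / (Pp[t] + r)        # Kalman gain
    m[t] = mp[t] + k * (y[t] - mp[t])
    P[t] = (1 - k) * Pp[t]

ms = m.copy(); Ps = P.copy()       # backward pass: RTS smoother
for t in range(T - 2, -1, -1):
    g = a * P[t] / Pp[t + 1]       # smoothing gain
    ms[t] = m[t] + g * (ms[t + 1] - mp[t + 1])
    Ps[t] = P[t] + g**2 * (Ps[t + 1] - Pp[t + 1])
```

The smoothed variances `Ps` never exceed the filtered variances `P`, reflecting the extra information the backward pass brings from future observations.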
Journal of Neuroscience Methods
Population decoding of motor cortical activity using a generalized linear model
Action Editor: Israel Nelken
State-space methods have proven indispensable in neural data analysis. However, common methods for performing inference in state-space models with non-Gaussian observations rely on certain approximations which are not always accurate. Here we review direct optimization methods that avoid these approximations, but that nonetheless retain the computational efficiency of the approximate methods. We discuss a variety of examples, applying these direct optimization techniques to problems in spike train smoothing, stimulus decoding, parameter estimation, and inference of synaptic properties. Along the way, we point out connections to some related standard statistical methods, including spline smoothing and isotonic regression. Finally, we note that the computational methods reviewed here do not in fact depend on the state-space ...
Bayesian inference for generalized linear models for spiking neurons
doi: 10.3389/fncom.2010.00012, 2010
Information Rates and Optimal Decoding in Large Neural Populations
Many fundamental questions in theoretical neuroscience involve optimal decoding and the computation of Shannon information rates in populations of spiking neurons. In this paper, we apply methods from the asymptotic theory of statistical inference to obtain a clearer analytical understanding of these quantities. We find that for large neural populations carrying a finite total amount of information, the full spiking population response is asymptotically as informative as a single observation from a Gaussian process whose mean and covariance can be characterized explicitly in terms of network and single neuron properties. The Gaussian form of this asymptotic sufficient statistic allows us in certain cases to perform optimal Bayesian decoding by simple linear transformations, and to obtain closed-form expressions of the Shannon information carried by the network. One technical advantage of the theory is that it may be applied easily even to non-Poisson point-process network models; for example, we find that under some conditions, neural populations with strong history-dependent (non-Poisson) effects carry exactly the same information as do simpler equivalent populations of non-interacting Poisson neurons with matched firing rates. We argue that our findings help to clarify some results from the recent literature on neural decoding and neuroprosthetic design.
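The asymptotic picture can be made concrete with a deliberately simplified toy (a hypothetical Gaussian reduction, not the paper's derivation): if the population response behaves like Gaussian observations of the stimulus, then the optimal Bayesian decoder is a linear map of the data and the Shannon information has a closed form.

```python
import numpy as np

rng = np.random.default_rng(4)
tau2, sig2, n = 1.0, 4.0, 50       # prior variance, noise variance, "neurons"
theta = np.sqrt(tau2) * rng.standard_normal()         # scalar stimulus
y = theta + np.sqrt(sig2) * rng.standard_normal(n)    # Gaussian responses

# The posterior is Gaussian, and its mean is a *linear* function of the data
post_prec = 1 / tau2 + n / sig2
theta_hat = (np.sum(y) / sig2) / post_prec            # optimal Bayes estimate
post_var = 1 / post_prec

# Closed-form Shannon information (in nats) between stimulus and population
info = 0.5 * np.log(1 + n * tau2 / sig2)
```

The paper's contribution is showing when a spiking population, including non-Poisson ones with history dependence, is asymptotically equivalent to such a Gaussian observation, with mean and covariance determined by network and single-neuron properties.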
EMG prediction from Motor Cortical Recordings via a Non-Negative Point Process Filter
 Submitted to IEEE Transactions on Biomedical Engineering
A constrained point process filtering mechanism for prediction of electromyogram (EMG) signals from multichannel neural spike recordings is proposed here. Filters from the Kalman family are inherently suboptimal in dealing with non-Gaussian observations, or a state evolution that deviates from the Gaussianity assumption. To address these limitations, we modeled the non-Gaussian neural spike train observations by using a generalized linear model (GLM) that encapsulates covariates of neural activity, including the neurons' own spiking history, concurrent ensemble activity, and extrinsic covariates (EMG signals). In order to predict the envelopes of EMGs, we reformulated the Kalman filter (KF) in an optimization framework and utilized a non-negativity constraint. This structure reasonably characterizes the nonlinear correspondence between neural activity and EMG signals. The EMGs were recorded from twelve forearm and hand muscles of a behaving monkey during a grip-force task. For the case of limited training data, the constrained point process filter improved the prediction accuracy when compared to a conventional Wiener cascade filter (a linear causal filter followed by a static nonlinearity) for different bin sizes and delays between input spikes and EMG output. For longer training data sets, results of the proposed filter and that of the Wiener cascade filter were comparable.
Index Terms: Brain-machine interface, electromyogram signal, generalized linear model, Kalman filter, optimization.
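The "Kalman filter as optimization" reformulation can be sketched for one update step. All specifics below (dimensions, matrices, the bounded least-squares solver) are illustrative assumptions, not the paper's implementation: the Kalman measurement update minimizes a quadratic in the state, so a non-negativity constraint (EMG envelopes cannot go negative) can be imposed by solving a bounded least-squares problem instead of applying the closed-form gain.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(5)
d = 4                                      # state dim (e.g. EMG channels)
C = rng.standard_normal((6, d))            # assumed observation matrix
R = 0.5 * np.eye(6)                        # observation noise covariance
P = np.eye(d)                              # predictive state covariance
x_pred = np.array([0.5, -0.2, 1.0, 0.1])   # predicted state (can go negative)
y = C @ np.abs(x_pred) + 0.1 * rng.standard_normal(6)   # observation

# Kalman update as a quadratic program:
#   minimize (y - Cx)' R^-1 (y - Cx) + (x - x_pred)' P^-1 (x - x_pred)
#   subject to x >= 0
# Whiten both terms and stack them into one least-squares system.
Lr = np.linalg.cholesky(np.linalg.inv(R))
Lp = np.linalg.cholesky(np.linalg.inv(P))
A = np.vstack([Lr.T @ C, Lp.T])
b = np.concatenate([Lr.T @ y, Lp.T @ x_pred])
x_upd = lsq_linear(A, b, bounds=(0, np.inf)).x   # constrained update
```

With the bounds removed, this recovers the ordinary Kalman measurement update; the constrained version is what keeps the predicted EMG envelopes physically plausible.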