Results 1–10 of 20
A reproducing kernel Hilbert space framework for spike train signal processing
 Neural Computation
, 2009
Abstract

Cited by 23 (11 self)
This paper presents a general framework based on reproducing kernel Hilbert spaces (RKHS) to mathematically describe and manipulate spike trains. The main idea is the definition of inner products to allow spike train signal processing from basic principles while incorporating their statistical description as point processes. Moreover, because many inner products can be formulated, a particular definition can be crafted to best fit an application. These ideas are illustrated by the definition of a number of spike train inner products. To further elicit the advantages of the RKHS framework, a family of these inner products, called the cross-intensity (CI) kernels, is analyzed in detail. This particular inner product family encapsulates the statistical description from conditional intensity functions of spike trains. The problem of their estimation is also addressed. The simplest of the spike train kernels in this family provides an interesting perspective on other works presented in the literature, as will be illustrated in terms of spike train distance measures. Finally, as an application example, the presented RKHS framework is used to derive, from simple principles, a clustering algorithm for spike trains.
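The inner-product idea can be sketched concretely. Below is a minimal illustration of a memoryless cross-intensity style kernel; the causal exponential smoothing kernel, the time grid, and all parameter values are choices of this sketch, not prescriptions of the paper:

```python
import numpy as np

def smoothed_intensity(spike_times, t_grid, tau=0.05):
    """Estimate an intensity function by convolving the spike train
    with a causal exponential kernel (one common smoothing choice)."""
    rate = np.zeros_like(t_grid)
    for s in spike_times:
        mask = t_grid >= s
        rate[mask] += np.exp(-(t_grid[mask] - s) / tau)
    return rate

def mci_kernel(spikes_a, spikes_b, t_max=1.0, n=1000, tau=0.05):
    """Memoryless cross-intensity style kernel: the inner product of
    the two estimated intensity functions, approximated on a grid."""
    t = np.linspace(0.0, t_max, n)
    la = smoothed_intensity(spikes_a, t, tau)
    lb = smoothed_intensity(spikes_b, t, tau)
    return np.sum(la * lb) * (t[1] - t[0])
```

With this definition, trains with coincident or nearby spikes yield a large kernel value, while trains firing in disjoint time windows yield a value near zero.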
Quantifying Statistical Interdependence by Message Passing on Graphs  PART II: MultiDimensional Point Processes
, 2009
Abstract

Cited by 20 (12 self)
Stochastic event synchrony (SES) is a technique to quantify the similarity of pairs of signals. First, “events” are extracted from the two given time series. Next, one tries to align events from one time series with events from the other. The better the alignment, the more similar the two time series are considered to be. Part I considered one-dimensional events; this paper (Part II) concerns multidimensional events. Although the basic idea is similar, the extension to multidimensional point processes involves a significantly harder combinatorial problem and is therefore nontrivial. Also in the multidimensional case, the problem of jointly computing the pairwise alignment and the SES parameters is cast as a statistical inference problem. This problem is solved by coordinate descent, more specifically by alternating the following two steps: (i) one estimates the SES parameters from a given pairwise alignment; (ii) with the resulting estimates, one refines the pairwise alignment. The SES parameters are computed by maximum a posteriori (MAP) estimation (Step 1), in …
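The alternation of steps (i) and (ii) can be illustrated on a deliberately simple stand-in problem. The toy model here, a single unknown time offset with Gaussian jitter, and the least-squares parameter update are assumptions of this sketch; the paper's SES model and its MAP estimation are considerably richer:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in (not the paper's model): events in train b are the
# events in train a shifted by an unknown offset plus small jitter.
a = np.arange(20) * 5.0
true_offset = 0.8
b = a + true_offset + rng.normal(scale=0.05, size=a.size)

offset = 0.0
for _ in range(5):
    # Step (ii): refine the alignment -- pair each event in a with the
    # nearest event in b after removing the current offset estimate.
    align = np.argmin(np.abs(a[:, None] - (b[None, :] - offset)), axis=1)
    # Step (i): re-estimate the parameter from the aligned pairs (the
    # paper uses MAP estimation; a least-squares mean stands in here).
    offset = np.mean(b[align] - a)
```

The loop converges because each step decreases the same mismatch objective, which is the essence of the coordinate-descent scheme described in the abstract.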
A comparison of binless spike train measures
, 2009
Abstract

Cited by 12 (1 self)
Several binless spike train measures, which avoid the limitations of binning, have recently been proposed in the literature. This paper presents a systematic comparison of these measures in three simulated paradigms designed to address specific situations of interest in spike train analysis, where the relevant feature may be in the form of firing rate, firing rate modulations, and/or synchrony. The measures are first disseminated and extended for ease of comparison. It is also discussed how the measures can be used to measure dissimilarity in the spike trains’ firing rates despite their explicit formulation for synchrony.
Reservoir computing approaches to recurrent neural network training
 Computer Science Review, 3(3):127–149
, 2009
Abstract

Cited by 10 (0 self)
Echo State Networks and Liquid State Machines introduced a new paradigm in artificial recurrent neural network (RNN) training, where an RNN (the reservoir) is generated randomly and only a readout is trained. The paradigm, which became known as reservoir computing, greatly facilitated the practical application of RNNs and outperformed classical fully trained RNNs in many tasks. It has lately become a lively research field with numerous extensions of the basic idea, including reservoir adaptation, thus broadening the initial paradigm to using different methods for training the reservoir and the readout. This review systematically surveys both current ways of generating/adapting the reservoirs and ways of training different types of readouts. It offers a natural conceptual classification of the techniques, which transcends the boundaries of the current “brand names” of reservoir methods, and thus aims to help unify the field and provide the reader with a detailed “map” of it.
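The reservoir computing recipe summarized above can be sketched in a few lines. Everything here, the toy prediction task, the reservoir size, the 0.9 spectral radius, the input scaling, and the ridge regularization, is an illustrative assumption rather than a recommendation from the review:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy task: one-step-ahead prediction of a sine wave.
T = 500
u = np.sin(0.1 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Reservoir: fixed random recurrent weights, rescaled to spectral
# radius 0.9 (a common heuristic related to the echo state property).
n_res = 100
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))
W_in = rng.normal(scale=0.5, size=n_res)

# Drive the reservoir and collect its states; W and W_in stay fixed.
x = np.zeros(n_res)
states = np.empty((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in * inputs[t])
    states[t] = x

# Train only the linear readout, by ridge regression, after
# discarding an initial washout period.
washout, ridge = 50, 1e-6
S, Y = states[washout:], targets[washout:]
W_out = np.linalg.solve(S.T @ S + ridge * np.eye(n_res), S.T @ Y)
mse = np.mean((S @ W_out - Y) ** 2)
```

Only `W_out` is learned; the recurrent weights `W` and input weights `W_in` remain as randomly generated, which is the defining trait of the paradigm.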
A fast Lp spike alignment metric
Abstract

Cited by 9 (1 self)
The metrization of the space of neural responses is an ongoing research program seeking to find natural ways to describe, in geometrical terms, the sets of possible activities in the brain. One component of this program is the family of spike metrics: notions of distance between two spike trains recorded from a neuron. Alignment spike metrics work by identifying “equivalent” spikes in one train and the other. We present an alignment spike metric with an underlying Lp geometrical structure; the L2 version is Euclidean and is suitable for further embedding in Euclidean spaces by multidimensional scaling methods or related procedures. We show how to implement a fast algorithm for the computation of this metric based on bipartite graph matching theory.
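A sketch of how bipartite matching yields an alignment-type metric. This assumes a simple formulation in which each spike may stay unmatched at a fixed cost; that cost value and the exact treatment of unmatched spikes are assumptions of the sketch, not the paper's definition. The assignment problem is solved with SciPy's Hungarian-algorithm routine:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def lp_alignment_distance(a, b, p=2.0, unmatched_cost=1.0):
    """Alignment-type spike metric sketch: match spikes across the two
    trains to minimize the total cost |t_i - s_j|^p, allowing any spike
    to remain unmatched at a fixed cost, then take the 1/p root."""
    n, m = len(a), len(b)
    big = n + m
    C = np.zeros((big, big))
    # pairwise alignment costs between real spikes
    C[:n, :m] = np.abs(np.subtract.outer(a, b)) ** p
    # matching a real spike to a dummy = leaving it unmatched
    C[:n, m:] = unmatched_cost
    C[n:, :m] = unmatched_cost
    # dummy-dummy pairs cost nothing (block stays zero)
    row, col = linear_sum_assignment(C)
    return C[row, col].sum() ** (1.0 / p)
```

Padding the cost matrix with dummy rows and columns keeps it square, so a standard minimum-cost bipartite matching solver applies directly.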
Inner products for representation and learning in the spike
, 2010
Abstract

Cited by 6 (4 self)
In many neurophysiological studies and brain-inspired computation paradigms, there is still a need for new spike train analysis and learning algorithms, because current methods tend to be limited in the tools they provide and are not easily extended. This chapter presents a general framework to develop spike train machine learning methods by defining inner product operators for spike trains. They build on the mathematical theory of reproducing kernel Hilbert spaces (RKHS) and kernel methods, allowing a multitude of analysis and learning algorithms to be easily developed. The inner products utilize functional representations of spike trains, which we motivate from two perspectives: as a biological-modeling problem, and as a statistical description. The biological-modeling approach highlights the potential biological mechanisms taking place at the neuron level that are quantified by the inner product. On the other hand, by interpreting the representation from a statistical perspective, one relates it to other work in the literature. Moreover, the statistical description characterizes which information can be detected by the spike train inner product. The applications of the given inner products for the development of machine learning methods are demonstrated in two …
Reproducing Kernel Hilbert Spaces for Point Processes, with Applications to Neural Activity Analysis
, 2008
Abstract

Cited by 3 (1 self)
having accepted me as his student, and for his experienced guidance and advice. His incentive to creativity, breadth of knowledge, and far-reaching critical thinking are, I believe, some of the most valuable lessons I will retain from my doctoral education. Without him, this dissertation would not have been possible. I also thank Dr. John G. Harris for serving as my committee member, for his interest in my research, and for providing an essential practical perspective to much of my work. I also thank Dr. Justin C. Sanchez for his valuable time to read and comment on many of the results shown here. His expertise in neural activity analysis and often complementary perspective can be encountered throughout this dissertation. I also thank Dr. Jianbo Gao for all the advice and for his interest in serving on my committee. I am forever indebted to Dr. Francisco Vaz, for first creating the opportunity for me to come to CNEL and for all the help in obtaining funding from FCT. I will never forget that without Dr. Vaz’s assistance, I would have missed the wonderful opportunity to get a Ph.D. at the University of Florida. My friends and colleagues at CNEL deserve credit for many of the joys and for …
On the efficient calculation of van Rossum distances
 Network: Computation in Neural Systems
, 2012
Abstract

Cited by 3 (1 self)
The van Rossum metric measures the distance between two spike trains. Measuring a single van Rossum distance between one pair of spike trains is not a computationally expensive task; however, many applications require a matrix of distances between all the spike trains in a set, or the calculation of a multineuron distance between two populations of spike trains. Moreover, these calculations often need to be repeated for many different parameter values. An algorithm is presented here to render these calculations less computationally expensive, making the complexity linear in the number of spikes rather than quadratic.
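For the causal exponential kernel usually used with the van Rossum metric, the squared distance has a closed form as sums of exponentials of inter-spike intervals. The sketch below evaluates that form directly, in quadratic pairwise fashion; the paper's contribution is to reduce such sums to linear time via running sums over sorted spike times. The 1/2 normalization used here is one common convention, and conventions vary across the literature:

```python
import numpy as np

def van_rossum_sq(u, v, tau=1.0):
    """Squared van Rossum distance for a causal exponential kernel:
    D^2 = (1/2) * [ sum_ij e^{-|u_i-u_j|/tau}
                  + sum_ij e^{-|v_i-v_j|/tau}
                  - 2 sum_ij e^{-|u_i-v_j|/tau} ].
    Direct O(N^2) evaluation of the closed form."""
    def s(x, y):
        # sum of exponentials over all pairs of spike times
        return np.exp(-np.abs(np.subtract.outer(x, y)) / tau).sum()
    return 0.5 * (s(u, u) + s(v, v) - 2.0 * s(u, v))
```

A single spike compared against an empty train gives D² = 1/2 under this convention, corresponding to the energy of one filtered spike.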
Optimization in reproducing kernel Hilbert spaces of spike trains
 In Computational Neuroscience
, 2010
Abstract

Cited by 2 (2 self)
This paper presents a framework based on reproducing kernel Hilbert spaces (RKHS) for optimization with spike trains. To establish the RKHS for optimization, we start by introducing kernels for spike trains. It is shown that spike train kernels can be built from ideas of kernel methods or from the intensity functions underlying the spike trains. However, the latter approach is the main focus of this study. We introduce the memoryless cross-intensity (mCI) kernel as an example of an inner product of spike trains, which defines the RKHS bottom-up as an inner product of intensity functions. Being defined in terms of intensity functions, this approach to defining spike train kernels has the advantage that points in the RKHS incorporate a statistical description of the spike trains, and the statistical model is explicitly stated. Some properties of the mCI kernel and the RKHS it induces are given to show that this RKHS has the necessary structure for optimization. The issue of estimation from data is also addressed. We conclude with an example of optimization in the RKHS by deriving an algorithm for principal component analysis (PCA) of spike trains.
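As a stand-in for the PCA derivation mentioned above, standard kernel PCA can be run from a precomputed spike-train Gram matrix (e.g. of pairwise mCI kernel estimates). The paper derives PCA directly in its RKHS, so this generic recipe is only an assumed approximation of that procedure:

```python
import numpy as np

def kernel_pca_projections(K, n_components=2):
    """Kernel PCA from a precomputed Gram matrix K: double-center K,
    eigendecompose it, and project the training points onto the
    leading principal directions in the RKHS."""
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                       # double centering
    vals, vecs = np.linalg.eigh(Kc)      # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # normalize eigenvectors so the RKHS directions have unit norm
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return Kc @ alphas                   # projections of training points
```

Because only kernel evaluations enter, the same code applies to any positive semidefinite spike train kernel, which is exactly the leverage the RKHS framework provides.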