Results 1–10 of 47
New results in linear filtering and prediction theory
Trans. ASME, Ser. D, J. Basic Eng., 1961
Cited by 581 (0 self)
A nonlinear differential equation of the Riccati type is derived for the covariance matrix of the optimal filtering error. The solution of this "variance equation" completely specifies the optimal filter for either finite or infinite smoothing intervals and stationary or nonstationary statistics. The variance equation is closely related to the Hamiltonian (canonical) differential equations of the calculus of variations. Analytic solutions are available in some cases. The significance of the variance equation is illustrated by examples which duplicate, simplify, or extend earlier results in this field. The Duality Principle relating stochastic estimation and deterministic control problems plays an important role in the proof of theoretical results. In several examples, the estimation problem and its dual are discussed side by side. Properties of the variance equation are of great interest in the theory of adaptive systems. Some aspects of this are considered briefly.
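The "variance equation" described in this abstract can be sketched numerically. The following is a minimal sketch, assuming hypothetical scalar system matrices F, H, Q, R (not taken from the paper): it integrates the matrix Riccati ODE dP/dt = FP + PFᵀ + Q − PHᵀR⁻¹HP by forward Euler toward its stationary solution.

```python
import numpy as np

# Right-hand side of the matrix Riccati "variance equation" for the
# filtering error covariance P (F: dynamics, H: observation matrix,
# Q: process noise covariance, R: measurement noise covariance).
def riccati_rhs(P, F, H, Q, R):
    return F @ P + P @ F.T + Q - P @ H.T @ np.linalg.solve(R, H @ P)

# Hypothetical scalar example, chosen so the stationary solution is P = 1.
F = np.array([[0.0]])
H = np.array([[1.0]])
Q = np.array([[1.0]])
R = np.array([[1.0]])

P = np.array([[10.0]])   # initial error covariance
dt = 1e-3
for _ in range(20000):   # forward Euler toward the stationary solution
    P = P + dt * riccati_rhs(P, F, H, Q, R)
```

For this scalar case the algebraic Riccati equation 1 − P² = 0 gives the stationary covariance P = 1, which the integration approaches.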
A reproducing kernel Hilbert space framework for spike train signal processing
Neural Comp., 2009
Cited by 22 (11 self)
This paper presents a general framework based on reproducing kernel Hilbert spaces (RKHS) to mathematically describe and manipulate spike trains. The main idea is the definition of inner products to allow spike train signal processing from basic principles while incorporating their statistical description as point processes. Moreover, because many inner products can be formulated, a particular definition can be crafted to best fit an application. These ideas are illustrated by the definition of a number of spike train inner products. To further elicit the advantages of the RKHS framework, a family of these inner products, called the cross-intensity (CI) kernels, is further analyzed in detail. This particular inner product family encapsulates the statistical description from conditional intensity functions of spike trains. The problem of their estimation is also addressed. The simplest of the spike train kernels in this family provides an interesting perspective to other works presented in the literature, as will be illustrated in terms of spike train distance measures. Finally, as an application example, the presented RKHS framework is used to derive from simple principles a clustering algorithm for spike trains.
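A spike train inner product of the kind this abstract describes can be sketched in a few lines. This is only an illustrative estimator under stated assumptions (a Laplacian smoothing kernel and toy spike-time lists, not data or definitions from the paper): it sums a pairwise kernel over all spike-time pairs, which amounts to an inner product of smoothed intensity estimates.

```python
import math

# Sketch of a simple spike train inner product: sum a Laplacian kernel
# exp(-|t - s| / tau) over all pairs of spike times from the two trains.
def spike_inner_product(times_a, times_b, tau=0.01):
    return sum(math.exp(-abs(t - s) / tau)
               for t in times_a for s in times_b)

# Toy spike times in seconds (hypothetical data).
a = [0.010, 0.055, 0.120]
b = [0.012, 0.050, 0.118]
value = spike_inner_product(a, b, tau=0.01)
```

The result is symmetric in its arguments and grows when the two trains fire at nearby times, which is what makes such inner products usable for distances and clustering.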
Stochastic Processes With Sample Paths In Reproducing Kernel Hilbert Spaces
Transactions of the American Mathematical Society, 2001
Cited by 21 (2 self)
A theorem of M. F. Driscoll says that, under certain restrictions, the probability that a given Gaussian process has its sample paths almost surely in a given reproducing kernel Hilbert space (RKHS) is either 0 or 1.
A Reproducing Kernel Hilbert Space Framework for Information-Theoretic Learning
Cited by 13 (8 self)
Abstract—This paper provides a functional analysis perspective of information-theoretic learning (ITL) by defining bottom-up a reproducing kernel Hilbert space (RKHS) uniquely determined by the symmetric nonnegative definite kernel function known as the cross-information potential (CIP). The CIP, as an integral of the product of two probability density functions, characterizes similarity between two stochastic functions. We prove the existence of a one-to-one congruence mapping between the ITL RKHS and the Hilbert space spanned by square-integrable probability density functions. Therefore, all the statistical descriptors in the original information-theoretic learning formulation can be rewritten as algebraic computations on deterministic functional vectors in the ITL RKHS, instead of limiting the functional view to the estimators as is commonly done in kernel methods. A connection between the ITL RKHS and kernel approaches interested in quantifying the statistics of the projected data is also established. Index Terms—Cross-information potential, information-theoretic learning (ITL), kernel function, probability density function,
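The cross-information potential named in this abstract, the integral of the product of two densities, admits a simple sample-based sketch. Assuming Gaussian Parzen density estimates of width sigma (an assumption for illustration, not the paper's full construction), the integral of the product of two Gaussian bumps has the closed form used below, with the two kernel variances adding.

```python
import math

# Sketch of a cross-information potential estimator: with Gaussian
# Parzen windows of standard deviation sigma on both sample sets,
# integrating the product of the two density estimates reduces to
# evaluating a Gaussian of variance 2*sigma**2 at each pair difference.
def cip(xs, ys, sigma=0.5):
    s2 = 2.0 * sigma * sigma          # variance of the convolved kernel
    norm = 1.0 / math.sqrt(2.0 * math.pi * s2)
    return sum(norm * math.exp(-(x - y) ** 2 / (2.0 * s2))
               for x in xs for y in ys) / (len(xs) * len(ys))
```

The estimator is symmetric in the two sample sets and decays as the samples separate, matching its role as a similarity between distributions.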
A FORTRAN IV Program for the Determination of the Anomalous Potential Using Stepwise Least Squares Collocation
The Ohio State University, 1974
The Henderson smoother in reproducing kernel Hilbert space
Journal of Business and Economic Statistics (forthcoming), 2007
Cited by 6 (5 self)
The Henderson smoother has been traditionally applied for trend-cycle estimation in the context of nonparametric seasonal adjustment software officially adopted by statistical agencies. This study introduces a Henderson third order kernel representation by means of the Reproducing Kernel Hilbert Space (RKHS) methodology. Two density functions and corresponding orthonormal polynomials have been calculated. Both are shown to give excellent representations for short and medium length filters. Theoretical and empirical comparisons of the Henderson third order kernel asymmetric filters are made with the classical ones. The former are shown to be superior in terms of signal passing, noise suppression and revision size. KEY WORDS: Symmetric and asymmetric weighting systems, biweight density function, higher order kernels, local weighted least squares, spectral properties.
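The local weighted least-squares idea behind such trend-cycle smoothers can be sketched as follows. This is not the actual Henderson weighting system (which uses specific weights minimizing squared third differences); it is only an unweighted local cubic fit, which shares the defining property of reproducing cubic polynomials exactly.

```python
import numpy as np

# Sketch of local least-squares trend estimation: fit a cubic to each
# sliding window of 2*half + 1 points and keep the fitted centre value.
def local_cubic_smooth(y, half=6):
    y = np.asarray(y, dtype=float)
    out = y.copy()                       # endpoints left unsmoothed
    t = np.arange(-half, half + 1)       # local time axis of the window
    for i in range(half, len(y) - half):
        coef = np.polyfit(t, y[i - half:i + half + 1], deg=3)
        out[i] = np.polyval(coef, 0.0)   # fitted value at window centre
    return out
```

Because the fit is a cubic, any cubic input passes through unchanged in the interior, the same polynomial-reproduction property that characterizes Henderson-type filters.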
Inner products for representation and learning in the spike
, 2010
Cited by 6 (4 self)
In many neurophysiological studies and brain-inspired computation paradigms, there is still a need for new spike train analysis and learning algorithms because current methods tend to be limited in terms of the tools they provide and are not easily extended. This chapter presents a general framework to develop spike train machine learning methods by defining inner product operators for spike trains. They build on the mathematical theory of reproducing kernel Hilbert spaces (RKHS) and kernel methods, allowing a multitude of analysis and learning algorithms to be easily developed. The inner products utilize functional representations of spike trains, which we motivate from two perspectives: as a biological-modeling problem, and as a statistical description. The biological-modeling approach highlights the potential biological mechanisms taking place at the neuron level and that are quantified by the inner product. On the other hand, by interpreting the representation from a statistical perspective, one relates to other work in the literature. Moreover, the statistical description characterizes which information can be detected by the spike train inner product. The applications of the given inner products for development of machine learning methods are demonstrated in two
Asymptotic Optimality Of Regular Sequence Designs
Ann. Statist., 1995
Cited by 6 (2 self)
We study linear estimators for the weighted integral of a stochastic process. The process may only be observed on a finite sampling design. The error is defined in the mean square sense, and the process is assumed to satisfy Sacks–Ylvisaker regularity conditions of order $r \in \mathbb{N}_0$. We show that sampling at the quantiles of a particular density already yields asymptotically optimal estimators. Hereby we extend results by Sacks and Ylvisaker for regularity $r = 0$ or $1$, and we confirm a conjecture by Eubank, Smith, and Smith. 1. Introduction. Let $X(t)$, $t \in [0,1]$, be a centered stochastic process which is at least continuous in quadratic mean. For a known function $\rho \in L_2([0,1])$ we want to estimate the weighted integral $\mathrm{Int}_\rho(X) = \int_0^1 X(t)\,\rho(t)\,dt$. We consider linear estimators $I_n$ which are based on $n$ observations of $X$. Hence $I_n(X) = \sum_{i=1}^n X(t_i)\,a_i$ with sampling points $0 \le t_1 < \cdots < t_n \le 1$ and coefficients $a_i \in \mathbb{R}$. The error of $I_n$ is de...
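The estimator class in this abstract has a direct sketch: observe a path at n points and form a weighted sum. The version below uses equispaced midpoints with coefficients ρ(t_i)/n (a plain quadrature rule for illustration; the paper's asymptotically optimal designs place the points at quantiles of a particular density).

```python
# Sketch of a linear estimator for Int_rho(X) = integral of X(t)*rho(t)
# over [0, 1]: sample the path at n midpoints t_i and return
# sum_i a_i * X(t_i) with the simple choice a_i = rho(t_i) / n.
def integral_estimator(x_path, rho, n):
    ts = [(i + 0.5) / n for i in range(n)]
    return sum(rho(t) * x_path(t) for t in ts) / n
```

For a linear path and constant weight the midpoint rule is exact, e.g. with x(t) = t and rho = 1 the estimator returns 1/2 for any n.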
Reproducing kernel Hilbert spaces for spike train analysis
In Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP 2008), Las Vegas, 2008
Cited by 5 (5 self)
This paper introduces a generalized cross-correlation (GCC) measure for spike train analysis derived from reproducing kernel Hilbert space (RKHS) theory. An estimator for GCC is derived that does not depend on binning or a specific kernel, and it operates directly and efficiently on spike times. For instantaneous analysis, as required for real-time use, an instantaneous estimator is proposed and proved to yield the GCC on average. We conclude with two experiments illustrating the usefulness of the techniques derived. Index Terms—Spike train analysis, reproducing kernel Hilbert spaces, cross-correlation, synchrony detection.