Results 1 – 4 of 4
Sampling expansions in reproducing kernel Hilbert and Banach spaces
Numer. Funct. Anal. Optim.
Cited by 7 (3 self)
Abstract. We investigate the construction of all reproducing kernel Hilbert spaces of functions on a domain Ω ⊂ R^d that have a countable sampling set Λ ⊂ Ω. We also characterize all the reproducing kernel Hilbert spaces that have a prescribed sampling set. Similar problems are considered for reproducing kernel Banach spaces, but now with respect to Λ as a p-sampling set. Unlike general p-frames, we prove that every p-sampling set for a reproducing kernel Banach space yields a reconstruction formula. Some applications are given to demonstrate the general construction. The results of this paper uncover precisely the affinity between stable sampling expansions and reproducing kernel Hilbert and Banach spaces.
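The classical instance of such a sampling expansion is the Paley–Wiener space of band-limited functions: a reproducing kernel Hilbert space with kernel K(x, y) = sinc(x − y) and sampling set Λ = Z. The sketch below illustrates the resulting reconstruction formula numerically; it is a minimal stand-in for the general construction in the paper, not the paper's own method, and all names are illustrative.

```python
import numpy as np

def sinc_reconstruct(samples, sample_points, x):
    """Shannon sampling expansion f(x) = sum_n f(n) * sinc(x - n),
    truncated to the finitely many sample points supplied."""
    return sum(s * np.sinc(x - t) for s, t in zip(samples, sample_points))

# A band-limited test signal, itself an element of the Paley-Wiener space.
f = lambda x: np.sinc(x - 0.3)

n = np.arange(-200, 201)          # truncated sampling set Lambda = Z
approx = sinc_reconstruct(f(n), n, 0.5)
print(abs(approx - f(0.5)))       # small truncation error
```

The expansion is exact for the infinite sampling set; truncating to |n| ≤ 200 leaves only a small tail error, which is what the script reports.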
A method for generating infinite positive self-adjoint test matrices and Riesz bases
SIAM J. Matrix Anal. Appl., 2005
Cited by 5 (2 self)
Dedicated to Laura Gori on the occasion of her 70th birthday. Abstract. In this article we propose a method to easily generate infinite multi-index positive definite self-adjoint matrices as well as Riesz bases in suitable subspaces of L^2(R^d). The method is then applied to obtain some classes of multi-index Toeplitz matrices which are bounded and strictly positive on ℓ^2(Z^d). The condition number of some of these matrices is also computed.
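As a toy illustration of the kind of object the abstract describes (not the paper's multi-index construction): a Toeplitz matrix whose generating symbol is strictly positive is bounded and strictly positive, and the condition number of its finite sections can be checked numerically. The single-index sketch below, with a symbol I chose for illustration, is an assumption of mine.

```python
import numpy as np

def toeplitz_from_symbol(coeffs, size):
    """n x n finite section of the symmetric Toeplitz matrix with
    entries T[i, j] = coeffs[|i - j|] (coeffs gives k = 0..m)."""
    T = np.zeros((size, size))
    for k, c in enumerate(coeffs):
        T += c * np.eye(size, k=k)
        if k:
            T += c * np.eye(size, k=-k)
    return T

# Symbol phi(theta) = 2 + cos(theta) > 0, so the bi-infinite Toeplitz
# operator is bounded and strictly positive with spectrum in [1, 3].
T = toeplitz_from_symbol([2.0, 0.5], 100)
eig = np.linalg.eigvalsh(T)
print(eig.min(), eig.max(), eig.max() / eig.min())
```

The finite sections inherit the spectral bounds of the symbol, so the printed condition number stays below 3 regardless of the section size.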
Multiscale Asymmetric Orthogonal Wavelet Kernel for Linear Programming Support Vector Learning and Nonlinear Dynamic Systems Identification
Cited by 1 (0 self)
Abstract—Support vector regression for approximating nonlinear dynamic systems is more delicate than the approximation of indicator functions in support vector classification, particularly for systems that involve multitudes of time scales in their sampled data. The kernel used for support vector learning determines the class of functions from which a support vector machine can draw its solution, and the choice of kernel significantly influences the performance of a support vector machine. In this paper, to bridge the gap between wavelet multiresolution analysis and kernel learning, the closed-form orthogonal wavelet is exploited to construct new multiscale asymmetric orthogonal wavelet kernels for linear programming support vector learning. The closed-form multiscale orthogonal wavelet kernel provides a systematic framework to implement multiscale kernel learning via dyadic dilations and also enables us to represent complex nonlinear dynamics effectively. To demonstrate the superiority of the proposed multiscale wavelet kernel in identifying complex nonlinear dynamic systems, two case studies are presented that aim at building parallel models on benchmark datasets. The development of parallel models that address the long-term/mid-term prediction issue is more intricate and challenging than the identification of series-parallel models, where only one-step-ahead prediction is required. Simulation results illustrate the effectiveness of the proposed multiscale kernel learning. Index Terms—Linear programming support vector regression, model sparsity, multiscale orthogonal wavelet kernel, NARX model, parallel model, type-II raised cosine wavelet.
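The idea of a multiscale wavelet kernel can be sketched with a simpler stand-in: a translation-invariant kernel summing a mother wavelet over several dyadic dilations. The sketch below uses the Mexican-hat (Ricker) wavelet in place of the paper's type-II raised cosine wavelet, and ridge-regularized kernel regression in place of linear programming support vector learning; both substitutions are my assumptions, chosen only to keep the example self-contained.

```python
import numpy as np

def ricker(x):
    """Mexican-hat (Ricker) wavelet, a stand-in for the paper's
    type-II raised cosine wavelet."""
    return (1.0 - x**2) * np.exp(-x**2 / 2.0)

def multiscale_wavelet_kernel(X, Z, scales=(1.0, 0.5, 0.25)):
    """k(x, z) = sum over dyadic scales a of ricker((x - z) / a):
    a multiscale kernel built by dyadic dilations (illustrative only,
    not the paper's asymmetric orthogonal construction)."""
    D = X[:, None] - Z[None, :]
    return sum(ricker(D / a) for a in scales)

# A target with both slow and fast components, i.e. multiple time scales.
x = np.linspace(-3.0, 3.0, 60)
y = np.sin(3.0 * x) * np.exp(-x**2 / 4.0)

K = multiscale_wavelet_kernel(x, x)
alpha = np.linalg.solve(K + 1e-3 * np.eye(len(x)), y)  # ridge-regularized fit
y_hat = K @ alpha
print(np.max(np.abs(y_hat - y)))  # small training error
```

Because the Ricker kernel's spectrum is nonnegative, each dilated kernel is positive semidefinite and so is their sum, which keeps the regularized linear solve well posed.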
SAMPLING AND RECONSTRUCTION OF SIGNALS IN A REPRODUCING KERNEL SUBSPACE OF L^p(R^d)
Abstract. In this paper, we consider sampling and reconstruction of signals in a reproducing kernel subspace of L^p(R^d), 1 ≤ p ≤ ∞, associated with an idempotent integral operator whose kernel has certain off-diagonal decay and regularity. The space of p-integrable nonuniform splines and the shift-invariant spaces generated by finitely many localized functions are our model examples of such reproducing kernel subspaces of L^p(R^d). We show that a signal in such reproducing kernel subspaces can be reconstructed in a stable way from its samples taken on a relatively separated set with sufficiently small gap. We also study the exponential convergence, consistency, and the asymptotic pointwise error estimate of the iterative approximation-projection algorithm and the iterative frame algorithm for reconstructing a signal in those reproducing kernel spaces from its samples with sufficiently small gap.
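The flavor of the iterative frame algorithm can be shown in a finite-dimensional model: take a shift-invariant space spanned by integer translates of sinc, sample it on a relatively separated set (integers perturbed by a small gap), and recover the coefficients by Landweber-type iteration. This is a generic sketch under my own assumptions, not the algorithm analyzed in the paper.

```python
import numpy as np

# Model space V = span{ sinc(x - k) }, a reproducing kernel subspace;
# sampling set = integers perturbed by a gap of at most 0.1.
rng = np.random.default_rng(1)
k = np.arange(-20, 21)
c_true = rng.standard_normal(k.size)            # coefficients of f in V
x = k + 0.1 * rng.uniform(-1.0, 1.0, k.size)    # relatively separated samples
A = np.sinc(x[:, None] - k[None, :])            # A[j, m] = sinc(x_j - m)
s = A @ c_true                                  # observed samples f(x_j)

# Iterative frame (Landweber) reconstruction: c <- c + mu * A^T (s - A c),
# which converges geometrically when the samples form a frame for V.
mu = 1.0 / np.linalg.norm(A, 2) ** 2
c = np.zeros_like(c_true)
for _ in range(500):
    c = c + mu * A.T @ (s - A @ c)
print(np.max(np.abs(c - c_true)))
```

A gap of 0.1 is well inside the classical 1/4 perturbation bound for sinc translates, so the sampling matrix stays well conditioned and the printed reconstruction error is at machine-precision level, mirroring the exponential convergence the abstract refers to.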