Results 1-10 of 7,747
Least Median of Squares Regression
Journal of the American Statistical Association, 1984
"... ..."
Compressed Least-Squares Regression
"... We consider the problem of learning, from K data, a regression function in a linear space of high dimension N using projections onto a random subspace of lower dimension M. From any algorithm minimizing the (possibly penalized) empirical risk, we provide bounds on the excess risk of the estimate com ..."
Cited by 31 (4 self)
"... controlled) approximation error. We apply the analysis to Least-Squares (LS) regression and discuss the excess risk and numerical complexity of the resulting “Compressed Least-Squares Regression” (CLSR) in terms of N, K, and M. When we choose M = O(√K), we show that CLSR has an estimation error of order ..."
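The projection scheme the snippet describes can be sketched as follows; the data sizes, the Gaussian projection matrix, and the use of ordinary least squares in the compressed space are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

# Hypothetical sketch of compressed least-squares regression (CLSR):
# project N-dimensional features onto a random M-dimensional subspace
# with M = O(sqrt(K)) and run ordinary least squares there.
# Dimensions and data here are illustrative, not from the paper.

rng = np.random.default_rng(0)
K, N = 400, 1000                      # K samples, N-dimensional feature space
M = int(np.sqrt(K))                   # compressed dimension M = O(sqrt(K))

X = rng.standard_normal((K, N))       # design matrix in the high-dim space
beta_true = np.zeros(N)
beta_true[:10] = 1.0                  # sparse ground truth for illustration
y = X @ beta_true + 0.1 * rng.standard_normal(K)

P = rng.standard_normal((N, M)) / np.sqrt(M)   # random projection matrix
Z = X @ P                             # compressed design, shape (K, M)

theta, *_ = np.linalg.lstsq(Z, y, rcond=None)  # OLS in the compressed space
y_hat = Z @ theta                     # fitted values in the compressed model

print(Z.shape, theta.shape)           # (400, 20) (20,)
```

The point of the construction is that the OLS solve costs O(KM²) instead of O(KN²), at the price of a controlled approximation error.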
Least Squares Regression Partial Least Squares Regression
"... • The Degrees of Freedom of Kernel Partial Least Squares (KPLS) require all eigenvalues of the kernel matrix K, hence the computation is cubic in the number of observations n. • We use Kernel PLS itself to approximate the eigenvalues of the kernel matrix. ..."
Abstract
 Add to MetaCart
• The Degrees of Freedom of Kernel Partial Least Squares (KPLS) require all eigenvalues of the kernel matrix K, hence the computation is cubic in the number of observations n. • We use Kernel PLS itself to approximate the eigenvalues of the kernel matrix.
Least Squares Regression
"... A drawback of many voice conversion algorithms is that they rely on linear models and/or require a lot of tuning. In addition, many of them ignore the inherent timedependency between speech features. To address these issues, we propose to use dynamic kernel partial least squares (DKPLS) technique t ..."
Abstract
 Add to MetaCart
to model nonlinearities as well as to capture the dynamics in the data. The method is based on a kernel transformation of the source features to allow nonlinear modeling and concatenation of previous and next frames to model the dynamics. Partial least squares regression is used to find a conversion
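The dynamic-feature idea in the snippet can be sketched as below: concatenate the previous, current, and next frames to capture dynamics, map the stacked frames through an RBF kernel against a set of reference frames, then fit a linear regressor. Ridge regression stands in here for partial least squares, and all data, bandwidths, and sizes are made-up assumptions.

```python
import numpy as np

# Sketch of the dynamic-kernel idea: frame context + kernel map + linear fit.
# Ridge regression is a stand-in for PLS; everything here is synthetic.

rng = np.random.default_rng(6)
T, d = 300, 8                                  # frames and feature dimension
src = rng.standard_normal((T, d))              # source speaker features
tgt = np.tanh(src) + 0.05 * rng.standard_normal((T, d))  # target features

# Dynamic features: stack frames t-1, t, t+1 (edge frames dropped).
dyn = np.hstack([src[:-2], src[1:-1], src[2:]])   # shape (T-2, 3d)
y = tgt[1:-1]

ref = dyn[::10]                                # reference frames for the kernel
sq = ((dyn[:, None, :] - ref[None, :, :]) ** 2).sum(-1)
Phi = np.exp(-sq / (2 * 3 * d))                # RBF kernel features

lam = 1e-3                                     # ridge penalty (PLS stand-in)
W = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)
y_hat = Phi @ W                                # converted target features

print(Phi.shape, W.shape)
```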
Collinearity and Least Squares Regression
Statistical Science, 1987
"... this paper we introduce certain numbers, called collinearity indices, which are useful in detecting near collinearities in regression problems. The coefficients enter adversely into formulas concerning significance testing and the effects of errors in the regression variables. Thus they provide simp ..."
Cited by 30 (2 self)
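One common formulation of such column-wise collinearity indices multiplies each column norm by the norm of the corresponding row of the design's pseudoinverse; a large index flags a near collinearity involving that column. The sketch below uses that formulation on made-up data and is an illustration, not code from the paper.

```python
import numpy as np

# Illustrative collinearity indices: kappa_j = ||x_j|| * ||row_j(X^+)||,
# where X^+ is the Moore-Penrose pseudoinverse. Columns taking part in a
# near collinearity get large indices; well-conditioned columns stay near 1.

rng = np.random.default_rng(1)
n = 50
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x3 = x1 + x2 + 1e-3 * rng.standard_normal(n)   # nearly collinear with x1, x2
x4 = rng.standard_normal(n)                    # unrelated, well-conditioned
X = np.column_stack([x1, x2, x3, x4])

X_pinv = np.linalg.pinv(X)                     # shape (4, n)
kappa = np.linalg.norm(X, axis=0) * np.linalg.norm(X_pinv, axis=1)

print(kappa)  # first three indices are large; the fourth stays near 1
```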
Asymptotics of least trimmed squares regression
2004
"... Link to publication Citation for published version (APA): Cizek, P. (2004). Asymptotics of Least Trimmed Squares Regression. (CentER Discussion Paper; Vol. 200472). Tilburg: Econometrics. General rights Copyright and moral rights for the publications made accessible in the public portal are retaine ..."
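Least trimmed squares fits the regression by minimizing the sum of the h smallest squared residuals, which makes it robust to gross outliers. A standard approximate strategy is a sequence of concentration ("C-") steps: repeatedly refit OLS on the h best-fitting observations. This sketch on synthetic contaminated data illustrates the estimator, not the paper's asymptotic analysis.

```python
import numpy as np

# Least trimmed squares (LTS) via C-steps: refit OLS on the h observations
# with the smallest squared residuals until the fit stabilizes.
# Data, contamination, and the trimming fraction h are made up.

rng = np.random.default_rng(2)
n = 100
x = rng.standard_normal(n)
y = 2.0 * x + 0.1 * rng.standard_normal(n)
y[:10] += 20.0                         # gross outliers in 10% of the data

X = np.column_stack([np.ones(n), x])   # intercept + slope design
h = int(0.75 * n)                      # keep the 75% best-fitting points

beta = np.linalg.lstsq(X, y, rcond=None)[0]    # start from ordinary OLS
for _ in range(20):                    # C-steps
    r2 = (y - X @ beta) ** 2
    keep = np.argsort(r2)[:h]          # h smallest squared residuals
    beta = np.linalg.lstsq(X[keep], y[keep], rcond=None)[0]

print(beta)  # close to the uncontaminated model [0, 2]
```

Ordinary OLS on the same data would be pulled toward the outliers; the C-steps discard them after the first refit.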
Scrambled Objects for Least-Squares Regression
"... We consider leastsquares regression using a randomly generated subspace GP ⊂ F of finite dimension P, where F is a function space of infinite dimension, e.g. L2([0, 1] d). GP is defined as the span of P random features that are linear combinations of the basis functions of F weighted by random Gaus ..."
Cited by 2 (1 self)
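The construction in the snippet can be sketched by truncating F to a finite basis: each of the P random features is a Gaussian-weighted combination of the basis functions, and least squares is run on those features. The cosine basis, the truncation level D, and all sizes below are assumptions for illustration only.

```python
import numpy as np

# Least-squares regression on a random P-dimensional subspace G_P of a
# function space on [0, 1], truncated here to D cosine basis functions.
# Random Gaussian weights define the features spanning G_P.

rng = np.random.default_rng(3)
K, D, P = 200, 64, 12                 # samples, truncated basis size, subspace dim

x = rng.uniform(0.0, 1.0, K)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(K)

d = np.arange(D)
B = np.cos(np.pi * np.outer(x, d))    # basis functions e_d(x) = cos(pi*d*x), (K, D)
A = rng.standard_normal((D, P)) / np.sqrt(D)   # Gaussian weights defining G_P
Phi = B @ A                           # random features, shape (K, P)

alpha, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # LS fit in G_P
resid = np.mean((Phi @ alpha - y) ** 2)
print(Phi.shape, alpha.shape)
```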
Data-driven calibration of penalties for least-squares regression
2009
"... Penalization procedures often suffer from their dependence on multiplying factors, whose optimal values are either unknown or hard to estimate from data. We propose a completely datadriven calibration algorithm for these parameters in the leastsquares regression framework, without assuming a parti ..."
Cited by 56 (13 self)
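One well-known calibration strategy of this flavor is a slope-heuristic-style rule: the slope of the empirical risk against model dimension, estimated on the largest models (where only noise is being fit), calibrates the penalty multiplier. The nested models, data, and the specific rule pen(d) = 2·|slope|·d below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Slope-heuristic-style penalty calibration for model selection over
# nested least-squares models (first d columns of X). Entirely synthetic.

rng = np.random.default_rng(5)
n, D = 200, 40
X = rng.standard_normal((n, D))
beta = np.zeros(D)
beta[:5] = [3, -2, 1.5, 1, -1]                 # true model uses 5 coordinates
y = X @ beta + rng.standard_normal(n)

dims = np.arange(1, D + 1)
risk = np.empty(D)
for d in dims:                                 # empirical risk of each model
    b, *_ = np.linalg.lstsq(X[:, :d], y, rcond=None)
    risk[d - 1] = np.mean((y - X[:, :d] @ b) ** 2)

big = dims >= D // 2                           # large models: pure noise fitting
slope, _ = np.polyfit(dims[big], risk[big], 1) # risk decreases ~linearly there
pen = 2.0 * abs(slope) * dims                  # "twice the minimal penalty" rule
d_hat = dims[np.argmin(risk + pen)]
print(d_hat)
```

The calibrated penalty keeps the selected dimension near the true one without any prior knowledge of the noise level.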
Kernel partial least squares regression in reproducing kernel Hilbert space
Journal of Machine Learning Research, 2001
"... A family of regularized least squares regression models in a Reproducing Kernel Hilbert Space is extended by the kernel partial least squares (PLS) regression model. Similar to principal components regression (PCR), PLS is a method based on the projection of input (explanatory) variables to the late ..."
Cited by 154 (10 self)
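A minimal kernel-PLS sketch in the spirit of the abstract, for a single response: score vectors are extracted NIPALS-style from a centered kernel matrix, the kernel and response are deflated after each component, and the response is regressed on the orthonormal scores. The RBF kernel, the centering, and the number of components are assumptions for illustration.

```python
import numpy as np

# Kernel PLS sketch: extract ncomp orthonormal score vectors from a
# double-centered RBF kernel matrix and regress y on them.

rng = np.random.default_rng(4)
n, p, ncomp = 60, 3, 4

X = rng.standard_normal((n, p))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
K = np.exp(-0.5 * sq)                 # RBF kernel matrix
J = np.eye(n) - np.ones((n, n)) / n
Kc = J @ K @ J                        # double-centered kernel
yc = y - y.mean()

T = []
Kd, Yd = Kc.copy(), yc.copy()
for _ in range(ncomp):
    t = Kd @ Yd                       # score direction from current kernel
    t /= np.linalg.norm(t)
    T.append(t)
    P = np.eye(n) - np.outer(t, t)    # deflate kernel and response by t
    Kd = P @ Kd @ P
    Yd = Yd - t * (t @ Yd)
T = np.column_stack(T)                # orthonormal scores, shape (n, ncomp)

y_hat = y.mean() + T @ (T.T @ yc)     # regression on the scores
mse = np.mean((y_hat - y) ** 2)
print(T.shape)
```

As in linear PLS, each component is extracted from the response-weighted kernel rather than from the kernel's leading eigenvectors, which is what distinguishes it from kernel PCR.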