Results 1 - 10 of 240
Low Rank Kernel Learning
"... The kernel update presented in Jain et al. [1] for solving problem (30) in the main paper, min ..."
Learning low-rank kernel matrices
- In ICML, 2006
- Cited by 49 (8 self)
"... Kernel learning plays an important role in many machine learning tasks. However, algorithms for learning a kernel matrix often scale poorly, with running times that are cubic in the number of data points. In this paper, we propose efficient algorithms for learning low-rank kernel matrices; our algori ..."
"... matrix. Special cases of our framework yield faster algorithms for various existing kernel learning problems. Experimental results demonstrate the effectiveness of our algorithms in learning both low-rank and full-rank kernels. ..."
Low-Rank Kernel Learning with Bregman Matrix Divergences
- Cited by 47 (2 self)
"... In this paper, we study low-rank matrix nearness problems, with a focus on learning low-rank positive semidefinite (kernel) matrices for machine learning applications. We propose efficient algorithms that scale linearly in the number of data points and quadratically in the rank of the input matrix. E ..."
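The scaling claimed in the snippet above (linear in the number of data points, quadratic in the rank) comes from never forming the full n x n kernel. A minimal sketch, not the paper's exact Bregman update: a rank-one kernel update applied directly to a factor G with K = G G^T. The sizes, the constraint pair (i, j), and the step size beta are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(0)
n, r = 1000, 10                        # assumed sizes
G = rng.standard_normal((n, r))        # factored kernel: K = G @ G.T is PSD

def factored_rank_one_update(G, i, j, beta):
    # Apply K <- K + beta * (K a)(K a)^T with a = e_i - e_j, working only on
    # the factor G: cost O(n r^2); the full n x n matrix K is never formed.
    v = G[i] - G[j]                    # equals G.T @ a, shape (r,)
    M = np.eye(len(v)) + beta * np.outer(v, v)
    L = np.linalg.cholesky(M)          # valid whenever beta > -1 / (v @ v)
    return G @ L                       # (G L)(G L)^T = G M G^T = updated K

G_new = factored_rank_one_update(G, i=3, j=17, beta=0.1)
print(G_new[3] @ G_new[17])            # updated kernel entry K[3, 17]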
Efficient SVM training using low-rank kernel representations
- Journal of Machine Learning Research, 2001
- Cited by 240 (3 self)
"... SVM training is a convex optimization problem which scales with the training set size rather than the feature space dimension. While this is usually considered to be a desired quality, in large scale problems it may cause training to be impractical. The common techniques to handle this difficulty basically build a solution by solving a sequence of small scale subproblems. Our current effort is concentrated on the rank of the kernel matrix as a source for further enhancement of the training procedure. We first show that for a low rank kernel matrix it is possible to design a better interior point ..."
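A minimal sketch of the equivalence that makes low-rank kernel structure useful for SVM training: if K = G G^T with G of shape (n, r), a kernel SVM on K coincides with a linear-kernel SVM on the rows of G. The sizes and the use of scikit-learn's SVC are assumptions for illustration; this is not the paper's interior-point solver.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n, r = 1000, 15
G = rng.standard_normal((n, r))                        # low-rank factor of K
y = (G[:, 0] + 0.1 * rng.standard_normal(n) > 0).astype(int)

svm_lowrank = SVC(C=1.0, kernel="linear").fit(G, y)    # trains on n x r features
K = G @ G.T                                            # full kernel, for comparison only
svm_full = SVC(C=1.0, kernel="precomputed").fit(K, y)

# Both solve the same dual problem; the factored route never forms K.
diff = svm_lowrank.decision_function(G) - svm_full.decision_function(K)
print("max decision-function difference:", np.abs(diff).max())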
Sharp analysis of low-rank kernel matrix approximations
- JMLR: Workshop and Conference Proceedings Vol. 30 (2013) 1-25
- Cited by 13 (1 self)
"... We consider supervised learning problems within the positive-definite kernel framework, such as kernel ridge regression, kernel logistic regression or the support vector machine. With kernels leading to infinite-dimensional feature spaces, a common practical limiting difficulty is the necessity of computing the kernel matrix, which most frequently leads to algorithms with running time at least quadratic in the number of observations n, i.e., O(n^2). Low-rank approximations of the kernel matrix are often considered as they allow the reduction of running time complexities to O(p^2 n), where p ..."
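A minimal sketch of the kind of low-rank approximation the abstract refers to: a Nystrom approximation of a Gaussian kernel followed by ridge regression in the induced p-dimensional feature space, so training costs roughly O(p^2 n) rather than O(n^2) or worse. The kernel choice, landmark count p, and regularization lam are assumptions for illustration.

import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
n, d, p, lam = 3000, 5, 50, 1e-2                # n points, p landmarks (assumed values)
X = rng.standard_normal((n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(n)

idx = rng.choice(n, size=p, replace=False)      # landmark (column) subset
C = gaussian_kernel(X, X[idx])                  # n x p block of K
W = C[idx]                                      # p x p landmark block
# Feature map Phi with Phi @ Phi.T ~= C @ pinv(W) @ C.T (the Nystrom approximation)
U, s, _ = np.linalg.svd(W)
Phi = C @ U / np.sqrt(np.maximum(s, 1e-12))     # n x p approximate features

# Ridge regression on the explicit features: O(n p^2) instead of O(n^3).
A = Phi.T @ Phi + lam * np.eye(p)
w = np.linalg.solve(A, Phi.T @ y)
y_hat = Phi @ w
print("train RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))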
Low-Rank Kernel Learning for Electricity Market Inference
"... Recognizing the importance of smart grid data analytics, modern statistical learning tools are applied here to wholesale electricity market inference. Market clearing congestion patterns are uniquely modeled as rank-one components in the matrix of spatiotemporally correlated prices. Upon postulating a low-rank matrix factorization, kernels across pricing nodes and hours are systematically selected via a novel methodology. To process the high-dimensional market data involved, a block-coordinate descent algorithm is developed by generalizing block-sparse vector recovery results to the matrix case ..."
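A generic sketch of block-coordinate descent for a low-rank matrix factorization, alternating closed-form ridge solves over the two factor blocks. The problem sizes and the plain Frobenius objective are assumptions; this is not the paper's kernel-based market estimator.

import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_hours, r, lam = 200, 168, 3, 1e-1    # assumed problem sizes
P = rng.standard_normal((n_nodes, r)) @ rng.standard_normal((r, n_hours))
P += 0.05 * rng.standard_normal(P.shape)        # noisy low-rank "price" matrix

A = rng.standard_normal((n_nodes, r))
B = rng.standard_normal((n_hours, r))
for it in range(50):
    # Block 1: fix B, solve min_A ||P - A B^T||_F^2 + lam ||A||_F^2 in closed form
    A = P @ B @ np.linalg.inv(B.T @ B + lam * np.eye(r))
    # Block 2: fix A, solve min_B ||P - A B^T||_F^2 + lam ||B||_F^2 in closed form
    B = P.T @ A @ np.linalg.inv(A.T @ A + lam * np.eye(r))

print("relative fit error:", np.linalg.norm(P - A @ B.T) / np.linalg.norm(P))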
Stochastic Low-Rank Kernel Learning for Regression
- Cited by 1 (0 self)
"... We present a novel approach to learn a kernel-based regression function. It is based on the use of conical combinations of data-based parameterized kernels and on a new stochastic convex optimization procedure of which we establish convergence guarantees. The overall learning procedure has the nice p ..."
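A simplified sketch of learning a conical (nonnegative) combination by projected stochastic gradient descent. Here the combination is over a few fixed base kernel ridge predictors with assumed bandwidths, which keeps the problem convex in the weights; it is an illustration of the ingredients, not the paper's full procedure.

import numpy as np

rng = np.random.default_rng(0)
n = 500
X = rng.standard_normal((n, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] + 0.1 * rng.standard_normal(n)

def gauss_k(X, Z, gamma):
    return np.exp(-gamma * ((X[:, None] - Z[None]) ** 2).sum(-1))

# Base predictors: kernel ridge regressors with different bandwidths (assumed choices).
gammas, lam = [0.1, 1.0, 10.0], 1e-1
alphas = [np.linalg.solve(gauss_k(X, X, g) + lam * np.eye(n), y) for g in gammas]
F = np.stack([gauss_k(X, X, g) @ a for g, a in zip(gammas, alphas)], axis=1)  # n x M predictions

# Projected SGD on the convex objective mean_i (y_i - F[i] @ mu)^2 subject to mu >= 0.
mu = np.full(len(gammas), 1.0 / len(gammas))
for t in range(5000):
    i = rng.integers(n)
    grad = 2.0 * (F[i] @ mu - y[i]) * F[i]
    mu = np.maximum(mu - 0.01 / np.sqrt(t + 1) * grad, 0.0)  # gradient step, then clip
print("learned conical weights:", mu)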
Robust Low Rank Kernel Embeddings of Multivariate Distributions
"... Kernel embedding of distributions has led to many recent advances in machine learning. However, latent and low rank structures prevalent in real world distributions have rarely been taken into account in this setting. Furthermore, no prior work in kernel embedding literature has addressed the issue ..."