Results 1 - 10 of 1,717
Sparse regression, 2013
"... Low-rank approximation a b s t r a c t Advances of modern science and engineering lead to unprecedented amount of data for information processing. Of particular interest is the semi-supervised learning, where very few training samples are available among large volumes of unlabeled data. Graph-based ..."
An important novelty is that our formulation can be transformed into a standard LASSO regression. On one hand, this makes it possible to employ advanced sparse solvers to handle large-scale problems; on the other hand, a globally optimal subset of basis can be chosen adaptively given desired strength
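Since the snippet says the formulation reduces to a standard LASSO regression, here is a minimal sketch of handing such a problem to an off-the-shelf sparse solver. The data, dimensions, and penalty level are placeholders of my own, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Synthetic design matrix and a sparse ground-truth coefficient vector
# (placeholders standing in for whatever basis the paper constructs).
n_samples, n_features = 100, 500
X = rng.standard_normal((n_samples, n_features))
beta_true = np.zeros(n_features)
beta_true[:5] = rng.standard_normal(5)          # only 5 active basis elements
y = X @ beta_true + 0.1 * rng.standard_normal(n_samples)

# Standard LASSO: min_b 0.5/n * ||y - X b||^2 + alpha * ||b||_1.
# alpha plays the role of the "desired strength" of the sparsity penalty.
lasso = Lasso(alpha=0.05)
lasso.fit(X, y)

selected = np.flatnonzero(lasso.coef_)
print("selected basis indices:", selected)
```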
Blockwise sparse regression - Statistica Sinica
"... proposed the grouped LASSO, which achieves shrink-age and selection simultaneously, as LASSO does, but works on blocks of covariates. That is, the grouped LASSO provides a model where some blocks of regression co-ecients are exactly zero. The grouped LASSO is useful when there are meaningful blocks ..."
Cited by 64 (2 self)
of covariates such as polynomial regression and dummy variables from categorical variables. In this paper, we propose an extension of the grouped LASSO, called 'Blockwise Sparse Regression' (BSR). The BSR achieves shrinkage and selection simultaneously on blocks of covariates similarly to the grouped
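The blockwise/grouped penalty zeroes out whole blocks of coefficients at once. Below is a minimal numpy sketch of proximal gradient descent with a block soft-thresholding step, assuming a generic group-LASSO-style objective and synthetic data; it is my illustration, not the paper's algorithm or code.

```python
import numpy as np

def block_soft_threshold(v, t):
    """Proximal operator of t * ||v||_2: zero out the whole block if its norm is below t."""
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

def blockwise_sparse_regression(X, y, blocks, lam=1.0, n_iter=1000):
    """Proximal gradient for 0.5 * ||y - X b||^2 + lam * sum_g ||b_g||_2."""
    beta = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1 / Lipschitz constant of the smooth part
    for _ in range(n_iter):
        z = beta - step * (X.T @ (X @ beta - y))
        for g in blocks:                      # the proximal step separates across blocks
            beta[g] = block_soft_threshold(z[g], step * lam)
    return beta

# Example: three blocks of four covariates each; only the first block is truly active.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 12))
beta_true = np.concatenate([rng.standard_normal(4), np.zeros(8)])
y = X @ beta_true + 0.1 * rng.standard_normal(200)
blocks = [slice(0, 4), slice(4, 8), slice(8, 12)]
print(blockwise_sparse_regression(X, y, blocks, lam=5.0).round(2))
```

The point of the block penalty, as the snippet notes, is that an entire block (e.g. all dummy variables of one categorical covariate) enters or leaves the model together.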
Standard Sparse Regression
"... Standard sparse regression model algorithms: convex relaxation and greedy algorithm sparse recovery analysis: high level view Some extensions (complex regularization) structured sparsity graphical model matrix regularization T. Zhang (Rutgers) Sparsity Models 2 / 28 Modern Sparsity Analysis: Motivat ..."
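The outline contrasts convex relaxation (LASSO-type) with greedy algorithms. As a small illustration of the greedy route, here is orthogonal matching pursuit from scikit-learn on synthetic data; the example and its parameters are mine, not from the slides.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(2)
X = rng.standard_normal((80, 300))
beta = np.zeros(300)
beta[rng.choice(300, size=6, replace=False)] = rng.standard_normal(6)
y = X @ beta + 0.05 * rng.standard_normal(80)

# Greedy sparse recovery: pick one column at a time, refit on the support so far.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=6)
omp.fit(X, y)
print("recovered support:", np.flatnonzero(omp.coef_))
print("true support:     ", np.flatnonzero(beta))
```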
Detection boundary in sparse regression, 2010
"... We study the problem of detection of a p-dimensional sparse vector of parameters in the linear regression model with Gaussian noise. We establish the detection boundary, i.e., the necessary and sufficient conditions for the possibility of successful detection as both the sample size n and the dimens ..."
Cited by 18 (6 self)
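For concreteness, the detection problem described in the snippet can be written roughly as follows; this is my paraphrase of the standard setup, and the paper's exact scaling and notation may differ.

```latex
% Observe y \in \mathbb{R}^n from the linear regression model with Gaussian noise
y = X\theta + \sigma\xi, \qquad \xi \sim \mathcal{N}(0, I_n),
% and test whether the p-dimensional parameter vector is zero or is sparse
% (at most s nonzero coordinates) and separated from zero:
H_0:\ \theta = 0
\quad\text{vs.}\quad
H_1:\ \theta \in \Theta_s(r) = \{\theta \in \mathbb{R}^p : \|\theta\|_0 \le s,\ \|\theta\| \ge r\}.
% The detection boundary characterizes how large r must be, as a function of n, p, s,
% for a test to succeed, and below which no test can.
```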
Portfolio replication with sparse regression, 2008
"... Suppose an investor (such as a hedge fund or fund-of-fund) holds a secret portfolio of assets, which may change over time. Suppose further that the investor makes public the overall returns on this portfolio on a regular basis, as is the case wth hedge funds, funds of hedge funds, and indexes of hed ..."
be invested and wish to hedge the risk of a significant downturn in the fund or fund of funds. As we cannot know its holdings, we are forced to estimate them statistically. Most practical strategies for replication available today make use of a rolling regression against fewer than ten factors. These factors
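As a rough illustration of rolling replication with a sparse fit (a sketch with made-up factor data and parameters, not the paper's replication strategy), one could re-estimate factor exposures of the reported returns over a moving window:

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(3)
T, n_factors, window = 260, 40, 52            # weekly data, one-year window (assumed)
factors = rng.standard_normal((T, n_factors)) # candidate factor returns (synthetic)
weights = np.zeros(n_factors)
weights[:4] = [0.4, 0.3, 0.2, 0.1]            # the "secret" portfolio loads on 4 factors
fund_returns = factors @ weights + 0.01 * rng.standard_normal(T)

# Rolling sparse replication: re-estimate exposures on each trailing window.
for t in range(window, T, 26):
    model = Lasso(alpha=1e-3, fit_intercept=False)
    model.fit(factors[t - window:t], fund_returns[t - window:t])
    active = np.flatnonzero(model.coef_)
    print(f"week {t}: active factors {active}, weights {model.coef_[active].round(2)}")
```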
Collaborative Sparse Regression For Hyperspectral Unmixing - IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2013
"... Sparse unmixing has been recently introduced in hyperspectral imaging as a framework to characterize mixed pixels. It assumes that the observed image signatures can be expressed in the form of linear combinations of a number of pure spectral signatures known in advance (e.g., spectra collected on th ..."
Cited by 11 (4 self)
exploits the usually very low number of endmembers present in real images, out of a very large library. Specifically, we adopt the collaborative (also called “multitask” or “simultaneous”) sparse regression framework that improves the unmixing results by solving a joint sparse regression problem, where
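A common way to write this kind of joint (row-sparse) regression problem is an ℓ2,1-regularized least-squares fit of all pixels against the spectral library at once; the formulation below is my sketch of the general framework, not necessarily the paper's exact objective.

```latex
% Y \in \mathbb{R}^{d \times n}: observed pixel spectra; A \in \mathbb{R}^{d \times m}: spectral library;
% X \in \mathbb{R}^{m \times n}: abundance matrix. The \ell_{2,1} norm sums the norms of the rows of X,
% so whole library members are switched on or off for all pixels jointly.
\min_{X \ge 0}\ \tfrac{1}{2}\|AX - Y\|_F^2 \;+\; \lambda \sum_{k=1}^{m} \|x^{k}\|_2,
\qquad x^{k} = \text{the $k$-th row of } X .
```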
Robust sparse regression in high dimensions
"... Due to the increasing availability of data sets with a large number of variables, sparse model estimation is a topic of high importance in modern data analysis. Sparse regression allows to improve prediction performance by variance reduction and increase interpretability of the resulting models due ..."
Partial Correlation Estimation by Joint Sparse Regression Models - JASA, 2008
"... In this article, we propose a computationally efficient approach—space (Sparse PArtial Correlation Estimation)—for selecting nonzero partial correlations under the high-dimension-low-sample-size setting. This method assumes the overall sparsity of the partial correlation matrix and employs sparse re ..."
Cited by 94 (8 self)
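The link between sparse regression and partial correlations that this kind of approach builds on can be sketched as follows; this is a standard identity written in my own notation, not necessarily the paper's exact parameterization.

```latex
% Let \Omega = \Sigma^{-1} = (\omega_{ij}) be the precision (inverse covariance) matrix.
% Regressing X_i on the remaining variables gives
X_i = \sum_{j \ne i} \beta_{ij} X_j + \varepsilon_i,
\qquad \beta_{ij} = -\frac{\omega_{ij}}{\omega_{ii}},
% and the partial correlation between X_i and X_j given the rest is
\rho^{ij} = -\frac{\omega_{ij}}{\sqrt{\omega_{ii}\,\omega_{jj}}}
          = \beta_{ij}\sqrt{\frac{\omega_{ii}}{\omega_{jj}}}.
% A zero partial correlation therefore corresponds to a zero regression coefficient,
% which is why sparse regression across the p regressions can select them jointly.
```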
Sparse Regression Codes: Recent Results and Future Directions
"... Abstract-Sparse Superposition or Sparse Regression codes were recently introduced by Barron and Joseph for communication over the AWGN channel. The code is defined in terms of a design matrix; codewords are linear combinations of subsets of columns of the matrix. These codes achieve the AWGN channe ..."
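As I understand the construction (a sketch of the usual sectioned structure, not verbatim from this paper), the design matrix is split into sections and each codeword picks exactly one column per section:

```latex
% Design matrix A \in \mathbb{R}^{n \times ML} with L sections of M columns each.
% A message selects one column per section: \beta \in \mathbb{R}^{ML} has exactly one
% nonzero entry in each section, and the transmitted codeword is the linear combination
x = A\beta .
% There are M^L such codewords of block length n, so the rate is
R = \frac{L \log_2 M}{n} \quad \text{bits per channel use.}
```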