CiteSeerX search results

Results 1 - 10 of 113

SIMULTANEOUS ANALYSIS OF LASSO AND DANTZIG SELECTOR

by Peter J. Bickel, Alexandre Tsybakov, et al. - SUBMITTED TO THE ANNALS OF STATISTICS, 2007
"... We exhibit an approximate equivalence between the Lasso estimator and Dantzig selector. For both methods we derive parallel oracle inequalities for the prediction risk in the general nonparametric regression model, as well as bounds on the ℓp estimation loss for 1 ≤ p ≤ 2 in the linear model when th ..."
Cited by 472 (11 self)
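For context when skimming the entries below, the Lasso and the Dantzig selector are usually defined as follows (notation and the exact scaling of the tuning parameter λ vary across the listed papers; this is one common convention):

```latex
% Lasso: l1-penalized least squares
\hat{\beta}^{\mathrm{Lasso}}
  = \arg\min_{\beta \in \mathbb{R}^{p}}
    \frac{1}{2n}\,\lVert y - X\beta \rVert_2^{2}
    + \lambda \lVert \beta \rVert_1

% Dantzig selector: minimal l1 norm under a residual-correlation constraint
\hat{\beta}^{\mathrm{DS}}
  = \arg\min_{\beta \in \mathbb{R}^{p}} \lVert \beta \rVert_1
    \quad \text{subject to} \quad
    \lVert X^{\top}(y - X\beta) \rVert_{\infty} \le \lambda
```

Both rely on the same ℓ1 geometry, which is why several of the papers below can prove parallel oracle inequalities for the two estimators.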

On the LASSO and Dantzig selector equivalence

by M. Salman Asif, Justin Romberg - in 44th Annual Conference on Information Sciences and Systems, 2010
"... Abstract—Recovery of sparse signals from noisy observations is a problem that arises in many information processing contexts. LASSO and the Dantzig selector (DS) are two well-known schemes used to recover high-dimensional sparse signals from linear observations. This paper presents some results on t ..."
Cited by 7 (1 self)
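The Dantzig selector discussed in this and the previous entry is a linear program, so it can be solved with any generic LP solver. A minimal sketch (not taken from the paper; the function name and the toy data are invented for illustration) using `scipy.optimize.linprog` and the standard variable split β = u − v with u, v ≥ 0:

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Solve min ||b||_1 subject to ||X^T (y - X b)||_inf <= lam as an LP.

    b is split as b = u - v with u, v >= 0, so minimizing sum(u + v)
    yields ||b||_1 at the optimum.
    """
    n, m = X.shape
    G = X.T @ X
    Xty = X.T @ y
    c = np.ones(2 * m)  # objective: sum(u) + sum(v)
    # |Xty - G(u - v)| <= lam, written as two blocks of <= constraints:
    #   G(u - v) <= Xty + lam   and   -G(u - v) <= lam - Xty
    A_ub = np.block([[G, -G], [-G, G]])
    b_ub = np.concatenate([Xty + lam, lam - Xty])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    uv = res.x
    return uv[:m] - uv[m:]

# Toy noiseless example: a 2-sparse signal should be recovered accurately.
rng = np.random.default_rng(0)
X = rng.standard_normal((40, 12))
beta_true = np.zeros(12)
beta_true[:2] = [2.0, -1.5]
y = X @ beta_true
beta_hat = dantzig_selector(X, y, lam=0.05)
```

With noisy observations, λ is typically scaled like σ√(2 log m), following the original Dantzig selector analysis.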

Transductive versions of the LASSO and the Dantzig Selector

by Pierre Alquier, Mohamed Hebiri, 2009
"... We consider the linear regression problem, where the number p of covariates is possibly larger than the number n of observations (xi, yi)i≤i≤n, under sparsity assumptions. On the one hand, several methods have been successfully proposed to perform this task, for example the LASSO in [Tib96] or the D ..."
Cited by 1 (0 self)
to the more general task “estimate on the whole domain”. In this work, we propose a generalized version both of the LASSO and the Dantzig Selector, based on the geometrical remarks about the LASSO in [Alq08, AH08]. The “usual” LASSO and Dantzig Selector, as well as new estimators interpreted as transductive

A REMARK ON THE LASSO AND THE DANTZIG SELECTOR

by Yohann De Castro
"... Abstract. This article investigates a new parameter for the high-dimensional regression with noise: the distortion. This latter has attracted a lot of attention recently with the appearance of new deterministic constructions of “almost”-Euclidean sections of the L1-ball. It measures how far the intersection between the kernel of the design matrix and the unit L1-ball is from an L2-ball. We show that the distortion holds enough information to derive oracle inequalities (i.e. a comparison to an ideal situation where one knows the s largest coefficients of the target) for the lasso and the Dantzig selector ..."
Cited by 2 (2 self)

DASSO: Connections between the dantzig selector and lasso

by Gareth M. James, Peter Radchenko, Jinchi Lv, 2009
"... We propose a new algorithm, DASSO, for fitting the entire coefficient path of the Dantzig selector with a similar computational cost to the least angle regression algorithm that is used to compute the lasso. DASSO efficiently constructs a piecewise linear path through a sequential simplex-like algo ..."
Cited by 42 (4 self)
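DASSO is benchmarked against least angle regression (LARS), which computes the lasso's piecewise-linear coefficient path; that path can be inspected with scikit-learn's `lars_path` (the synthetic data here is invented for illustration):

```python
import numpy as np
from sklearn.linear_model import lars_path

# Synthetic 2-sparse regression problem (invented for illustration).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta = np.zeros(10)
beta[[0, 3]] = [3.0, -2.0]
y = X @ beta + 0.1 * rng.standard_normal(100)

# alphas: breakpoints of the regularization path (in decreasing order);
# coefs[j, k]: coefficient of feature j at breakpoint k.
alphas, active, coefs = lars_path(X, y, method="lasso")
```

The path starts at the all-zero coefficient vector for the largest alpha and ends near the unpenalized least-squares fit; DASSO constructs the analogous piecewise-linear path for the Dantzig selector.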

The Double Dantzig

by Gareth M. James, Peter Radchenko, Yi Β
"... The Dantzig selector (Candes and Tao, 2007) is a new approach that has been proposed for performing variable selection and model fitting on linear regression models. It uses an L1 penalty to shrink the regression coefficients towards zero, in a similar fashion to the Lasso. While both the Lasso and ..."

A Multi-Stage Framework for Dantzig Selector and LASSO

by Ji Liu, Peter Wonka, Jieping Ye, Tong Zhang
"... We consider the following sparse signal recovery (or feature selection) problem: given a design matrix X ∈ R^{n×m} (m ≫ n) and a noisy observation vector y ∈ R^n satisfying y = Xβ* + ε where ε is the noise vector following a Gaussian distribution N(0, σ²I), how to recover the signal (or parameter vector) β* when the signal is sparse? The Dantzig selector has been proposed for sparse signal recovery with strong theoretical guarantees. In this paper, we propose a multi-stage Dantzig selector method, which iteratively refines the target signal β*. We show that if X obeys a certain condition ..."
Cited by 3 (2 self)

Stability Analysis of LASSO and Dantzig Selector via Constrained Minimal Singular Value of Gaussian Sensing Matrices

by Oliver James
"... In this paper, we introduce a new framework for interpreting the existing theoretical stability results of sparse signal recovery algorithms in practical terms. Our framework is built on the theory of constrained minimal singular values of Gaussian sensing matrices. Adopting our framework, we study the stability of two algorithms, namely LASSO and Dantzig selector. We demonstrate that for a given stability parameter (noise sensitivity), there exists a minimum undersampling ratio above which the recovery algorithms are guaranteed to be stable. ..."

Sparse recovery with coherent tight frame via analysis Dantzig selector and analysis LASSO

by Junhong Lin, Song Li, 2013
Cited by 1 (0 self)

Abstract not found.

Gene Regulatory Network Reconstruction Using Bayesian Networks, the Dantzig Selector, the Lasso and Their Meta-Analysis

by Matthieu Vignes, Jimmy V, David Allouche, Nidal Ramadan-Alban, Christine Cierco-Ayrolles, Thomas Schiex, Brigitte Mangin, Simon De Givry, 2011
"... Modern technologies and especially next generation sequencing facilities are giving a cheaper access to genotype and genomic data measured on the same sample at once. This creates an ideal situation for multifactorial experiments designed to infer gene regulatory networks. The fifth “Dialogue for R ..."
Cited by 4 (1 self)

We investigated a wide panel of methods ranging from Bayesian networks to penalised linear regressions to analyse such data, and proposed a simple yet very powerful meta-analysis, which combines these inference methods. We present results of the Challenge as well as more in-depth analysis

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University