Results 1–9 of 9
The performance of group lasso for linear regression of grouped variables
, 2011
"... The lasso [19] and group lasso [23] are popular algorithms in the signal processing and statistics communities. In signal processing, these algorithms allow for efficient sparse approximations of arbitrary signals in overcomplete dictionaries. In statistics, they facilitate efficient variable select ..."
Abstract

Cited by 7 (2 self)
 Add to MetaCart
(Show Context)
The lasso [19] and group lasso [23] are popular algorithms in the signal processing and statistics communities. In signal processing, these algorithms allow for efficient sparse approximations of arbitrary signals in overcomplete dictionaries. In statistics, they facilitate efficient variable selection and reliable regression
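The ℓ1/ℓ2 group-lasso objective this entry refers to can be minimized with a simple proximal-gradient (ISTA) loop whose proximal step is block soft-thresholding. Below is a minimal numpy sketch on a toy grouped-regression problem; the function names, step-size choice, and toy data are illustrative, not code from the cited paper:

```python
import numpy as np

def block_soft_threshold(v, t):
    """Proximal operator of t*||v||_2: shrink the whole block toward zero."""
    norm = np.linalg.norm(v)
    return np.zeros_like(v) if norm <= t else (1.0 - t / norm) * v

def group_lasso_ista(X, y, groups, lam, n_iter=500):
    """Minimize 0.5*||y - X b||_2^2 + lam * sum_g ||b_g||_2 by proximal gradient."""
    b = np.zeros(X.shape[1])
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L, L = Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = b - step * (X.T @ (X @ b - y))   # gradient step on the quadratic term
        for g in groups:                     # prox step, one block at a time
            b[g] = block_soft_threshold(z[g], step * lam)
    return b

# toy example: 6 predictors in 3 groups of 2, only group 0 active
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
beta = np.array([1.5, -2.0, 0.0, 0.0, 0.0, 0.0])
y = X @ beta + 0.01 * rng.standard_normal(50)
groups = [np.arange(0, 2), np.arange(2, 4), np.arange(4, 6)]
b_hat = group_lasso_ista(X, y, groups, lam=1.0)
```

The block prox is what distinguishes group lasso from plain lasso: an entire group enters or leaves the model together, which is exactly the grouped variable selection the abstract describes.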
Multiuser Detection in Asynchronous On–Off Random Access Channels Using Lasso
"... Abstract—This paper considers on–off random access channels where users transmit either a one or a zero to a base station. Such channels represent an abstraction of control channels used for scheduling requests in thirdgeneration cellular systems and uplinks in wireless sensor networks deployed for ..."
Abstract

Cited by 4 (0 self)
 Add to MetaCart
(Show Context)
This paper considers on–off random access channels where users transmit either a one or a zero to a base station. Such channels represent an abstraction of control channels used for scheduling requests in third-generation cellular systems and uplinks in wireless sensor networks deployed for target detection. This paper introduces a novel convex-optimization-based scheme for multiuser detection (MUD) in asynchronous on–off random access channels that does not require knowledge of the delays or the instantaneous received signal-to-noise ratios of the individual users at the base station. For any fixed number of temporal signal space dimensions N and maximum delay τ in the system, the proposed scheme can accommodate as many as M = exp(O(N^(1/3))) total users and k = O(N/log M) active users in the system, a significant improvement over the k ≤ M ≤ N scaling suggested by the use of classical matched-filtering-based approaches to MUD employing orthogonal signaling. Furthermore, the computational complexity of the proposed scheme differs from that of a similar oracle-based scheme with perfect knowledge of the user delays by at most a factor of log(N + τ). Finally, the results presented here are non-asymptotic, in contrast to related previous work for synchronous channels that only guarantees that the probability of MUD error at the base station goes to zero asymptotically in M.
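As a rough illustration of convex-optimization-based MUD, the sketch below uses a plain lasso solve on a synchronous simplification of the on–off model (the cited scheme additionally handles unknown delays, which we omit); the codeword normalization, regularization weight, and detection threshold are illustrative choices, not the paper's:

```python
import numpy as np

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=1000):
    """Minimize 0.5*||y - A x||_2^2 + lam*||x||_1 by proximal gradient."""
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    for _ in range(n_iter):
        x = soft_threshold(x - step * (A.T @ (A @ x - y)), step * lam)
    return x

# synchronous toy model: M users with random binary codewords of length N,
# k of them active (each active user "transmits a one")
rng = np.random.default_rng(1)
N, M, k = 64, 256, 4
A = rng.choice([-1.0, 1.0], size=(N, M)) / np.sqrt(N)  # unit-norm codewords
active = rng.choice(M, size=k, replace=False)
x_true = np.zeros(M)
x_true[active] = 1.0
y = A @ x_true + 0.01 * rng.standard_normal(N)         # superposed received signal
x_hat = lasso_ista(A, y, lam=0.05)
detected = set(np.flatnonzero(np.abs(x_hat) > 0.25))   # declare users above threshold active
```

Note that M = 256 users share only N = 64 signal dimensions here, which orthogonal signaling could not accommodate; sparsity of the active set is what makes detection possible.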
Group Model Selection Using Marginal Correlations: The Good, the Bad and the Ugly
"... Abstract — Group model selection is the problem of determining a small subset of groups of predictors (e.g., the expression data of genes) that are responsible for majority of the variation in a response variable (e.g., the malignancy of a tumor). This paper focuses on group model selection in high ..."
Abstract

Cited by 4 (1 self)
 Add to MetaCart
(Show Context)
Group model selection is the problem of determining a small subset of groups of predictors (e.g., the expression data of genes) that are responsible for the majority of the variation in a response variable (e.g., the malignancy of a tumor). This paper focuses on group model selection in high-dimensional linear models, in which the number of predictors far exceeds the number of samples of the response variable. Existing works on high-dimensional group model selection either require the number of samples of the response variable to be significantly larger than the total number of predictors contributing to the response or impose restrictive statistical priors on the predictors and/or nonzero regression coefficients. This paper provides a comprehensive understanding of a low-complexity approach to group model selection that avoids some of these limitations. The proposed approach, termed Group Thresholding (GroTh), is based on thresholding of marginal correlations of groups of predictors with the response variable and is reminiscent of existing thresholding-based approaches in the literature. The most important contribution of the paper in this regard is relating the performance of GroTh to a polynomial-time verifiable property of the predictors for the general case of arbitrary (random or deterministic) predictors and arbitrary nonzero regression coefficients.
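Marginal-correlation-based group screening of the kind GroTh builds on can be sketched in a few lines: score each group g by the correlation norm ‖X_gᵀy‖₂ and keep the top-scoring groups. This is only the scoring-and-selection skeleton under hypothetical names; the paper's actual thresholding rule and its guarantees are not reproduced here:

```python
import numpy as np

def group_marginal_correlations(X, y, groups):
    """||X_g^T y||_2 for each group g of columns of the design matrix X."""
    return np.array([np.linalg.norm(X[:, g].T @ y) for g in groups])

def select_top_groups(X, y, groups, n_active):
    """Keep the n_active groups with the largest marginal correlation."""
    corr = group_marginal_correlations(X, y, groups)
    return np.argsort(corr)[::-1][:n_active]

# toy high-dimensional-style setup: 20 groups of 5 predictors, one active group
rng = np.random.default_rng(2)
n, G, d = 100, 20, 5
X = rng.standard_normal((n, G * d)) / np.sqrt(n)   # roughly unit-norm columns
groups = [np.arange(g * d, (g + 1) * d) for g in range(G)]
beta = np.zeros(G * d)
beta[groups[3]] = 2.0                              # only group 3 drives the response
y = X @ beta + 0.01 * rng.standard_normal(n)
selected = select_top_groups(X, y, groups, n_active=1)
```

The appeal, as the abstract notes, is the low complexity: one matrix-vector product per group, with no optimization solve.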
Asynchronous Code-Division Random Access Using Convex Optimization
"... Many applications in cellular systems and sensor networks involve a random subset of a large number of users asynchronously reporting activity to a base station. This paper examines the problem of multiuser detection (MUD) in random access channels for such applications. Traditional orthogonal signa ..."
Abstract

Cited by 2 (2 self)
 Add to MetaCart
(Show Context)
Many applications in cellular systems and sensor networks involve a random subset of a large number of users asynchronously reporting activity to a base station. This paper examines the problem of multiuser detection (MUD) in random access channels for such applications. Traditional orthogonal signaling ignores the random nature of user activity in this problem and limits the total number of users to be on the order of the number of signal space dimensions. Contention-based schemes, on the other hand, suffer from delays caused by colliding transmissions and the hidden node problem. In contrast, this paper presents a novel pairing of an asynchronous non-orthogonal code-division random access scheme with a convex-optimization-based MUD algorithm that overcomes the issues associated with orthogonal signaling and contention-based methods. Two key distinguishing features of the proposed MUD algorithm are that it does not require knowledge of the delay or channel state information of every user and it has polynomial-time computational complexity. The main analytical contribution of this paper is the relationship between the performance of the proposed MUD algorithm in the presence of arbitrary or random delays and two simple metrics of the set of user codewords. The study of these metrics is then focused on two specific sets of codewords, random binary codewords and specially constructed algebraic codewords, for asynchronous random access. The ensuing analysis confirms that the proposed scheme together with either of these two codeword sets significantly outperforms orthogonal-signaling-based random access in terms of the total number of users in the system.
On Block Coherence of Frames
"... Block coherence of matrices plays an important role in analyzing the performance of block compressed sensing recovery algorithms (Bajwa and Mixon, 2012). In this paper, we characterize two block coherence metrics: worstcase and average block coherence. First, we present lower bounds on worstcase b ..."
Abstract

Cited by 1 (0 self)
 Add to MetaCart
(Show Context)
Block coherence of matrices plays an important role in analyzing the performance of block compressed sensing recovery algorithms (Bajwa and Mixon, 2012). In this paper, we characterize two block coherence metrics: worst-case and average block coherence. First, we present lower bounds on worst-case block coherence, in both the general case and when the matrix is constrained to be a union of orthobases. We then present deterministic matrix constructions based upon Kronecker products that achieve these lower bounds. We also characterize the worst-case block coherence of random subspaces. Finally, we present a flipping algorithm that can improve the average block coherence of a matrix while maintaining the worst-case block coherence of the original matrix. We provide numerical examples which demonstrate that our proposed deterministic matrix construction performs well in block compressed sensing.
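For concreteness, both block coherence metrics can be computed directly from the off-diagonal cross-Gram blocks. The sketch below uses one common convention (spectral norms of the cross-Gram blocks, with the average metric taken as the largest normalized row sum of those blocks), which may differ from the paper's exact normalization, and evaluates it on a small union of two orthobases:

```python
import numpy as np

def block_coherence(A, d):
    """Worst-case and average block coherence of A, whose columns are split
    into consecutive blocks of width d (one common convention; the paper's
    exact normalization may differ)."""
    n, p = A.shape
    blocks = [A[:, i:i + d] for i in range(0, p, d)]
    m = len(blocks)
    worst, avg = 0.0, 0.0
    for i in range(m):
        cross_sum = np.zeros((d, d))
        for j in range(m):
            if i == j:
                continue
            G = blocks[i].T @ blocks[j]                 # cross-Gram block
            worst = max(worst, np.linalg.norm(G, 2))    # spectral norm
            cross_sum += G
        avg = max(avg, np.linalg.norm(cross_sum, 2) / (m - 1))
    return worst, avg

# union of two orthobases in R^4: the identity and a normalized Hadamard basis,
# viewed as four blocks of width 2
H = np.array([[1, 1, 1, 1],
              [1, -1, 1, -1],
              [1, 1, -1, -1],
              [1, -1, -1, 1]]) / 2.0
A = np.hstack([np.eye(4), H])
worst, avg = block_coherence(A, d=2)
```

In this example the average metric is strictly smaller than the worst-case one, which is the gap the paper's flipping algorithm exploits.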
REGRESSION PERFORMANCE OF GROUP LASSO FOR ARBITRARY DESIGN MATRICES
"... In many linear regression problems, explanatory variables are activated in groups or clusters; group lasso has been proposed for regression in such cases. This paper studies the nonasymptotic regression performance of group lasso using ℓ1/ℓ2 regularization for arbitrary (random or deterministic) des ..."
Abstract
 Add to MetaCart
(Show Context)
In many linear regression problems, explanatory variables are activated in groups or clusters; group lasso has been proposed for regression in such cases. This paper studies the non-asymptotic regression performance of group lasso using ℓ1/ℓ2 regularization for arbitrary (random or deterministic) design matrices. In particular, the paper establishes, under a statistical prior on the set of nonzero coefficients, that the ℓ1/ℓ2 group lasso has a near-optimal regression error for all but a vanishingly small set of models. The analysis in the paper relies on three easily computable metrics of the design matrix: coherence, block coherence, and spectral norm. Remarkably, under certain conditions on these metrics, the ℓ1/ℓ2 group lasso can perform near-ideal regression even if the model order scales almost linearly with the number of rows of the design matrix. This is in stark contrast with prior work on the regression performance of the ℓ1/ℓ2 group lasso that only provides linear scaling of the model order for the case of random design matrices.
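The three metrics named in this abstract are indeed cheap to compute. The sketch below uses standard definitions (worst-case coherence over unit-normalized columns, spectral-norm block coherence over consecutive blocks of width d, and the spectral norm of the matrix itself); the paper's exact normalizations may differ:

```python
import numpy as np

def design_metrics(X, d):
    """Coherence, worst-case block coherence (blocks of width d), and
    spectral norm of a design matrix X, under standard definitions."""
    Xn = X / np.linalg.norm(X, axis=0)              # unit-norm columns
    G = Xn.T @ Xn
    coherence = np.max(np.abs(G - np.diag(np.diag(G))))
    p = X.shape[1]
    blocks = [Xn[:, i:i + d] for i in range(0, p, d)]
    block_coh = max(
        np.linalg.norm(blocks[i].T @ blocks[j], 2)
        for i in range(len(blocks))
        for j in range(len(blocks)) if i != j
    )
    return coherence, block_coh, np.linalg.norm(X, 2)

rng = np.random.default_rng(4)
X = rng.standard_normal((30, 8))
coh, bcoh, spec = design_metrics(X, d=2)
```

A quick sanity check on the definitions: with blocks of width 1, worst-case block coherence reduces to ordinary coherence.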
Average Case Analysis of HighDimensional BlockSparse Recovery and Regression for Arbitrary Designs
, 2015
"... This paper studies conditions for highdimensional inference when the set of observations is given by a linear combination of a small number of groups of columns of a design matrix, termed the “blocksparse” case. In this regard, it first specifies conditions on the design matrix under which most of ..."
Abstract
 Add to MetaCart
This paper studies conditions for high-dimensional inference when the set of observations is given by a linear combination of a small number of groups of columns of a design matrix, termed the “block-sparse” case. In this regard, it first specifies conditions on the design matrix under which most of its block submatrices are well conditioned. It then leverages this result for average-case analysis of high-dimensional block-sparse recovery and regression. In contrast to earlier works: (i) this paper provides conditions on arbitrary designs that can be explicitly computed in polynomial time, (ii) the provided conditions translate into near-optimal scaling of the number of observations with the number of active blocks of the design matrix, and (iii) the conditions suggest that the spectral norm, rather than the column/block coherences, of the design matrix fundamentally limits the performance of computational methods in high-dimensional settings.
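The claim that most block submatrices of a design are well conditioned can be probed empirically by sampling random block supports and checking condition numbers. A hypothetical Monte Carlo sketch (function name, condition-number threshold, and trial count are arbitrary illustrative choices, not the paper's analysis):

```python
import numpy as np

def frac_well_conditioned(X, d, k, kappa_max, n_trials=200, seed=0):
    """Estimate the fraction of k-block submatrices of X (blocks of width d)
    whose condition number is at most kappa_max, by random sampling."""
    rng = np.random.default_rng(seed)
    m = X.shape[1] // d
    good = 0
    for _ in range(n_trials):
        S = rng.choice(m, size=k, replace=False)          # random block support
        cols = np.concatenate([np.arange(b * d, (b + 1) * d) for b in S])
        s = np.linalg.svd(X[:, cols], compute_uv=False)   # singular values
        if s[0] / s[-1] <= kappa_max:
            good += 1
    return good / n_trials

# a Gaussian design with roughly unit-norm columns: 100 blocks of width 4
rng = np.random.default_rng(3)
X = rng.standard_normal((200, 400)) / np.sqrt(200)
frac = frac_well_conditioned(X, d=4, k=5, kappa_max=3.0)
```

For a well-behaved design like this one, nearly all sampled block submatrices pass the check, which is the average-case phenomenon the abstract's analysis makes rigorous.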