Results 1–10 of 3,099
Variable Clustering by Covariance Selection, 2002
"... We describe graphical Gaussian models, or covariance selection models, and a C++ class that implements their functionality. Graph theoretic clustering is applied to the resulting conditional independence graph to yield clusters of variables that are strongly associated. We present a method of genera ..."
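The pipeline this entry describes (precision matrix, conditional independence graph, graph-theoretic clustering) can be sketched in NumPy. This is an illustrative stand-in, not the paper's C++ class; the partial-correlation threshold `thresh` is an assumed parameter, and connected components stand in for whatever graph clustering the paper actually uses:

```python
import numpy as np

def variable_clusters(X, thresh=0.2):
    """Cluster variables via a conditional independence graph (sketch).

    Inverts the sample covariance, converts the precision matrix to
    partial correlations, keeps edges with |partial corr| > thresh,
    and returns connected components as variable clusters.
    """
    K = np.linalg.inv(np.cov(X, rowvar=False))   # precision matrix
    d = np.sqrt(np.diag(K))
    partial = -K / np.outer(d, d)                # partial correlations
    p = K.shape[0]
    adj = (np.abs(partial) > thresh) & ~np.eye(p, dtype=bool)
    # connected components by depth-first search
    seen, clusters = set(), []
    for start in range(p):
        if start in seen:
            continue
        stack, comp = [start], []
        while stack:
            v = stack.pop()
            if v in seen:
                continue
            seen.add(v)
            comp.append(v)
            stack.extend(np.flatnonzero(adj[v]))
        clusters.append(sorted(comp))
    return clusters
```

With two variables driven by a shared latent factor and a third independent variable, the first two land in one cluster and the third in its own.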
Bayesian covariance selection. ISDS Discussion Paper, 2004
"... We present a novel structural learning method called HdBCS that performs covariance selection in a Bayesian framework for datasets with tens of thousands of variables. HdBCS is based on the intrinsic connection between graphical models on undirected graphs and graphical models on directed acyclic gr ..."
Cited by 5 (1 self)
Covariance Selection by Thresholding the Sample Correlation Matrix
Cited by 1 (1 self)
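The thresholding idea in this entry fits in a few lines of NumPy. The cutoff `t` is an illustrative constant; the paper's actual thresholding rule is not reproduced here:

```python
import numpy as np

def select_by_thresholding(X, t=0.3):
    """Sparse dependence structure by hard-thresholding (sketch).

    Computes the sample correlation matrix and zeroes every entry
    with |r_ij| <= t, keeping the unit diagonal.
    """
    R = np.corrcoef(X, rowvar=False)
    R_thresh = np.where(np.abs(R) > t, R, 0.0)
    np.fill_diagonal(R_thresh, 1.0)
    return R_thresh
```

On data where variables 0 and 1 share a latent factor and variable 2 is independent, the (0, 1) entry survives and the (0, 2) entry is zeroed.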
Chapter 7. Covariate Selection
"... This chapter addresses strategies for selecting variables for adjustment in nonexperimental comparative effectiveness research (CER), and uses causal graphs to illustrate the causal network relating treatment to outcome. While selection approaches should be based on an understanding of the causal ne ..."
Joint covariate selection for grouped classification
"... We address the problem of recovering a common set of covariates that are relevant simultaneously to several classification problems. We propose a joint measure of complexity for the group of problems that couples covariate selection. By penalizing the sum of ℓ2-norms of the blocks of coefficients as ..."
Cited by 11 (2 self)
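The coupling penalty this entry names, the sum of ℓ2-norms over blocks of coefficients, is simple to write down. This sketch shows only the penalty, not the paper's full training procedure; the row-per-covariate block layout of `W` is an assumption:

```python
import numpy as np

def joint_sparsity_penalty(W):
    """Sum of l2-norms of the rows of a coefficient matrix W
    (rows = covariates, columns = tasks). Penalizing this couples
    selection across tasks: a covariate's whole row is driven to
    zero at once, removing it from every classification problem.
    """
    return np.linalg.norm(W, axis=1).sum()
```

For W = [[3, 4], [0, 0]] the penalty is ||(3, 4)|| + ||(0, 0)|| = 5, so the second covariate contributes nothing and can be dropped from both tasks.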
On Sparse Nonparametric Conditional Covariance Selection
"... We develop a penalized kernel smoothing method for the problem of selecting nonzero elements of the conditional precision matrix, known as conditional covariance selection. This problem has a key role in many modern applications such as finance and computational biology. However, it has not been properly addres ..."
Cited by 5 (2 self)
A Pathwise Algorithm for Covariance Selection, 2009
"... Covariance selection seeks to estimate a covariance matrix by maximum likelihood while restricting the number of nonzero inverse covariance matrix coefficients. A single penalty parameter usually controls the tradeoff between log likelihood and sparsity in the inverse matrix. We describe an efficien ..."
Cited by 2 (0 self)
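The penalty-path idea can be sketched with scikit-learn's `GraphicalLasso` by re-solving the ℓ1-penalized maximum likelihood problem on a grid of penalty values. This is a naive stand-in: the paper's pathwise algorithm follows the regularization path far more efficiently than independent re-solves, and the grid here is an assumption:

```python
import numpy as np
from sklearn.covariance import GraphicalLasso

def precision_path(X, alphas):
    """Sparse precision estimates along a decreasing penalty grid
    (sketch). Larger alpha means a sparser inverse covariance; each
    grid point is solved from scratch, unlike a true pathwise method.
    """
    path = []
    for a in sorted(alphas, reverse=True):
        est = GraphicalLasso(alpha=a).fit(X)
        path.append((a, est.precision_))
    return path
```

As alpha decreases along the path, the number of nonzero off-diagonal entries in the estimated precision matrix can only stay the same or grow, which is the sparsity/likelihood tradeoff the entry describes.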
High dimensional graphs and variable selection with the Lasso. Annals of Statistics, 2006
"... The pattern of zero entries in the inverse covariance matrix of a multivariate normal distribution corresponds to conditional independence restrictions between variables. Covariance selection aims at estimating those structural zeros from data. We show that neighborhood selection with the Lasso is a ..."
Cited by 736 (22 self)
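Neighborhood selection as this entry describes it can be sketched with scikit-learn's `Lasso`: regress each variable on all the others, and let nonzero coefficients propose edges of the conditional independence graph. The penalty `alpha` and the "AND" rule for combining the two directions of each edge are illustrative choices, not the paper's tuning:

```python
import numpy as np
from sklearn.linear_model import Lasso

def neighborhood_selection(X, alpha=0.1):
    """Estimate the zero pattern of the inverse covariance (sketch).

    For each variable j, Lasso-regress column j on the remaining
    columns; nonzero coefficients mark candidate neighbors. An edge
    is kept only if both endpoints select each other ("AND" rule).
    """
    n, p = X.shape
    edges = np.zeros((p, p), dtype=bool)
    for j in range(p):
        others = [k for k in range(p) if k != j]
        coef = Lasso(alpha=alpha).fit(X[:, others], X[:, j]).coef_
        edges[j, others] = coef != 0
    return edges & edges.T  # AND rule
```

On data where variables 0 and 1 share a latent factor and variable 2 is independent, the only recovered edge is (0, 1).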
First-order methods for sparse covariance selection. SIAM Journal on Matrix Analysis and Applications
"... Given a sample covariance matrix, we solve a maximum likelihood problem penalized by the number of nonzero coefficients in the inverse covariance matrix. Our objective is to find a sparse representation of the sample data and to highlight conditional independence relationships between the ..."
Cited by 104 (2 self)
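The objective this entry describes, Gaussian log-likelihood penalized by the count of nonzero inverse covariance coefficients, can be written out directly. This only evaluates the objective (first-order solvers typically relax the ℓ0 count to an ℓ1 norm); the penalty weight `rho` is an illustrative parameter:

```python
import numpy as np

def penalized_loglik(K, S, rho):
    """Sparse covariance selection objective (sketch):
    log det K - tr(S K) - rho * #{nonzero off-diagonal entries of K},
    where K is a candidate precision (inverse covariance) matrix and
    S is the sample covariance. K must be positive definite.
    """
    sign, logdet = np.linalg.slogdet(K)
    assert sign > 0, "K must be positive definite"
    nnz = np.count_nonzero(K - np.diag(np.diag(K)))
    return logdet - np.trace(S @ K) - rho * nnz
```

For K = S = I (2x2), the value is 0 - 2 - 0 = -2: the penalty term vanishes because a diagonal precision matrix has no off-diagonal nonzeros.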
Covariate Selection for Semiparametric Hazard Function Regression
"... We study a flexible class of nonproportional hazard function regression models in which the influence of the covariates splits into the sum of a parametric part and a time-dependent nonparametric part. We develop a method of covariate selection for the parametric part by adjusting for the implic ..."
Cited by 8 (1 self)