Results 1 - 10 of 974
Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
, 2001
"... Variable selection is fundamental to high-dimensional statistical modeling, including nonparametric regression. Many approaches in use are stepwise selection procedures, which can be computationally expensive and ignore stochastic errors in the variable selection process. In this article, penalized ..."
Cited by 948 (62 self)
of the proposed penalized likelihood estimators are established. Furthermore, with proper choice of regularization parameters, we show that the proposed estimators perform as well as the oracle procedure in variable selection; namely, they work as well as if the correct submodel were known. Our simulation shows
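The SCAD penalty behind the entry above has a simple closed form. The sketch below follows the piecewise expression in Fan and Li (2001), including their suggested default a = 3.7; the function name is hypothetical, not from the paper:

```python
import numpy as np

def scad_penalty(theta, lam, a=3.7):
    """Closed form of the SCAD penalty (Fan & Li, 2001).

    Piecewise: linear near zero (lasso-like), quadratic blending in the
    middle, and constant beyond a*lam so large coefficients are not
    shrunk -- the property behind the oracle results.
    """
    t = np.abs(theta)
    linear = lam * t                                    # |theta| <= lam
    quad = -(t**2 - 2 * a * lam * t + lam**2) / (2 * (a - 1))  # lam < |theta| <= a*lam
    const = (a + 1) * lam**2 / 2                        # |theta| > a*lam
    return np.where(t <= lam, linear, np.where(t <= a * lam, quad, const))
```

The three pieces join continuously at |theta| = lam and |theta| = a*lam, which is easy to verify numerically.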
Flexible smoothing with B-splines and penalties
- Statistical Science
, 1996
"... B-splines are attractive for nonparametric modelling, but choosing the optimal number and positions of knots is a complex task. Equidistant knots can be used, but their small and discrete number allows only limited control over smoothness and fit. We propose to use a relatively large number of knots ..."
Cited by 405 (7 self)
of knots and a difference penalty on coefficients of adjacent B-splines. We show connections to the familiar spline penalty on the integral of the squared second derivative. A short overview of B-splines, their construction, and penalized likelihood is presented. We discuss properties of penalized B
Model Selection Through Sparse Maximum Likelihood Estimation for Multivariate Gaussian or Binary Data
- Journal of Machine Learning Research
, 2008
"... We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added ℓ1-norm penalty term. The problem as formulated is convex but the memor ..."
Cited by 334 (2 self)
We consider the problem of estimating the parameters of a Gaussian or binary distribution in such a way that the resulting undirected graphical model is sparse. Our approach is to solve a maximum likelihood problem with an added ℓ1-norm penalty term. The problem as formulated is convex
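The objective described in the entry above — Gaussian log-likelihood plus an ℓ1-norm penalty on the precision matrix — is easy to write down even though optimizing it is the paper's actual contribution. A minimal evaluation sketch (the function name is assumed, not from the paper):

```python
import numpy as np

def penalized_loglik(Theta, S, lam):
    """l1-penalized Gaussian log-likelihood, up to constants:
    log det(Theta) - tr(S @ Theta) - lam * sum(|Theta_ij|),
    where Theta is a candidate precision matrix and S the sample
    covariance. Maximizing this over positive-definite Theta yields
    a sparse undirected graphical model.
    """
    sign, logdet = np.linalg.slogdet(Theta)
    assert sign > 0, "Theta must be positive definite"
    return logdet - np.trace(S @ Theta) - lam * np.abs(Theta).sum()
```

As a sanity check, with S equal to the identity and lam = 0, the unpenalized objective is maximized at Theta = S^{-1} = I.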
Coordinate descent algorithms for lasso penalized regression
- Ann. Appl. Stat
, 2008
"... Imposition of a lasso penalty shrinks parameter estimates toward zero and performs continuous model selection. Lasso penalized regression is capable of handling linear regression problems where the number of predictors far exceeds the number of cases. This paper tests two exceptionally fast algorith ..."
Cited by 109 (3 self)
Imposition of a lasso penalty shrinks parameter estimates toward zero and performs continuous model selection. Lasso penalized regression is capable of handling linear regression problems where the number of predictors far exceeds the number of cases. This paper tests two exceptionally fast
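The coordinate descent approach in the entry above amounts to cyclic soft-thresholding, one coefficient at a time. A minimal sketch for the lasso objective (1/(2n))||y - Xb||^2 + lam*||b||_1, with hypothetical function names:

```python
import numpy as np

def soft_threshold(z, g):
    """Shrink z toward zero by g; the scalar lasso solution."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for
    (1/(2n)) ||y - X b||^2 + lam * ||b||_1.
    Each pass updates one coefficient by soft-thresholding while the
    running residual is kept up to date (the trick that makes the
    algorithm fast when p >> n)."""
    n, p = X.shape
    b = np.zeros(p)
    r = y - X @ b                      # residual, maintained incrementally
    col_sq = (X**2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # remove coefficient j's contribution
            rho = X[:, j] @ r / n      # partial correlation with residual
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]        # restore residual
    return b
```

When the columns of X are orthogonal the algorithm converges in one pass, which gives a convenient correctness check.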
Selecting the Number of Knots For Penalized Splines
, 2000
"... Penalized splines, or P-splines, are regression splines fit by least-squares with a roughness penaly. P-splines have much in common with smoothing splines, but the type of penalty used with a P-spline is somewhat more general than for a smoothing spline. Also, the number and location of the knots ..."
Cited by 105 (10 self)
Penalized splines, or P-splines, are regression splines fit by least-squares with a roughness penalty. P-splines have much in common with smoothing splines, but the type of penalty used with a P-spline is somewhat more general than for a smoothing spline. Also, the number and location
Hierarchical penalization
- In Advances in Neural Information Processing Systems 20
, 2007
"... Hierarchical penalization is a generic framework for incorporating prior information in the fitting of statistical models, when the explicative variables are organized in a hierarchical structure. The penalizer is a convex functional that performs soft selection at the group level, and shrinks varia ..."
Cited by 15 (3 self)
Hierarchical penalization is a generic framework for incorporating prior information in the fitting of statistical models, when the explicative variables are organized in a hierarchical structure. The penalizer is a convex functional that performs soft selection at the group level, and shrinks
Spatial resolution properties of penalized-likelihood image reconstruction methods: Space-invariant tomographs
- IEEE Trans. Image Processing
, 1996
"... ABSTRACT This paper examines the spatial resolution properties of penalized-likelihood image reconstruction methods by analyzing the local impulse response. The analysis shows that standard regularization penalties induce space-variant local impulse response functions, even for space-invariant tomo ..."
Cited by 148 (71 self)
to a modified regularization penalty that yields reconstructed images with nearly uniform resolution. The modified penalty also provides a very practical method for choosing the regularization parameter to obtain a specified resolution in images reconstructed by penalized-likelihood methods.
On Methods of Sieves and Penalization
, 1997
"... We develop a general theory which provides a unified treatment for the asymptotic normality and efficiency of the maximum likelihood estimates (MLE’s) in parametric, semiparametric and nonparametric models. We find that the asymptotic behavior of substitution estimates for estimating smooth function ..."
Cited by 61 (1 self)
functionals are essentially governed by two indices: the degree of smoothness of the functional and the local size of the underlying parameter space. We show that when the local size of the parameter space is not very large, the substitution standard (nonsieve), substitution sieve and substitution penalized