Bayesian Smoothing and Regression Splines for Measurement Error Problems
 Journal of the American Statistical Association
, 2001
Abstract

Cited by 41 (7 self)
In the presence of covariate measurement error, estimating a regression function nonparametrically is extremely difficult, the problem being related to deconvolution. Various frequentist approaches exist for this problem, but to date there has been no Bayesian treatment. In this paper we describe Bayesian approaches to modeling a flexible regression function when the predictor variable is measured with error. The regression function is modeled with smoothing splines and regression P-splines. Two methods are described for exploration of the posterior. The first is called iterative conditional modes (ICM) and is only partially Bayesian. ICM uses a componentwise maximization routine to find the mode of the posterior. It also serves to create starting values for the second method, which is fully Bayesian and uses Markov chain Monte Carlo techniques to generate observations from the joint posterior distribution. Using the MCMC approach has the advantage that interval estimates that directly model and adjust for the measurement error are easily calculated. We provide simulations with several nonlinear regression functions and provide an illustrative example. Our simulations indicate that the frequentist mean squared error properties of the fully Bayesian method are better than those of ICM and also of previously proposed frequentist methods, at least in the examples we have studied. KEY WORDS: Bayesian methods; Efficiency; Errors in variables; Functional method; Generalized linear models; Kernel regression; Measurement error; Nonparametric regression; P-splines; Regression splines; SIMEX; Smoothing splines; Structural modeling. Short title: Nonparametric Regression with Measurement Error. Author Affiliations: Scott M. Berry (Email: scott@berryconsultants.com) is Statistical Scientist,...
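Among the frequentist competitors listed in the keywords is SIMEX (simulation extrapolation). As a rough illustration of the measurement-error problem the paper addresses — not of its Bayesian method — the following sketch (my own, linear regression for simplicity) shows how error in a covariate attenuates a regression slope and how SIMEX partially corrects it by adding extra noise and extrapolating back:

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta, sig_u = 2000, 2.0, 0.8

x = rng.normal(size=n)                 # true covariate (unobserved)
w = x + sig_u * rng.normal(size=n)     # error-prone observation of x
y = 1.0 + beta * x + 0.3 * rng.normal(size=n)

def slope(w, y):
    """OLS slope of y on w."""
    return np.cov(w, y)[0, 1] / np.var(w)

naive = slope(w, y)  # attenuated toward zero by the measurement error

# SIMEX: add extra noise at levels zeta, refit, average over replicates,
# then extrapolate the slope-vs-zeta curve back to zeta = -1 (no error)
zetas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
means = []
for z in zetas:
    fits = [slope(w + np.sqrt(z) * sig_u * rng.normal(size=n), y)
            for _ in range(100)]
    means.append(np.mean(fits))

coef = np.polyfit(zetas, means, 2)     # quadratic extrapolant
simex = np.polyval(coef, -1.0)
```

The naive slope is roughly beta / (1 + sig_u^2), so with these numbers it sits near 1.2; the extrapolated SIMEX estimate moves it back toward the true value of 2.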
Smoothing with Mixed Model Software
 Journal of Statistical Software
, 2004
Abstract

Cited by 36 (2 self)
Smoothing methods that use basis functions with penalization can be formulated as fits in a mixed model framework. One of the major benefits is that software for mixed model analysis can be used for smoothing. We illustrate this for several smoothing models such as additive and varying coefficient models for both S-PLUS and SAS software. Code for each of the illustrations is available on the Internet.
Nonparametric Regression In The Presence Of Measurement Error
 Biometrika
, 1999
Abstract

Cited by 35 (5 self)
This paper develops the two ideas of approximately consistent estimation and regression spline estimation in the presence of measurement error.
Adaptive Bayesian Regression Splines in Semiparametric Generalized Linear Models
 Journal of Computational and Graphical Statistics
, 1998
Abstract

Cited by 32 (2 self)
This paper presents a fully Bayesian approach to regression splines with automatic knot selection in generalized semiparametric models for fundamentally non-Gaussian responses. In a basis function representation of the regression spline we use a B-spline basis. The reversible jump Markov chain Monte Carlo method allows for simultaneous estimation both of the number of knots and the knot placement, together with the unknown basis coefficients determining the shape of the spline. Since the spline can be represented as design matrix times unknown (basis) coefficients, it is straightforward to additionally include a vector of covariates with fixed effects, yielding a semiparametric model. The method is illustrated with data sets from the literature for curve estimation in generalized linear models, the Tokyo rainfall data and the coal mining disaster data, and by a credit-scoring problem for generalized semiparametric models. Keywords: B-spline basis; knot selection; non-normal response...
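The remark that the spline is just "design matrix times coefficients", so fixed-effect covariates can simply be appended, is easy to make concrete. A minimal non-Bayesian sketch (my own; the knot grid is fixed here, whereas the paper selects knots by reversible jump MCMC) using linear B-spline "hat" functions:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 300
t = np.sort(rng.uniform(0, 1, n))          # covariate entering nonparametrically
z = rng.normal(size=n)                     # covariate entering linearly
y = np.sin(2 * np.pi * t) + 0.5 * z + 0.1 * rng.normal(size=n)

# linear B-spline ("hat") basis on a fixed knot grid
knots = np.linspace(0, 1, 12)
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(t[:, None] - knots[None, :]) / h)

# semiparametric design: spline columns plus the fixed-effect column
X = np.column_stack([B, z])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
gamma_hat = coef[-1]                       # estimate of the z effect (true 0.5)
```

The hat columns already sum to one, so no separate intercept is needed; the last coefficient recovers the parametric effect of `z` while the basis columns absorb the smooth trend in `t`.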
COBS: Qualitatively Constrained Smoothing via Linear Programming
, 1999
Abstract

Cited by 31 (5 self)
this paper, we attempt to bring the problem of constrained spline smoothing to the foreground and describe the details of a constrained B-spline smoothing (COBS) algorithm that is being made available to S-PLUS users. Recent work of He & Shi (1998) considered a special case and showed that the L1 projection of a smooth function into the space of B-splines provides a monotone smoother that is flexible, efficient and achieves the optimal rate of convergence. Several options and generalizations are included in COBS: it can handle small or large data sets either with user interaction or full automation. Three examples are provided to show how COBS works in a variety of real-world applications.
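This is not the COBS implementation, but the linear-programming structure it describes can be sketched directly (my own version, via `scipy.optimize.linprog`): an L1 fit in a spline basis with monotonicity imposed as linear constraints. With linear hat splines, the fitted value at knot j is coefficient j, so "nondecreasing coefficients" is exactly the monotonicity constraint:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(3)
n, k = 80, 10
x = np.sort(rng.uniform(0, 1, n))
y = np.log1p(5 * x) + 0.15 * rng.normal(size=n)   # increasing signal + noise

# linear B-spline ("hat") basis: fitted value at knot j equals coefficient j
knots = np.linspace(0, 1, k)
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(x[:, None] - knots[None, :]) / h)

# L1 fit as an LP:  min sum(ep + em)
#   s.t.  B c + ep - em = y   (split residuals into positive/negative parts)
#         c nondecreasing      (monotone smoother)
# variables: c (k, free), ep (n, >= 0), em (n, >= 0)
obj = np.concatenate([np.zeros(k), np.ones(n), np.ones(n)])
A_eq = np.hstack([B, np.eye(n), -np.eye(n)])
D = np.diff(np.eye(k), axis=0)                    # (D c)_j = c_{j+1} - c_j
A_ub = np.hstack([-D, np.zeros((k - 1, 2 * n))])  # -Dc <= 0  <=>  Dc >= 0
bounds = [(None, None)] * k + [(0, None)] * (2 * n)

res = linprog(obj, A_ub=A_ub, b_ub=np.zeros(k - 1),
              A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
c_hat = res.x[:k]                                 # fitted values at the knots
```

Because the basis is piecewise linear, nondecreasing knot values give a fit that is monotone everywhere, and the L1 objective makes the smoother resistant to outliers, in the spirit of He & Shi's projection.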
On the asymptotics of penalized splines
, 2007
Abstract

Cited by 31 (3 self)
The asymptotic behaviour of penalized spline estimators is studied in the univariate case. We use B-splines and a penalty is placed on mth-order differences of the coefficients. The number of knots is assumed to converge to infinity as the sample size increases. We show that penalized splines behave similarly to Nadaraya-Watson kernel estimators with 'equivalent' kernels depending upon m. The equivalent kernels we obtain for penalized splines are the same as those found by Silverman for smoothing splines. The asymptotic distribution of the penalized spline estimator is Gaussian and we give simple expressions for the asymptotic mean and variance. Provided that it is fast enough, the rate at which the number of knots converges to infinity does not affect the asymptotic distribution. The optimal rate of convergence of the penalty parameter is given. Penalized splines are not design-adaptive.
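The mth-order difference penalty studied here is the standard Eilers-Marx P-spline device. A small numpy sketch (my own, with a linear hat basis rather than the paper's general B-splines) shows the estimator, including the feature its asymptotics hinge on: as the penalty parameter grows, the fit shrinks onto the penalty's null space, a polynomial of degree m - 1:

```python
import numpy as np

rng = np.random.default_rng(4)
n, k, m = 150, 25, 2
x = np.sort(rng.uniform(0, 1, n))
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=n)

knots = np.linspace(0, 1, k)
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(x[:, None] - knots[None, :]) / h)
D = np.diff(np.eye(k), n=m, axis=0)      # m-th order difference operator

def pspline_fit(lam):
    """Penalized spline: minimize ||y - Bc||^2 + lam * ||D c||^2."""
    c = np.linalg.solve(B.T @ B + lam * D.T @ D, B.T @ y)
    return B @ c

smooth = pspline_fit(1e-2)               # light smoothing, tracks the sine
flat = pspline_fit(1e9)                  # lam -> inf: collapses to a line (m = 2)

# the huge-lambda fit is (numerically) linear in x
resid_line = flat - np.polyval(np.polyfit(x, flat, 1), x)
```

Tuning `lam` trades off the two terms; the paper's result is that, for a suitable rate of `lam` and enough knots, the resulting estimator behaves like a kernel smoother whose equivalent kernel depends on m.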
Testing for Predator Dependence in PredatorPrey Dynamics: A NonParametric Approach
 Proceedings of the Royal Society of London B
, 2000
Abstract

Cited by 27 (1 self)
this paper we will call these time series by the number of the figure from which they were retrieved, i.e. 11a, 12a, 14c, 2b and 2c. B. G. Veilleux (personal communication) was not able to provide the original data of these experiments. (a) Sampling errors
Joint probabilistic curve clustering and alignment
 In Advances in Neural Information Processing Systems 17
, 2005
Abstract

Cited by 26 (0 self)
Clustering and prediction of sets of curves is an important problem in many areas of science and engineering. It is often the case that curves tend to be misaligned from each other in a continuous manner, either in space (across the measurements) or in time. We develop a probabilistic framework that allows for joint clustering and continuous alignment of sets of curves in curve space (as opposed to a fixed-dimensional feature-vector space). The proposed methodology integrates new probabilistic alignment models with model-based curve clustering algorithms. The probabilistic approach allows for the derivation of consistent EM learning algorithms for the joint clustering-alignment problem. Experimental results are shown for alignment of human growth data, and joint clustering and alignment of gene expression time-course data.
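The paper's EM machinery is more than a few lines, but the alignment subproblem at its core — score each curve against a template under a shift and keep the best — can be sketched minimally (my own illustration, using discretised grid-search shifts rather than the paper's continuous probabilistic model):

```python
import numpy as np

rng = np.random.default_rng(5)
t = np.linspace(0, 1, 100)
template = np.exp(-((t - 0.5) ** 2) / 0.01)       # reference curve shape

# simulated curves: the template shifted in time plus noise
true_shifts = [-10, 0, 15]                        # in grid steps
curves = [np.roll(template, s) + 0.03 * rng.normal(size=t.size)
          for s in true_shifts]

def best_shift(curve, ref, max_shift=25):
    """Grid search: shift minimising squared distance to the reference."""
    shifts = range(-max_shift, max_shift + 1)
    errs = [np.sum((np.roll(curve, -s) - ref) ** 2) for s in shifts]
    return list(shifts)[int(np.argmin(errs))]

est = [best_shift(c, template) for c in curves]
```

In the paper the shift is a latent variable integrated over inside EM, jointly with cluster membership, rather than a hard argmin as here.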
A Note on Penalized Spline Smoothing with Correlated Errors
 mimeo
, 2007
Abstract

Cited by 25 (5 self)
This note investigates the behavior of data-driven smoothing parameters for penalized spline regression in the presence of correlated data. It has been shown for other smoothing methods that mean squared error minimizers, such as (generalized) cross-validation or the Akaike criterion, are extremely sensitive to misspecification of the correlation structure, over- or underfitting the data. In contrast, we show that a maximum likelihood based choice of the smoothing parameter is more robust, and that for a moderately misspecified correlation structure over- or underfitting does not occur. This is demonstrated in simulations and data examples, and supported by theoretical investigations.
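The mean-squared-error criterion the note refers to can be written down directly. Here is a minimal generalized cross-validation computation for a penalized-spline smoother (my own sketch, with i.i.d. errors — so it illustrates the criterion itself, not the correlated-error failure mode the note studies):

```python
import numpy as np

rng = np.random.default_rng(6)
n, k = 150, 20
x = np.sort(rng.uniform(0, 1, n))
f = np.sin(2 * np.pi * x)
y = f + 0.1 * rng.normal(size=n)

knots = np.linspace(0, 1, k)
h = knots[1] - knots[0]
B = np.maximum(0.0, 1.0 - np.abs(x[:, None] - knots[None, :]) / h)
D = np.diff(np.eye(k), n=2, axis=0)      # second-order difference penalty

def gcv(lam):
    """GCV(lam) = n * RSS / (n - tr(S))^2 for the linear smoother S."""
    S = B @ np.linalg.solve(B.T @ B + lam * D.T @ D, B.T)
    resid = y - S @ y
    return n * (resid @ resid) / (n - np.trace(S)) ** 2

grid = 10.0 ** np.arange(-6, 4.0)
lam_hat = grid[int(np.argmin([gcv(g) for g in grid]))]
fit = B @ np.linalg.solve(B.T @ B + lam_hat * D.T @ D, B.T @ y)
rmse = np.sqrt(np.mean((fit - f) ** 2))
```

With correlated errors the RSS term in GCV is systematically misled, which is the sensitivity the note documents; its proposal is to estimate the smoothing parameter by maximum likelihood in a mixed-model formulation instead.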
On semiparametric regression with O’Sullivan penalized splines
 Australian and New Zealand Journal of Statistics
, 2008
Abstract

Cited by 24 (7 self)
An exposition on the use of O'Sullivan penalized splines in contemporary semiparametric regression, including mixed model and Bayesian formulations, is presented. O'Sullivan penalized splines are similar to P-splines, but have the advantage of being a direct generalization of smoothing splines. Exact expressions for the O'Sullivan penalty matrix are obtained. Comparisons between the two types of splines reveal that O'Sullivan penalized splines more closely mimic the natural boundary behaviour of smoothing splines. Implementation in modern computing environments such as MATLAB, R and BUGS is discussed.