Results 1–10 of 10,458
Smooth minimization of nonsmooth functions
 Math. Programming
, 2005
"... In this paper we propose a new approach for constructing efficient schemes for nonsmooth convex optimization. It is based on a special smoothing technique, which can be applied to functions with explicit max-structure. Our approach can be considered as an alternative to black-box minimization. ..."
Cited by 523 (1 self)
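The max-structure smoothing idea in this entry can be illustrated on the simplest nonsmooth example, |x| = max over |u| ≤ 1 of u·x: subtracting a strongly convex term μu²/2 inside the max yields the Huber function. A minimal sketch under that assumption (the function name and the choice of example are illustrative, not from the paper):

```python
import numpy as np

def huber(x, mu):
    """Smooth approximation of |x| via max-structure smoothing:
    max_{|u|<=1} (u*x - mu*u**2/2) evaluates to x**2/(2*mu) for |x| <= mu
    and |x| - mu/2 otherwise (the classical Huber function)."""
    x = np.asarray(x, dtype=float)
    return np.where(np.abs(x) <= mu, x**2 / (2 * mu), np.abs(x) - mu / 2)
```

The approximation error is at most μ/2 everywhere, and the smoothed function has a Lipschitz-continuous gradient with constant 1/μ, which is the trade-off the smoothing technique exploits.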
Adapting to unknown smoothness via wavelet shrinkage
 JOURNAL OF THE AMERICAN STATISTICAL ASSOCIATION
, 1995
"... We attempt to recover a function of unknown smoothness from noisy, sampled data. We introduce a procedure, SureShrink, which suppresses noise by thresholding the empirical wavelet coefficients. The thresholding is adaptive: a threshold level is assigned to each dyadic resolution level by the principle of minimizing the Stein Unbiased Estimate of Risk (SURE) for threshold estimates. The computational effort of the overall procedure is of order N log(N) as a function of the sample size N. SureShrink is smoothness-adaptive: if the unknown function contains jumps, the reconstruction (essentially) does ..."
Cited by 1006 (18 self)
High Accuracy Optical Flow Estimation Based on a Theory for Warping
, 2004
"... We study an energy functional for computing optical flow that combines three assumptions: a brightness constancy assumption, a gradient constancy assumption, and a discontinuity-preserving spatio-temporal smoothness constraint. ..."
Cited by 509 (45 self)
Bayesian density estimation and inference using mixtures.
 J. Amer. Statist. Assoc.
, 1995
"... mixtures of normal distributions. Efficient simulation methods are used to approximate various prior, posterior, and predictive distributions. This allows for direct inference on a variety of practical issues, including problems of local versus global smoothing, uncertainty about density estimates ..."
Cited by 653 (18 self)
Efficient Nonparametric Smoothness Estimation
"... Sobolev quantities (norms, inner products, and distances) of probability density functions are important in the theory of nonparametric statistics, but have rarely been used in practice, due to a lack of practical estimators. They also include, as special cases, L2 quantities which are us ..."
Smooth Estimates of Normal Mixtures
"... Posterior distributions for mixture models often have multiple modes, particularly near the boundaries of the parameter space where the component variances are small. This multimodality results in predictive densities that are extremely rough. The authors propose an adjustment of the standard Normal-Inverse Gamma prior structure that directly controls the ratio of the largest component variance to the smallest component variance. The prior adjustment smooths out modes near the boundary of the parameter space, producing more reasonable estimates of the predictive density. ..."
SMOOTHED ESTIMATOR OF THE PERIODIC HAZARD FUNCTION
"... A smoothed estimator of the periodic hazard function is considered and its asymptotic probability distribution and bootstrap simultaneous confidence intervals are derived. Moreover, consistency of the bootstrap method is proved and some applications of the developed theory are presented. ..."
De-Noising by Soft-Thresholding
, 1992
"... Donoho and Johnstone (1992a) proposed a method for reconstructing an unknown function f on [0, 1] from noisy data d_i = f(t_i) + z_i, i = 0, …, n − 1, t_i = i/n, z_i iid N(0, 1). The reconstruction f̂_n is defined in the wavelet domain by translating all the empirical wavelet coefficients of d towards 0 by an amount √(2 log n)/√n. We prove two results about that estimator. [Smooth]: With high probability f̂_n is at least as smooth as f, in any of a wide variety of smoothness measures. [Adapt]: The estimator comes nearly as close in mean square to f as any measurable estimator can come, uniformly over ..."
Cited by 1279 (14 self)
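The shrinkage rule this entry describes, translating each coefficient towards 0 and clipping at 0, is soft thresholding. A minimal sketch applied directly to a coefficient vector with the threshold √(2 log n)/√n from the snippet (assuming unit noise scale; in practice the data are first mapped to the wavelet domain, e.g. with a wavelet library, and the inverse transform is applied afterwards):

```python
import numpy as np

def soft_threshold(w, t):
    """Translate each coefficient towards 0 by t; anything smaller
    in magnitude than t is set exactly to 0."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

n = 8
t = np.sqrt(2 * np.log(n)) / np.sqrt(n)   # threshold from the abstract, unit noise scale
coeffs = np.array([3.0, -0.2, 0.1, -2.5, 0.05, 0.0, 1.0, -0.3])
denoised = soft_threshold(coeffs, t)      # small coefficients vanish, large ones shrink by t
```

Because every surviving coefficient is shrunk, the reconstruction is never rougher than the truth, which is the [Smooth] property claimed in the abstract.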
Locally weighted learning
 ARTIFICIAL INTELLIGENCE REVIEW
, 1997
"... This paper surveys locally weighted learning, a form of lazy learning and memory-based learning, and focuses on locally weighted linear regression. The survey discusses distance functions, smoothing parameters, weighting functions, local model structures, regularization of the estimates and bias, ass ..."
Cited by 599 (51 self)
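The core computation this survey focuses on, locally weighted linear regression, fits a separate weighted least-squares line at each query point, with the kernel bandwidth acting as the smoothing parameter. A minimal 1-D sketch (the Gaussian kernel choice and all names below are illustrative assumptions, not the survey's notation):

```python
import numpy as np

def lwlr_predict(x0, X, y, tau):
    """Predict y at query x0 by locally weighted linear regression.
    Each training point is weighted by a Gaussian kernel of bandwidth
    tau, then a weighted least-squares line is fit and evaluated at x0."""
    w = np.exp(-((X - x0) ** 2) / (2 * tau ** 2))   # distance-based weights
    A = np.column_stack([np.ones_like(X), X])        # design matrix [1, x]
    W = np.diag(w)
    # Solve the weighted normal equations (A^T W A) beta = A^T W y.
    beta, *_ = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ y, rcond=None)
    return beta[0] + beta[1] * x0
```

Small tau reproduces the nearest points (rough, low bias); large tau approaches an ordinary global regression line (smooth, higher bias), which is the bandwidth trade-off the survey discusses.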