@MISC{Privacy_assumethat,
  author = {Redacted For Privacy and Dr. H. D. Brunk},
  title  = {Assume that Y 1},
  year   = {}
}
Abstract
Assume that Y_1, ..., Y_n are i.i.d. observations from a distribution with a continuous density function g. Let y ∈ (−∞, ∞). A density function estimator which can be written in the form (1/n) Σ_{j=1}^{n} K(y, Y_j) is called a kernel estimator with kernel K. Whittle (1958) proposed selecting a kernel estimator of a density using expected square error as the criterion, and observed that implementation of this approach requires specifying only the first and second moments of the joint distribution of the values of the density function g(·) at the various values of its argument. Hartigan (1969) described Whittle's approach as a "linear Bayes" approach. Brunk (1980) preferred to refer to it as "Bayesian Least Squares" because, as for ordinary least squares, both input and output involve only first and second moments. In this thesis, Brunk's (1980) Bayesian Least Squares method is slightly modified and applied to the estimation of univariate and mixing densities. For the univariate density estimation, we begin with a prescribed prior mean probability density g_0 of g, and let {P_r(y)} be a prescribed sequence of functions orthonormal w.r.t. g_0, with P
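The kernel-estimator form described in the abstract, (1/n) Σ_j K(y, Y_j), can be sketched in a few lines. The abstract does not fix a particular kernel K, so the Gaussian kernel and the bandwidth h below are illustrative assumptions, not choices made in the thesis:

```python
import math

def kernel_estimate(y, samples, h=0.5):
    """Evaluate the kernel density estimate (1/n) * sum_j K(y, Y_j) at y.

    K is taken here to be a Gaussian kernel with bandwidth h; both are
    illustrative assumptions -- the abstract leaves K unspecified.
    """
    n = len(samples)
    return sum(
        math.exp(-0.5 * ((y - yj) / h) ** 2) / (h * math.sqrt(2 * math.pi))
        for yj in samples
    ) / n

# Example: estimate the density at a point from a handful of observations.
observations = [-0.3, 0.1, 0.4, 1.2]
density_at_zero = kernel_estimate(0.0, observations)
```

Each observation Y_j contributes one kernel bump centered at Y_j, and the average of the n bumps is the density estimate; the selection problem Whittle (1958) posed is how to choose K (and, in practice, h) well under expected square error.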