Results 1-10 of 646,566
Probabilistic Visual Learning for Object Representation (1996)
Cited by 705 (15 self)
Abstract: "... We present an unsupervised technique for visual learning which is based on density estimation in high-dimensional spaces using an eigenspace decomposition. Two types of density estimates are derived for modeling the training data: a multivariate Gaussian (for unimodal distributions) and a Mixture-of-Gaussians ..."
Probabilistic Visual Learning for Object Detection (1995)
Cited by 235 (16 self)
Abstract: "... We present an unsupervised technique for visual learning which is based on density estimation in high-dimensional spaces using an eigenspace decomposition. Two types of density estimates are derived for modeling the training data: a multivariate Gaussian (for a unimodal distribution) and a multivariate Mixture-of-Gaussians model (for multimodal distributions). These probability densities are then used to formulate a maximum-likelihood estimation framework for visual search and target detection for automatic object recognition. This learning technique is tested in experiments with modeling ..."
Notes on methods based on maximum-likelihood estimation for learning the parameters of the mixture of Gaussians model (1999)
Cited by 4 (1 self)
Abstract: "... In these notes, we present and review different methods based on maximum-likelihood estimation for learning the parameters of the mixture-of-Gaussians model. We describe a method based on the likelihood equations, traditional gradient-based methods (among them steepest ascent and gradient ascent), exp ..."
Mixture Models and the Segmentation of Multimodal Textures
Abstract: "... A problem of using mixture-of-Gaussian models for unsupervised texture segmentation is that "multimodal" textures (such as can often be encountered in natural images) cannot be well represented by a single Gaussian cluster. We propose a divide-and-conquer method that groups together Gaussian clust ..."
Speaker verification using adapted Gaussian mixture models. Digital Signal Processing (2000)
Cited by 976 (42 self)
Abstract: "... In this paper we describe the major elements of MIT Lincoln Laboratory's Gaussian mixture model (GMM)-based speaker verification system used successfully in several NIST Speaker Recognition Evaluations (SREs). The system is built around the likelihood ratio test for verification, using simple but ef ..."
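The likelihood ratio test at the core of GMM-based verification can be sketched in a few lines: score an utterance as its average log-likelihood under the claimed speaker's model minus that under a background model, and accept if the score exceeds a threshold. This is a toy illustration, not the Lincoln Laboratory system; `gmm_loglik` and `verify` are hypothetical names, the diagonal-covariance models here are hand-set stand-ins, and a real system would use a trained universal background model (UBM) and adapted target models.

```python
import numpy as np

def gmm_loglik(X, w, mu, var):
    """Per-frame log-likelihood of rows of X under a diagonal-covariance GMM.

    X: (n, d) frames; w: (k,) weights; mu, var: (k, d) means/variances.
    """
    d = X[:, None, :] - mu[None, :, :]                            # (n, k, d)
    logp = -0.5 * (np.log(2 * np.pi * var) + d**2 / var).sum(-1) + np.log(w)
    m = logp.max(axis=1, keepdims=True)                           # log-sum-exp
    return (m + np.log(np.exp(logp - m).sum(axis=1, keepdims=True))).ravel()

def verify(X, target, ubm, threshold=0.0):
    """Likelihood-ratio test: average target log-likelihood minus UBM's."""
    score = gmm_loglik(X, *target).mean() - gmm_loglik(X, *ubm).mean()
    return score > threshold, score
```

A frame sequence drawn near the target model's mean then scores positively, while one drawn near the background model's mean scores negatively.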
Learning sparse codes with a mixture-of-Gaussians prior. In Advances in Neural Information Processing Systems (2000)
Cited by 53 (2 self)
Abstract: "... We describe a method for learning an overcomplete set of basis functions for the purpose of modeling sparse structure in images. The sparsity of the basis function coefficients is modeled with a mixture-of-Gaussians distribution. One Gaussian captures non-active coefficients with a small-variance ..."
Image denoising using a scale mixture of Gaussians in the wavelet domain. IEEE Trans. Image Processing (2003)
Cited by 514 (17 self)
Abstract: "... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vecto ..."
A gentle tutorial on the EM algorithm and its application to parameter estimation for Gaussian mixture and hidden Markov models (1997)
Cited by 678 (4 self)
Abstract: "... We describe the maximum-likelihood parameter estimation problem and how the Expectation-Maximization (EM) algorithm can be used for its solution. We first describe the abstract form of the EM algorithm as it is often given in the literature. We then develop the EM parameter estimation procedure for two applications: 1) finding the parameters of a mixture of Gaussian densities, and 2) finding the parameters of a hidden Markov model (HMM) (i.e., the Baum-Welch algorithm) for both discrete and Gaussian mixture observation models. We derive the update equations in fairly explicit detail but we do not prove any convergence properties. We try to emphasize intuition rather than mathematical ..."
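The first of the tutorial's two applications, EM for a mixture of Gaussians, can be sketched compactly in one dimension. This is a minimal illustration under simplifying assumptions (quantile-based initialization, fixed iteration count, no convergence check); `em_gmm_1d` is an illustrative name, not from the tutorial.

```python
import numpy as np

def em_gmm_1d(x, k=2, n_iter=100):
    """Fit a k-component 1-D Gaussian mixture to data x by EM."""
    x = np.asarray(x, dtype=float)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means over the data
    var = np.full(k, np.var(x))                    # component variances
    pi = np.full(k, 1.0 / k)                       # mixing weights
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x_i)
        d2 = (x[:, None] - mu) ** 2
        logp = -0.5 * (np.log(2 * np.pi * var) + d2 / var) + np.log(pi)
        logp -= logp.max(axis=1, keepdims=True)    # subtract row max for stability
        r = np.exp(logp)
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate each component from its responsibility-weighted data
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var
```

On data drawn from two well-separated Gaussians, the recovered means and weights land close to the generating parameters; as the tutorial notes, EM guarantees monotone likelihood improvement but not a global optimum.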
Dynamic control of adaptive mixture-of-Gaussians background model. In Proceedings of the IEEE International Conference on Video and Signal Based Surveillance
Cited by 14 (0 self)
Abstract: "... We propose a method for creating a background model in non-stationary scenes. Each pixel has a dynamic Gaussian mixture model. Our approach can automatically change the number of Gaussians at each pixel. The number of Gaussians increases when pixel values change frequently because of illumination changes, ..."
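The per-pixel adaptive mixture-of-Gaussians update that such background models build on can be sketched for a single pixel: each new value either reinforces the Gaussian it matches or replaces the weakest one. This is my own simplified illustration (fixed component count, learning-rate updates, a crude background test), not the dynamic-component-count method of the paper above; `PixelMOG` and its parameters are illustrative.

```python
import numpy as np

class PixelMOG:
    """Toy adaptive mixture-of-Gaussians background model for one pixel."""

    def __init__(self, k=3, lr=0.05, match_sigmas=2.5):
        self.k, self.lr, self.match = k, lr, match_sigmas
        self.w = np.full(k, 1.0 / k)   # component weights
        self.mu = np.zeros(k)          # component means
        self.var = np.full(k, 225.0)   # component variances (init broad)

    def update(self, x):
        """Fold in a new pixel value; return True if it looks like background."""
        d2 = (x - self.mu) ** 2
        hit = d2 < (self.match ** 2) * self.var   # within match_sigmas std devs
        self.w *= (1 - self.lr)                   # decay all weights
        if hit.any():
            j = int(np.argmax(hit / (self.var + 1e-9)))  # tightest matching Gaussian
            self.w[j] += self.lr
            self.mu[j] += self.lr * (x - self.mu[j])
            self.var[j] += self.lr * (d2[j] - self.var[j])
            is_bg = bool(self.w[j] > 1.0 / self.k)       # crude background test
        else:
            j = int(np.argmin(self.w))            # replace the weakest component
            self.w[j], self.mu[j], self.var[j] = self.lr, float(x), 225.0
            is_bg = False
        self.w /= self.w.sum()
        return is_bg
```

Feeding a stable value for many frames makes its Gaussian dominant (background); a sudden jump matches nothing and is flagged as foreground, which is the basic mechanism behind MOG background subtraction.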
A re-evaluation of mixture-of-Gaussian background modeling (2005)
Cited by 11 (0 self)
Abstract: "... Mixture of Gaussians (MOG) has been widely used for robustly modeling complicated backgrounds, especially those with small repetitive movements (such as leaves, bushes, rotating fans, ocean waves, rain). The performance of MOG can be greatly improved by tackling several practical issues. In this pape ..."