Results 1 – 10 of 306,339
Hierarchical mixtures of experts and the EM algorithm, 1993
"... We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM’s). Learning is treated as a maximum likelihood ..."
Cited by 873 (21 self)
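The entry above treats learning as maximum likelihood, fit by EM. A minimal sketch of that idea, assuming a single-level mixture of just two linear experts with input-independent mixing weights and a fixed, known noise scale (all simplifications of the paper's gated hierarchy, not its actual architecture):

```python
import numpy as np

# EM for a mixture of two linear regression "experts" y = w_k * x + noise.
# The mixing weights pi and noise scale sigma are simplifying assumptions;
# the paper's model gates experts on the input via GLIMs instead.
rng = np.random.default_rng(0)

n, sigma = 400, 0.1
x = rng.uniform(-1.0, 1.0, n)
labels = rng.integers(0, 2, n)
true_w = np.array([2.0, -2.0])
y = true_w[labels] * x + sigma * rng.standard_normal(n)

w = np.array([1.0, -1.0])          # asymmetric init to break symmetry
pi = np.array([0.5, 0.5])

for _ in range(50):
    # E-step: responsibility of expert 0 for each point (computed stably)
    logp = np.stack([-(y - wk * x) ** 2 / (2 * sigma**2) for wk in w])
    d = np.clip(np.log(pi[0]) - np.log(pi[1]) + logp[0] - logp[1], -30, 30)
    r0 = 1.0 / (1.0 + np.exp(-d))
    r = np.stack([r0, 1.0 - r0])
    # M-step: weighted least squares per expert, then update mixing weights
    w = (r * x * y).sum(axis=1) / (r * x * x).sum(axis=1)
    pi = r.mean(axis=1)

print(np.sort(w))   # slopes should approach [-2, 2]
```

With the experts well separated, the responsibilities harden within a few iterations and each weighted least-squares fit recovers one regime's slope.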
Image denoising using a scale mixture of Gaussians in the wavelet domain, IEEE Trans. Image Processing, 2003
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vecto ..."
Cited by 509 (17 self)
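The paper's estimator is a Bayes least-squares rule under a Gaussian scale mixture prior on neighborhoods of wavelet coefficients. As a much-simplified stand-in, the sketch below applies per-coefficient Wiener shrinkage with the signal variance estimated from a local window; the synthetic sparse "coefficients", window size, and noise level are all illustrative assumptions, not the BLS-GSM method itself:

```python
import numpy as np

# Local Wiener shrinkage on a synthetic sparse, heavy-tailed coefficient
# sequence: mostly zeros, occasional large values (a crude proxy for
# wavelet coefficients of natural images).
rng = np.random.default_rng(0)

n, noise_sigma = 4096, 0.5
spikes = rng.random(n) < 0.1
clean = np.sign(rng.standard_normal(n)) * rng.exponential(1.0, n) * spikes
noisy = clean + noise_sigma * rng.standard_normal(n)

# Estimate the local second moment with a moving average, subtract the
# known noise power, and form the Wiener gain for each coefficient.
win = 9
kernel = np.ones(win) / win
local_e2 = np.convolve(noisy**2, kernel, mode="same")
signal_var = np.maximum(local_e2 - noise_sigma**2, 0.0)
denoised = noisy * signal_var / (signal_var + noise_sigma**2)

mse = lambda a: np.mean((a - clean) ** 2)
print(mse(noisy), mse(denoised))  # shrinkage should reduce MSE
```

The gain is near zero in noise-only regions and near one over large coefficients, which is the qualitative behavior the GSM prior formalizes.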
Equivariant Adaptive Source Separation, IEEE Trans. on Signal Processing, 1996
"... Source separation consists in recovering a set of independent signals when only mixtures with unknown coefficients are observed. This paper introduces a class of adaptive algorithms for source separation which implements an adaptive version of equivariant estimation and is henceforth called EASI (Eq ..."
Cited by 447 (9 self)
Unsupervised Learning for Source Separation with Mixture of Gaussian Prior for Sources and Gaussian Prior for Mixture Coefficients, 2001
"... In this contribution, we present two new algorithms for unsupervised learning and source separation for the case of noisy instantaneous linear mixture, within the Bayesian inference framework. The source distribution prior is modeled by a mixture of Gaussians [10] and the mixing matrix elements distri ..."
Cited by 12 (10 self)
Bayesian source separation with mixture of Gaussians prior for sources and Gaussian prior for mixture coefficients, 2000
"... In this contribution, we present new algorithms for source separation for the case of noisy instantaneous linear mixture, within the Bayesian statistical framework. The source distribution prior is modeled by a mixture of Gaussians [1] and the mixing matrix elements distributions by a Gaussian [2]. W ..."
This journal is © The Royal Society of Chemistry 2008, pp. 2475–2479
Cited by 339 (7 self)
First published as an Advance Article on the web 3rd January 2001. The electrochemical generation and characterisation of a variety of o-quinodimethanes (o-QDMs) are described, together with the outcome of preparative experiments in which they are key intermediates. The quinodimethanes are conveniently formed, in DMF, by both direct and redox-catalysed electroreduction of 1,2-bis(halomethyl)arenes. Their predominant reaction is polymerisation to poly(o-xylylene) (o-PX) polymers. In the presence of dienophiles the electrogenerated o-QDMs may undergo efficient cycloaddition reactions, and distinctions between the possible mechanisms have been attempted on the basis of voltammetric, preparative and stereochemical experiments. Contrary to the precedent of the corresponding methyl ester, diphenyl maleate radical-anion isomerises only slowly to the fumarate radical-anion, yet co-electrolysis of 2,3-bis(bromomethyl)-1,4-dimethoxybenzene and diphenyl maleate or diphenyl fumarate gives exclusively the corresponding trans-adduct. Co-electrolysis of dimethyl maleate with either 1,2-bis(bromomethyl)benzene (more easily reduced) or 2,3-bis(bromomethyl)-1,4-dimethoxybenzene (less easily reduced) gave only o-PX polymer. The results are rationalised in terms of a double nucleophilic substitution mechanism where electron transfer between dienophile radical-anion and dihalide is relatively slow. Where electron transfer from maleate or fumarate radical-anions is likely to be fast, o-quinodimethanes are formed by redox catalysis and they polymerise rather than undergo Diels–Alder reaction. Dimerisation of the dienophile radical-anions, with k2 = 10^4 to 10^5 M^-1 s^-1, does not apparently compete with nucleophilic substitution or, where relevant, electron transfer.
Image Coding based on Mixture Modeling of Wavelet Coefficients and a Fast Estimation-Quantization Framework, 1997
"... We introduce a new image compression paradigm that combines compression efficiency with speed, and is based on an independent "infinite" mixture model which accurately captures the space-frequency characterization of the wavelet image representation. Specifically, we model image wavelet co ..."
Cited by 166 (12 self)
Scale Mixtures of Gaussians and the Statistics of Natural Images, in Adv. Neural Information Processing Systems, 2000
"... The statistics of photographic images, when represented using multiscale (wavelet) bases, exhibit two striking types of non-Gaussian behavior. First, the marginal densities of the coefficients have extended heavy tails. Second, the joint densities exhibit variance dependencies not captured by secon ..."
Cited by 172 (17 self)
by second-order models. We examine properties of the class of Gaussian scale mixtures, and show that these densities can accurately characterize both the marginal and joint distributions of natural image wavelet coefficients. This class of model suggests a Markov structure, in which wavelet coefficients
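The heavy-tailed marginals this entry describes fall out of the scale-mixture construction itself: a Gaussian whose variance is random is leptokurtic. A small sketch, assuming a lognormal scale variable purely for illustration (the paper does not fix this choice):

```python
import numpy as np

# A Gaussian scale mixture: x = sqrt(z) * u with u ~ N(0, 1) and z a positive
# random scale. Mixing over scales fattens the tails relative to a Gaussian.
rng = np.random.default_rng(0)

n = 200_000
u = rng.standard_normal(n)                       # pure Gaussian draws
z = rng.lognormal(mean=0.0, sigma=1.0, size=n)   # assumed scale distribution
x = np.sqrt(z) * u                               # GSM draws

def excess_kurtosis(a):
    a = a - a.mean()
    return (a**4).mean() / (a**2).mean() ** 2 - 3.0

print(excess_kurtosis(u))   # close to 0 for the Gaussian
print(excess_kurtosis(x))   # substantially positive: heavy tails
```

For this lognormal choice the population kurtosis is 3e (about 8.2), versus 3 for a Gaussian, which mirrors the extended tails observed in wavelet marginals.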
Deep Neural Networks for Acoustic Modeling in Speech Recognition
"... Most current speech recognition systems use hidden Markov models (HMMs) to deal with the temporal variability of speech and Gaussian mixture models to determine how well each state of each HMM fits a frame or a short window of frames of coefficients that represents the acoustic input. An alternative ..."
Cited by 230 (38 self)
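The GMM scoring step this entry refers to amounts to evaluating a frame of acoustic coefficients under each HMM state's Gaussian mixture. A sketch of that single step, with dimensions, mixture weights, and parameters all invented for illustration (diagonal covariances assumed, as is common in such systems):

```python
import numpy as np

def gmm_log_likelihood(frame, weights, means, variances):
    """log sum_k w_k N(frame; mu_k, diag(var_k)) for one HMM state."""
    d = frame.shape[0]
    diff = frame - means                                            # (K, d)
    log_norm = -0.5 * (d * np.log(2 * np.pi) + np.log(variances).sum(axis=1))
    log_comp = log_norm - 0.5 * ((diff**2) / variances).sum(axis=1)
    # log-sum-exp over mixture components for numerical stability
    a = np.log(weights) + log_comp
    m = a.max()
    return m + np.log(np.exp(a - m).sum())

rng = np.random.default_rng(1)
K, d = 4, 13                        # 4 components, 13 cepstral coefficients
weights = np.full(K, 1.0 / K)
means = rng.standard_normal((K, d))
variances = np.ones((K, d))

frame = means[0].copy()             # a frame right at component 0's mean
far_frame = means[0] + 10.0         # a frame far from every component

print(gmm_log_likelihood(frame, weights, means, variances))
print(gmm_log_likelihood(far_frame, weights, means, variances))
```

A recognizer compares such per-state scores across HMM states during decoding; the deep-network alternative the abstract introduces replaces this likelihood with network outputs.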