Results 1–10 of 777,997
Mutual information and minimum mean-square error in Gaussian channels
IEEE Trans. Inform. Theory, 2005
"... This paper deals with arbitrarily distributed finite-power input signals observed through an additive Gaussian noise channel. It shows a new formula that connects the input-output mutual information and the minimum mean-square error (MMSE) achievable by optimal estimation of the input given the out ..."
Cited by 285 (32 self)
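The identity the abstract refers to can be sanity-checked numerically in the one case where both sides have textbook closed forms, a standard Gaussian input on an AWGN channel (a sketch for illustration, not code from the paper):

```python
import numpy as np

# For a standard Gaussian input, the I-MMSE relation dI/dsnr = mmse(snr)/2
# (in nats) has both sides in closed form:
# I(snr) = 0.5*log(1+snr) and mmse(snr) = 1/(1+snr).
def mutual_info(snr):
    return 0.5 * np.log1p(snr)

def mmse(snr):
    return 1.0 / (1.0 + snr)

snr, h = 2.0, 1e-6
# Central-difference approximation of dI/dsnr.
dI = (mutual_info(snr + h) - mutual_info(snr - h)) / (2 * h)
```

Here `dI` matches `mmse(snr) / 2` to within the finite-difference error.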
Mean shift, mode seeking, and clustering
IEEE Transactions on Pattern Analysis and Machine Intelligence, 1995
"... Abstract—Mean shift, a simple iterative procedure that shifts each data point to the average of data points in its neighborhood, is generalized and analyzed in this paper. This generalization makes some k-means like clustering algorithms its special cases. It is shown that mean shift is a mode-seeki ..."
Cited by 620 (0 self)
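The procedure the abstract summarizes can be sketched with a flat kernel in a few lines; the radius and iteration count below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

# Flat-kernel mean shift: each point repeatedly moves to the mean of
# all data points within a window of radius r around it.
def mean_shift(points, r=1.0, iters=50):
    pts = np.asarray(points, dtype=float)
    modes = pts.copy()
    for _ in range(iters):
        for i, m in enumerate(modes):
            near = pts[np.linalg.norm(pts - m, axis=1) <= r]
            modes[i] = near.mean(axis=0)
    return modes

# Two well-separated pairs of points converge to two distinct modes.
data = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
modes = mean_shift(data, r=1.0)
```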
Cascade multiterminal source coding
IEEE International Symposium on Information Theory, 2009
"... We investigate distributed source coding of two correlated sources X and Y where messages are passed to a decoder in a cascade fashion. The encoder of X sends a message at rate R1 to the encoder of Y. The encoder of Y then sends a message to the decoder at rate R2 based both on Y and on the messa ..."
Cited by 22 (6 self)
and on the message it received about X. The decoder's task is to estimate a function of X and Y. For example, we consider the minimum mean squared-error distortion when encoding the sum of jointly Gaussian random variables under these constraints. We also characterize the rates needed to reconstruct a function
A minimum squared-error framework for generalized sampling
IEEE Trans. Sig. Proc., 2006
"... We treat the problem of reconstructing a signal from its non-ideal samples where the sampling and reconstruction spaces as well as the class of input signals can be arbitrary subspaces of a Hilbert space. Our formulation is general, and includes as special cases reconstruction from finitely many sam ..."
Cited by 38 (23 self)
reconstruction. The approaches we propose differ in their assumptions on the input signal: If the signal is known to lie in an appropriately chosen subspace, then we propose a method that achieves the minimal squared-error. On the other hand, when the signal is not restricted, we show that the minimal
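A minimal numerical sketch of the subspace case described above, with sampling and reconstruction spaces spanned by the columns of matrices I call S and W (my notation, not the paper's): when the signal lies in the reconstruction space and S.T @ W is invertible, the least-squares coefficients recover it exactly.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
S = rng.standard_normal((n, 3))     # columns span the sampling space
W = rng.standard_normal((n, 3))     # columns span the reconstruction space
x = W @ np.array([1.0, -2.0, 0.5])  # a signal lying in the reconstruction space
c = S.T @ x                         # non-ideal samples: inner products with S
# Choose coefficients d minimizing the squared error ||c - S.T @ W @ d||.
d = np.linalg.lstsq(S.T @ W, c, rcond=None)[0]
x_hat = W @ d                       # reconstructed signal
```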
Quantization Index Modulation: A Class of Provably Good Methods for Digital Watermarking and Information Embedding
IEEE Trans. on Information Theory, 1999
"... We consider the problem of embedding one signal (e.g., a digital watermark) within another "host" signal to form a third, "composite" signal. The embedding is designed to achieve efficient tradeoffs among the three conflicting goals of maximizing information-embedding rate, mini ..."
Cited by 495 (15 self)
(AWGN) channels, which may be good models for hybrid transmission applications such as digital audio broadcasting, and mean-square-error-constrained attack channels that model private-key watermarking applications.
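Scalar dither modulation, one simple member of the QIM class the paper studies, can be sketched as follows; the quantization step size here is an arbitrary illustrative choice:

```python
import numpy as np

# Embed one bit per sample by quantizing the host to one of two
# interleaved lattices spaced step/2 apart; decode by picking the
# lattice whose nearest point is closer to the received sample.
def qim_embed(host, bits, step=1.0):
    offset = np.asarray(bits) * (step / 2)
    return np.round((host - offset) / step) * step + offset

def qim_decode(received, step=1.0):
    received = np.asarray(received)
    d0 = np.abs(received - np.round(received / step) * step)
    shifted = received - step / 2
    d1 = np.abs(shifted - np.round(shifted / step) * step)
    return (d1 < d0).astype(int)

host = np.array([0.3, 1.7, -0.6, 2.2])
bits = np.array([0, 1, 1, 0])
composite = qim_embed(host, bits)
decoded = qim_decode(composite)
```

The step size trades embedding distortion against robustness to channel noise, which is the tradeoff the abstract describes.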
Representing twentieth century space-time climate variability, part 1: development of a 1961–90 mean monthly terrestrial climatology
Journal of Climate, 1999
"... The construction of a 0.5° lat × 0.5° long surface climatology of global land areas, excluding Antarctica, is described. The climatology represents the period 1961–90 and comprises a suite of nine variables: precipitation, wet-day frequency, mean temperature, diurnal temperature range, vapor pressur ..."
Cited by 551 (12 self)
Image denoising using a scale mixture of Gaussians in the wavelet domain
IEEE Trans. Image Processing, 2003
"... We describe a method for removing noise from digital images, based on a statistical model of the coefficients of an overcomplete multiscale oriented basis. Neighborhoods of coefficients at adjacent positions and scales are modeled as the product of two independent random variables: a Gaussian vecto ..."
Cited by 514 (17 self)
published methods, both visually and in terms of mean squared error.
Constrained K-means Clustering with Background Knowledge
In ICML, 2001
"... Clustering is traditionally viewed as an unsupervised method for data analysis. However, in some cases information about the problem domain is available in addition to the data instances themselves. In this paper, we demonstrate how the popular k-means clustering algorithm can be profitably modified ..."
Cited by 473 (9 self)
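The constraint-respecting assignment step can be sketched in the spirit of the paper's modified k-means; the data, centroids, and constraint lists below are illustrative, and this is not the authors' code:

```python
import numpy as np

def violates(i, c, labels, must_link, cannot_link):
    # Cluster c violates a must-link pair if the partner point is already
    # in a different cluster, and a cannot-link pair if it is already in c.
    for a, b in must_link:
        if i in (a, b) and labels[b if i == a else a] not in (-1, c):
            return True
    for a, b in cannot_link:
        if i in (a, b) and labels[b if i == a else a] == c:
            return True
    return False

def constrained_assign(X, centroids, must_link, cannot_link):
    labels = np.full(len(X), -1)
    for i, x in enumerate(X):
        # Try centroids from nearest to farthest; take the first legal one.
        for c in np.argsort(np.linalg.norm(centroids - x, axis=1)):
            if not violates(i, int(c), labels, must_link, cannot_link):
                labels[i] = int(c)
                break
    return labels

X = np.array([[0.0], [0.2], [5.0], [5.2]])
centroids = np.array([[0.1], [5.1]])
# A cannot-link constraint forces points 0 and 1 apart despite proximity.
labels = constrained_assign(X, centroids, must_link=[], cannot_link=[(0, 1)])
```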
An Empirical Comparison of Voting Classification Algorithms: Bagging, Boosting, and Variants
Machine Learning, 1999
"... Methods for voting classification algorithms, such as Bagging and AdaBoost, have been shown to be very successful in improving the accuracy of certain classifiers for artificial and real-world datasets. We review these algorithms and describe a large empirical study comparing several variants in co ..."
Cited by 695 (2 self)
the increase in the average tree size in AdaBoost trials and its success in reducing the error. We compare the mean-squared error of voting methods to non-voting methods and show that the voting methods lead to large and significant reductions in the mean-squared errors. Practical problems
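The variance-reduction effect behind the reported MSE reductions can be illustrated with a toy simulation, averaging B independent unbiased predictors rather than reproducing the paper's actual experiments:

```python
import numpy as np

rng = np.random.default_rng(0)
truth = 3.0
B, trials = 25, 2000
# One noisy unbiased predictor vs. the average ("vote") of B of them;
# averaging divides the variance term of the MSE by roughly B.
single = truth + rng.standard_normal(trials)
voted = truth + rng.standard_normal((trials, B)).mean(axis=1)
mse_single = np.mean((single - truth) ** 2)
mse_voted = np.mean((voted - truth) ** 2)
```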
Algorithms for Nonnegative Matrix Factorization
In NIPS, 2001
"... Nonnegative matrix factorization (NMF) has previously been shown to be a useful decomposition for multivariate data. Two different multiplicative algorithms for NMF are analyzed. They differ only slightly in the multiplicative factor used in the update rules. One algorithm can be shown to minim ..."
Cited by 1230 (5 self)
to minimize the conventional least squares error while the other minimizes the generalized Kullback-Leibler divergence. The monotonic convergence of both algorithms can be proven using an auxiliary function analogous to that used for proving convergence of the Expectation-Maximization algorithm
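The least-squares variant of the two multiplicative update rules can be sketched directly in NumPy; the small epsilon in the denominators is my numerical safeguard, not part of the original rules:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((8, 6))        # nonnegative data matrix
r = 3                         # factorization rank (my choice)
W = rng.random((8, r)) + 0.1  # strictly positive initial factors
H = rng.random((r, 6)) + 0.1
errs = []
for _ in range(200):
    # Lee-Seung multiplicative updates for ||V - WH||^2; elementwise
    # multiplication preserves nonnegativity at every step.
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)
    errs.append(float(np.linalg.norm(V - W @ H) ** 2))
```

Tracking `errs` exhibits the monotone non-increase of the squared error that the abstract says can be proven with an auxiliary function.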