Quantization
IEEE Trans. Inform. Theory, 1998
Abstract

Cited by 884 (12 self)
The history of the theory and practice of quantization dates to 1948, although similar ideas had appeared in the literature as long ago as 1898. The fundamental role of quantization in modulation and analog-to-digital conversion was first recognized during the early development of pulse-code modulation systems, especially in the 1948 paper of Oliver, Pierce, and Shannon. Also in 1948, Bennett published the first high-resolution analysis of quantization and an exact analysis of quantization noise for Gaussian processes, and Shannon published the beginnings of rate-distortion theory, which would provide a theory for quantization as analog-to-digital conversion and as data compression. Beginning with these three papers of fifty years ago, we trace the history of quantization from its origins through this decade, and we survey the fundamentals of the theory and many of the popular and promising techniques for quantization.
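Bennett's high-resolution result mentioned in this abstract (the noise of a fine uniform quantizer behaving like Delta^2/12) can be illustrated with a minimal sketch; the function, the 8-bit setting, and the uniform test input are illustrative choices, not taken from the survey:

```python
import numpy as np

def uniform_quantize(x, n_bits, x_min=-1.0, x_max=1.0):
    """Round each sample to the midpoint of one of 2**n_bits uniform cells."""
    levels = 2 ** n_bits
    step = (x_max - x_min) / levels          # cell width, usually written Delta
    idx = np.clip(np.floor((x - x_min) / step), 0, levels - 1)
    return x_min + (idx + 0.5) * step

# High-resolution theory predicts a quantization-noise power of Delta**2 / 12.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100_000)
noise = uniform_quantize(x, 8) - x
```

With 8 bits on [-1, 1] the step is 2/256, and the measured noise power of the sketch lands close to the predicted step**2 / 12.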
GTM: The generative topographic mapping
Neural Computation, 1998
Abstract

Cited by 361 (6 self)
Latent variable models represent the probability density of data in a space of several dimensions in terms of a smaller number of latent, or hidden, variables. A familiar example is factor analysis, which is based on a linear transformation between the latent space and the data space. In this paper we introduce a form of nonlinear latent variable model called the Generative Topographic Mapping, for which the parameters of the model can be determined using the EM algorithm. GTM provides a principled alternative to the widely used Self-Organizing Map (SOM) of Kohonen (1982), and overcomes most of the significant limitations of the SOM. We demonstrate the performance of the GTM algorithm on a toy problem and on simulated data from flow diagnostics for a multiphase oil pipeline. Copyright © MIT Press (1998).
Deterministic Annealing for Clustering, Compression, Classification, Regression, and Related Optimization Problems
Proceedings of the IEEE, 1998
Abstract

Cited by 321 (20 self)
this paper. Let us place it within the neural network perspective, and particularly that of learning. The area of neural networks has greatly benefited from its unique position at the crossroads of several diverse scientific and engineering disciplines, including statistics and probability theory, physics, biology, control and signal processing, information theory, complexity theory, and psychology (see [45]). Neural networks have provided a fertile soil for the infusion (and occasionally confusion) of ideas, as well as a meeting ground for comparing viewpoints, sharing tools, and renovating approaches. It is within the ill-defined boundaries of the field of neural networks that researchers in traditionally distant fields have come to the realization that they have been attacking fundamentally similar optimization problems.
Unsupervised Texture Segmentation in a Deterministic Annealing Framework
1998
Abstract

Cited by 104 (9 self)
We present a novel optimization framework for unsupervised texture segmentation that relies on statistical tests as a measure of homogeneity. Texture segmentation is formulated as a data clustering problem based on sparse proximity data. Dissimilarities of pairs of textured regions are computed from a multiscale Gabor filter image representation. We discuss and compare a class of clustering objective functions which is systematically derived from invariance principles. As a general optimization framework we propose deterministic annealing based on a mean-field approximation. The canonical way to derive clustering algorithms within this framework, as well as an efficient implementation of mean-field annealing and the closely related Gibbs sampler, are presented. We apply both annealing variants to Brodatz-like microtexture mixtures and real-world images.
Data clustering using a model granular magnet
Neural Computation, 1997
Abstract

Cited by 72 (4 self)
We present a new approach to clustering, based on the physical properties of an inhomogeneous ferromagnet. No assumption is made regarding the underlying distribution of the data. We assign a Potts spin to each data point and introduce an interaction between neighboring points, whose strength is a decreasing function of the distance between the neighbors. This magnetic system exhibits three phases. At very low temperatures, it is completely ordered; all spins are aligned. At very high temperatures, the system does not exhibit any ordering, and in an intermediate regime, clusters of relatively strongly coupled spins become ordered, whereas different clusters remain uncorrelated. This intermediate phase is identified by a jump in the order parameters. The spin-spin correlation function is used to partition the spins, and the corresponding data points, into clusters. We demonstrate on three synthetic and three real data sets how the method works. Detailed comparison to the performance of other techniques clearly indicates the relative success of our method.
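The coupling rule this abstract describes (interaction strength decreasing with distance, aligned spins lowering the energy) can be sketched as follows; the Gaussian form of J_ij, the length scale a, and the Potts energy sign convention are common modeling choices assumed here, not quoted from the paper:

```python
import numpy as np

def potts_couplings(points, a):
    """Pairwise interaction strengths J_ij = exp(-d_ij**2 / (2 a**2)),
    a decreasing function of the distance between points; a is a local
    length scale (an assumed choice for this sketch)."""
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    J = np.exp(-d2 / (2 * a ** 2))
    np.fill_diagonal(J, 0.0)          # no self-interaction
    return J

def potts_energy(J, labels):
    """H = -sum_{i<j} J_ij * delta(s_i, s_j): equal spins lower the energy."""
    same = labels[:, None] == labels[None, :]
    return -0.5 * np.sum(J * same)
```

On two well-separated point pairs, labeling each pair as one cluster yields a lower energy than splitting the pairs, which is the ordering the intermediate phase exploits.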
Vector Quantization of Image Subbands: A Survey
IEEE Transactions on Image Processing, 1996
Abstract

Cited by 59 (6 self)
Subband and wavelet decompositions are powerful tools in image coding, because of their decorrelating effects on image pixels, the concentration of energy in a few coefficients, their multirate/multiresolution framework, and their frequency splitting, which allows for efficient coding matched to the statistics of each frequency band and to the characteristics of the human visual system. Vector quantization provides a means of converting the decomposed signal into bits in a manner that takes advantage of remaining inter- and intra-band correlation as well as of the more flexible partitions of higher-dimensional vector spaces. Since 1988 a growing body of research has examined the use of vector quantization for subband/wavelet transform coefficients. We present a survey of these methods.
1 Introduction
Image compression maps an original image into a bit stream suitable for communication over or storage in a digital medium. The number of bits required to represent the coded image should b...
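As a minimal illustration of the basic vector-quantization step this survey builds on (not of any specific subband scheme it reviews), a nearest-codeword encoder and decoder might look like this; the function names are illustrative:

```python
import numpy as np

def vq_encode(vectors, codebook):
    """Assign each input vector to its nearest codeword (squared Euclidean)."""
    d2 = np.sum((vectors[:, None, :] - codebook[None, :, :]) ** 2, axis=-1)
    return np.argmin(d2, axis=1)      # one codeword index per input vector

def vq_decode(indices, codebook):
    """Reconstruct each vector as the codeword it was mapped to."""
    return codebook[indices]
```

Only the indices need to be transmitted; with a codebook of size K, each vector costs log2(K) bits regardless of its dimension, which is what makes VQ attractive for subband coefficients.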
Self-Organized Formation of Various Invariant-Feature Filters in the Adaptive-Subspace SOM
Neural Computation, 1997
Abstract

Cited by 43 (0 self)
The Adaptive-Subspace SOM (ASSOM) is a modular neural-network architecture, the modules of which learn to identify input patterns subject to some simple transformations. The learning process is unsupervised, competitive, and related to that of the traditional SOM (Self-Organizing Map). Each neural module becomes adaptively specific to some restricted class of transformations, and modules close to each other in the network become tuned to similar features in an orderly fashion. If different transformations exist in the input signals, different subsets of ASSOM units become tuned to these transformation classes.
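The competitive matching step described here can be sketched roughly as follows; representing each module by an orthonormal basis and scoring inputs by projection energy are standard subspace-method choices assumed for this sketch, and the ASSOM learning rule itself is omitted:

```python
import numpy as np

def assom_winner(x, subspaces):
    """Pick the winning module for input x.  Each module holds an orthonormal
    basis B (rows); its matching score is the energy of x projected onto the
    spanned subspace, ||B @ x||**2.  Inputs transformed within the winning
    subspace produce the same winner, giving the invariance described above."""
    scores = [float(np.sum((B @ x) ** 2)) for B in subspaces]
    return int(np.argmax(scores)), scores
```

An input lying mostly along one module's subspace wins that module, even after being scaled or shifted within the subspace.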
Self-Organizing Maps: Generalizations and New Optimization Techniques
Neurocomputing, 1998
Abstract

Cited by 41 (1 self)
We offer three algorithms for the generation of topographic mappings to the practitioner of unsupervised data analysis. The algorithms are each based on the minimization of a cost function, which is performed using an EM algorithm and deterministic annealing. The soft topographic vector quantization algorithm (STVQ), like the original Self-Organizing Map (SOM), provides a tool for the creation of self-organizing maps of Euclidean data. Its optimization scheme, however, offers an alternative to the heuristic stepwise shrinking of the neighborhood width in the SOM and makes it possible to use a fixed neighborhood function solely to encode desired neighborhood relations between nodes. The kernel-based soft topographic mapping (STMK) is a generalization of STVQ and introduces new distance measures in data space based on kernel functions. Using the new distance measures corresponds to performing the STVQ in a high-dimensional feature space, which is related to data space by a nonlinear ma...
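A rough single-step sketch of the kind of EM update the STVQ abstract describes, under assumed conventions (a fixed row-stochastic neighborhood matrix H and an inverse temperature beta set by the annealing schedule; the exact update in the paper may differ):

```python
import numpy as np

def stvq_step(X, W, H, beta):
    """One soft-assignment EM step (sketch).  X: (n, d) data, W: (k, d) node
    weights, H: (k, k) fixed neighborhood matrix with rows summing to 1,
    beta: inverse temperature raised during deterministic annealing."""
    d2 = np.sum((X[:, None, :] - W[None, :, :]) ** 2, axis=-1)   # (n, k)
    # E-step: soft assignment of each x to node r, coupling nodes through H
    logits = -0.5 * beta * d2 @ H.T                               # (n, k)
    logits -= logits.max(axis=1, keepdims=True)                   # stabilize
    P = np.exp(logits)
    P /= P.sum(axis=1, keepdims=True)
    # M-step: each weight is a neighborhood-smoothed weighted mean of the data
    R = P @ H                                                     # (n, k)
    W_new = (R.T @ X) / R.sum(axis=0)[:, None]
    return W_new, P
```

With H equal to the identity and beta large, the step reduces to an ordinary k-means-style centroid update, which matches the abstract's point that the neighborhood function only encodes relations between nodes.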
A Stochastic Self-Organizing Map for Proximity Data
Neural Computation, 1999
Abstract

Cited by 35 (7 self)
We derive an efficient algorithm for topographic mapping of proximity data (TMP), which can be seen as an extension of Kohonen's Self-Organizing Map to arbitrary distance measures. The TMP cost function is derived in a Bayesian framework of Folded Markov Chains for the description of autoencoders. It incorporates the data via a dissimilarity matrix D and the topographic neighborhood via a matrix H of transition probabilities. From the principle of Maximum Entropy a non-factorizing Gibbs distribution is obtained, which is approximated in a mean-field fashion. This allows for Maximum Likelihood estimation using an EM algorithm. In analogy to the transition from Topographic Vector Quantization (TVQ) to the Self-Organizing Map (SOM), we suggest an approximation to TMP which is computationally more efficient. In order to prevent convergence to local minima, an annealing scheme in the temperature parameter is introduced, for which the critical temperature of the first phase transition is calcul...