Results 1–10 of 18
Graphical models, exponential families, and variational inference
, 2008
Abstract

Cited by 792 (27 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances — including the key problems of computing marginals and modes of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms — among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations — can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
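As a minimal sketch of the variational idea in this abstract (naive mean field on a toy two-spin model with arbitrary illustrative parameters, not the paper's general exponential-family machinery), the mean-field fixed point yields a lower bound on the log-partition function:

```python
import math

def entropy_bin(p):
    # entropy of a Bernoulli variable over {-1, +1} with P(+1) = p
    eps = 1e-12
    p = min(max(p, eps), 1 - eps)
    return -(p * math.log(p) + (1 - p) * math.log(1 - p))

# toy pairwise model p(x1, x2) ∝ exp(th1*x1 + th2*x2 + th12*x1*x2)
th1, th2, th12 = 0.5, -0.3, 0.8

# exact log-partition function by enumerating the four joint states
logZ = math.log(sum(math.exp(th1 * x1 + th2 * x2 + th12 * x1 * x2)
                    for x1 in (-1, 1) for x2 in (-1, 1)))

# naive mean-field coordinate ascent on the means m_i = E[x_i]
m1 = m2 = 0.0
for _ in range(200):
    m1 = math.tanh(th1 + th12 * m2)
    m2 = math.tanh(th2 + th12 * m1)

q1, q2 = (1 + m1) / 2, (1 + m2) / 2   # P(x_i = +1) under the factorized q
elbo = (th1 * m1 + th2 * m2 + th12 * m1 * m2
        + entropy_bin(q1) + entropy_bin(q2))
```

The quantity `elbo` is the variational representation of log Z restricted to fully factorized distributions, hence a lower bound; richer families (cluster variational methods, expectation-propagation) tighten or otherwise approximate it.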
Maximum entropy analysis of the spectral functions in lattice QCD
Abstract

Cited by 34 (0 self)
First principle calculation of the QCD spectral functions (SPFs) based on lattice QCD simulations is reviewed. Special emphasis is placed on Bayesian inference theory and the Maximum Entropy Method (MEM), which is a useful tool to extract SPFs from the imaginary-time correlation functions numerically obtained by the Monte Carlo method. Three important aspects of MEM are (i) it does not require a priori assumptions or parametrizations of SPFs, (ii) for given data, a unique solution is obtained if it exists, and (iii) the statistical significance of the solution can be quantitatively analyzed. The ability of MEM is explicitly demonstrated by using mock data as well as lattice QCD data. When applied to lattice data, MEM correctly reproduces the low-energy resonances and shows the existence of a high-energy continuum in hadronic correlation functions. This opens up various possibilities for studying hadronic properties in QCD beyond the conventional way of analyzing lattice data. Future problems to be studied by MEM in lattice QCD are also summarized.
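For illustration, the Shannon-Jaynes entropy commonly used in MEM, defined relative to a default model m(ω), is maximal and equal to zero exactly at the default model, so any structure in the extracted spectral function must be demanded by the data rather than by the prior. The discretization and values below are hypothetical:

```python
import math

def shannon_jaynes_entropy(rho, m, domega):
    # S[rho] = ∫ dω [ rho(ω) - m(ω) - rho(ω) * ln(rho(ω)/m(ω)) ]
    # on a uniform grid with spacing domega; S <= 0, with S = 0 iff rho = m
    return sum((r - mm - r * math.log(r / mm)) * domega
               for r, mm in zip(rho, m))

domega = 0.1
m = [1.0] * 50                                            # flat default model
rho_default = [1.0] * 50                                  # coincides with m
rho_perturbed = [1.0 + 0.5 * math.sin(0.3 * i) for i in range(50)]

s0 = shannon_jaynes_entropy(rho_default, m, domega)
s1 = shannon_jaynes_entropy(rho_perturbed, m, domega)
```

In MEM one maximizes α·S − χ²/2 over candidate spectral functions, so the entropy term pulls the solution toward the default model while the χ² term pulls it toward the data.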
Maximum Entropy Production as an Inference Algorithm that Translates Physical Assumptions into Macroscopic Predictions: Don’t Shoot the Messenger
 ENTROPY
, 2009
Minimum entropy approach to word segmentation problems
 Physica A
Abstract

Cited by 4 (0 self)
Given a sequence composed of a limited number of characters, we try to “read” it as a “text”. This involves segmenting the sequence into “words”. The difficulty is to distinguish a good segmentation from the enormous number of random ones. Aiming at revealing the non-randomness of the sequence as strongly as possible, by applying the maximum likelihood method, we find a quantity called segmentation entropy that can be used for this purpose. Contrary to common practice, where the maximum entropy principle is applied to obtain a good solution, we choose to minimize the segmentation entropy to obtain a good segmentation. The concept developed in this letter can be used to study non-coding DNA sequences, e.g., for regulatory element prediction, in eukaryote genomes.
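As a toy reading of the idea (the function, example text, and both candidate segmentations below are illustrative, not the paper's exact construction), the empirical entropy of the word distribution induced by a segmentation is lower when the segmentation reuses words, so minimizing it favors segmentations that expose repetition:

```python
import math
from collections import Counter

def segmentation_entropy(words):
    # Shannon entropy (in nats) of the empirical word distribution
    # induced by a candidate segmentation of the sequence
    counts = Counter(words)
    n = len(words)
    return -sum(c / n * math.log(c / n) for c in counts.values())

text = "thecatsawthecat"
good = ["the", "cat", "saw", "the", "cat"]   # reuses "the" and "cat"
bad  = ["th", "ecat", "sawt", "hec", "at"]   # every "word" occurs once
```

Both candidates reconstruct the same sequence, yet the repetitive one has strictly lower entropy, which is the non-randomness the minimization is meant to reveal.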
A Generalized Approach for Atomic Force Microscopy Image Restoration with Bregman Distances as Tikhonov Regularization Term
 3rd International Conference on Inverse Problems in Engineering (3ICIPE), Book
, 1999
Abstract

Cited by 3 (2 self)
Tikhonov’s regularization approach applied to image restoration, stated in terms of ill-posed problems, has proved to be a powerful tool for handling noisy and incomplete data. This work proposes a variable-norm discrepancy function, from which the entropy functional is derived, as the regularization term. Our method is applied to real Atomic Force Microscopy (AFM) biological images, producing satisfactory results. These images represent a mapping of the local interaction forces exerted between a reduced-scale AFM sensing tip and the biological sample, kept alive in an aqueous or air environment.
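A sketch of the kind of regularization term involved, under the standard identification of the Bregman distance generated by the negative entropy with the generalized Kullback-Leibler divergence (the values below are arbitrary):

```python
import math

def bregman_kl(x, y):
    # Bregman distance generated by phi(x) = sum_i x_i * ln(x_i):
    # D(x, y) = sum_i [ x_i * ln(x_i / y_i) - x_i + y_i ]
    # nonnegative for positive vectors, and zero iff x == y
    return sum(xi * math.log(xi / yi) - xi + yi for xi, yi in zip(x, y))

x = [0.2, 0.5, 1.3]   # hypothetical restored-image intensities
y = [0.4, 0.5, 1.0]   # hypothetical reference/prior image

d_self = bregman_kl(x, x)
d_xy = bregman_kl(x, y)
# a Tikhonov-style objective would then combine a data-misfit term with
# alpha * bregman_kl(image, reference) as the regularization term
```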
Reinterpretation and Enhancement of Signal-Subspace-Based Imaging Methods for Extended Scatterers
, 2009
Abstract

Cited by 3 (1 self)
Interior sampling and exterior sampling (enclosure) signal-subspace-based imaging methodologies for extended scatterers derived in previous work are reformulated and reinterpreted in terms of the concepts of angles and distances between subspaces. The insight gained from this reformulation naturally paves the way for a broader, more encompassing inversion methodology based on a cross-coherence matrix associated with the singular vectors of the scattering or response matrix and the singular vectors intrinsic to a given, hypothesized support region for the scatterers (under a known background Green function associated with a known embedding medium where the scatterers reside). A number of new imaging functionals based on that cross-coherence matrix emerge, of particular interest being imaging functionals based on information-theoretic concepts applied to an interpretation of the entries in that matrix as probability amplitudes. The resulting approach is based on entropy minimization, and it has the enormous advantage of not requiring for its implementation the estimation of a cutoff in the singular value spectrum separating signal and noise subspaces, which is a common computational difficulty in both imaging and shape reconstruction contexts. The theoretical and computational concepts developed in the paper are illustrated for electromagnetic scattering examples in two-dimensional space. Both imaging and shape reconstruction contexts are considered, and in the shape reconstruction context it is also shown how to combine the signal subspace approach with the level set method.
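The "angles between subspaces" viewpoint can be sketched in the simplest one-dimensional case (the vectors and values below are illustrative only; the paper works with full singular-vector subspaces and a cross-coherence matrix of their inner products):

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def norm(a):
    return math.sqrt(dot(a, a))

def principal_angle(u, v):
    # angle between the 1-D subspaces spanned by u and v;
    # the absolute value makes the angle independent of sign/orientation
    c = abs(dot(u, v)) / (norm(u) * norm(v))
    return math.acos(min(1.0, c))

u = [1.0, 0.0, 0.0]
v = [1.0, 1.0, 0.0]   # 45 degrees from u
w = [0.0, 0.0, 2.0]   # orthogonal to u
```

For higher-dimensional subspaces the principal angles come from the singular values of the product of orthonormal basis matrices, and the "gap" distance between subspaces is the sine of the largest such angle.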
Spectrum Estimation Using Multirate Observations
, 2003
Abstract

Cited by 2 (1 self)
This article considers merging the statistical information gained in low-rate measurements of a non-observable high-rate signal. We consider a model where a wide-sense stationary random signal x(n) is being observed indirectly using several linear multirate sensors. Each sensor outputs a measurement signal v_i(n) whose sampling rate is only a fraction of the sampling rate assumed for the original signal. We pose the following problem: given certain autocorrelation coefficients of the observable signals v_i(n), estimate the power spectral density of the original signal x(n).
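The relation between observable and underlying autocorrelations can be made concrete in a small sketch (the function name and example values are hypothetical): for a sensor v(n) = (h * x)(Mn), the observable autocorrelation samples the filtered signal's autocorrelation only at multiples of M, which is why low-rate data carry only partial information about x(n):

```python
def rv_from_rx(rx, h, M, m):
    # autocorrelation of v(n) = (h * x)(Mn) in terms of r_x and the taps h:
    # r_v(m) = r_y(mM) = sum_{k,l} h[k] * h[l] * r_x(mM + k - l)
    return sum(h[k] * h[l] * rx(m * M + k - l)
               for k in range(len(h)) for l in range(len(h)))

# unit-variance white input: r_x(0) = 1, r_x(tau) = 0 otherwise
rx = lambda tau: 1.0 if tau == 0 else 0.0
h = [1.0, 1.0]          # simple 2-tap filter (unnormalized averager)
M = 2                    # decimation factor

rv0 = rv_from_rx(rx, h, M, 0)   # = r_y(0)
rv1 = rv_from_rx(rx, h, M, 1)   # = r_y(2), skipping r_y(1) entirely
```

Here r_y(1) = 1 is nonzero, but the decimated observations never see it; recovering the PSD of x(n) therefore requires combining several such sensors or imposing additional structure.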
Information Theory Of Multirate Systems
 in IEEE International Symposium on Information Theory (ISIT
, 2001
Abstract

Cited by 2 (2 self)
The main objective of this paper is to study the statistical information gained about a random signal by linear observations made at different sampling rates. To do this, we first consider estimating the statistics of a discrete-time random signal based on statistics obtained from a filtered and downsampled version of it. This problem is inherently ill-posed since, in general, low-rate observations give only partial information about a signal's statistics. We use the Maximum Entropy Principle as a formal method of inductive inference and provide a solution for the case of stationary random signals and linear observers. We then formulate a measure of informativity for multirate observations. Using this measure, we are able to rank multirate observers in a class and order them based on how informative their output signals are. The ordering induced this way is a partial ordering, by means of which we introduce a novel information-theoretic notion of optimality for multirate linear systems.
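For the flavor of the Maximum Entropy Principle in this setting (a standard textbook special case, not the paper's general multirate solution): given only r(0) and r(1) of a stationary Gaussian signal, the maximum-entropy completion of the autocorrelation sequence is that of an AR(1) process:

```python
def maxent_autocorr(r0, r1, k):
    # max-entropy stationary Gaussian extension of the partial
    # autocorrelation {r(0), r(1)} is an AR(1) process:
    # r(k) = r0 * (r1 / r0) ** |k|
    rho = r1 / r0
    return r0 * rho ** abs(k)

r0, r1 = 1.0, 0.5                      # hypothetical known lags
r2 = maxent_autocorr(r0, r1, 2)        # inferred, not measured
```

All unmeasured lags are filled in with the least-committal values consistent with the data, which is exactly the inductive-inference role the abstract assigns to the principle.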
Multirate Spectral Estimation
 in Intl. Conf. Acoustic, Speech and Signal Processing (ICASSP
, 2001
Abstract

Cited by 1 (1 self)
This article introduces a mathematical theory for estimating the power spectral density (PSD) of a random signal based on low-sampling-rate measurements. We formulate the problem using a mathematical model where an observer sees a discrete-time WSS random signal x(n) through a bank of measurement devices or sensors. Each sensor outputs a measurement signal v_i(n) whose sampling rate is only a fraction of the sampling rate assumed for the original non-observable signal.
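The measurement model described above can be sketched directly (the function name and signal values are hypothetical): each sensor filters the high-rate signal and keeps every M-th sample of the result:

```python
def sensor_output(x, h, M):
    # v(n) = (h * x)(Mn): convolve the high-rate signal x with the
    # sensor's filter taps h, then decimate by the factor M
    y = [sum(h[k] * x[n - k] for k in range(len(h)) if 0 <= n - k < len(x))
         for n in range(len(x))]
    return y[::M]

x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]     # hypothetical high-rate samples
v = sensor_output(x, [0.5, 0.5], 2)    # 2-tap average, decimate by 2
```

A bank of such sensors with different filters h_i and decimation factors M_i produces the observations v_i(n) from which the PSD of x(n) is to be estimated.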