Results 1–10 of 19
Graphical models, exponential families, and variational inference
, 2008
Abstract

Cited by 792 (27 self)
The formalism of probabilistic graphical models provides a unifying framework for capturing complex dependencies among random variables, and building large-scale multivariate statistical models. Graphical models have become a focus of research in many statistical, computational and mathematical fields, including bioinformatics, communication theory, statistical physics, combinatorial optimization, signal and image processing, information retrieval and statistical machine learning. Many problems that arise in specific instances — including the key problems of computing marginals and modes of probability distributions — are best studied in the general setting. Working with exponential family representations, and exploiting the conjugate duality between the cumulant function and the entropy for exponential families, we develop general variational representations of the problems of computing likelihoods, marginal probabilities and most probable configurations. We describe how a wide variety of algorithms — among them sum-product, cluster variational methods, expectation-propagation, mean field methods, max-product and linear programming relaxation, as well as conic programming relaxations — can all be understood in terms of exact or approximate forms of these variational representations. The variational approach provides a complementary alternative to Markov chain Monte Carlo as a general source of approximation methods for inference in large-scale statistical models.
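The conjugate duality mentioned above can be made concrete in the simplest exponential family. The sketch below (illustrative only, not from the paper) checks numerically that the Bernoulli cumulant function A(θ) equals the variational supremum of θμ − A*(μ), where A*(μ) is the negative entropy on the mean-parameter space [0, 1]:

```python
import numpy as np

# Bernoulli exponential family: p(x) ∝ exp(theta * x), x ∈ {0, 1}.
# Cumulant (log-partition) function: A(theta) = log(1 + exp(theta)).
# Conjugate duality: A(theta) = sup_mu { theta * mu - A*(mu) },
# where A*(mu) is the negative entropy over the mean parameter mu.

def A(theta):
    return np.log1p(np.exp(theta))

def neg_entropy(mu):
    return mu * np.log(mu) + (1.0 - mu) * np.log(1.0 - mu)

def A_variational(theta, grid_size=100_000):
    # Maximize the concave objective theta*mu - A*(mu) on a fine grid.
    mu = np.linspace(1e-9, 1.0 - 1e-9, grid_size)
    return np.max(theta * mu - neg_entropy(mu))
```

For any θ the two computations agree to high precision, which is exactly the exactness of the variational representation in this tractable case; the approximate algorithms surveyed in the paper arise from relaxing the mean-parameter set or the entropy term in harder models.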
Motion Picture Restoration
, 1993
Abstract

Cited by 66 (11 self)
This dissertation presents algorithms for restoring some of the major corruptions observed in archived film or video material. The two principal problems of impulsive distortion (Dirt and Sparkle or Blotches) and noise degradation are considered. There is also an algorithm for suppressing the inter-line jitter common in images decoded from noisy video signals. In the case of noise reduction and Blotch removal the thesis considers image sequences to be three-dimensional signals involving evolution of features in time and space. This is necessary if any process presented is to show an improvement over standard two-dimensional techniques. It is important to recognize that consideration of image sequences must involve an appreciation of the problems incurred by the motion of objects in the scene. The most obvious implication is that due to motion, useful three-dimensional processing does not necessarily proceed in a direction `orthogonal' to the image frames. Therefore, attention is giv...
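The temporal idea behind Blotch detection can be sketched in a few lines. This is a hypothetical, simplified spike detector (not the dissertation's exact method): a pixel is flagged when it differs strongly from both the previous and the next frame. As the abstract stresses, a real system must first motion-compensate the neighbouring frames before such comparisons are meaningful:

```python
import numpy as np

def detect_blotches(prev_f, cur_f, next_f, threshold=40.0):
    # A blotch is a temporal impulse: large discontinuity both
    # backward and forward in time at the same spatial location.
    d_back = np.abs(cur_f - prev_f)
    d_fwd = np.abs(cur_f - next_f)
    return (d_back > threshold) & (d_fwd > threshold)

rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 255.0, size=(8, 8))   # a static test scene
frames = np.stack([scene, scene.copy(), scene])
frames[1, 2, 3] += 120.0                        # inject one "Sparkle"
mask = detect_blotches(frames[0], frames[1], frames[2])
```

On this static sequence the mask fires only at the injected pixel; with real motion, the uncompensated detector would produce false alarms wherever objects move, which is why the thesis treats motion estimation as central.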
Estimation in Gaussian Graphical Models Using Tractable Subgraphs: A Walk-Sum Analysis
, 2008
Abstract

Cited by 28 (14 self)
Graphical models provide a powerful formalism for statistical signal processing. Due to their sophisticated modeling capabilities, they have found applications in a variety of fields such as computer vision, image processing, and distributed sensor networks. In this paper, we present a general class of algorithms for estimation in Gaussian graphical models with arbitrary structure. These algorithms involve a sequence of inference problems on tractable subgraphs over subsets of variables. This framework includes parallel iterations such as embedded trees, serial iterations such as block Gauss–Seidel, and hybrid versions of these iterations. We also discuss a method that uses local memory at each node to overcome temporary communication failures that may arise in distributed sensor network applications. We analyze these algorithms based on the recently developed walk-sum interpretation of Gaussian inference. We describe the walks “computed” by the algorithms using walk-sum diagrams, and show that for iterations based on a very large and flexible set of sequences of subgraphs, convergence is guaranteed in walk-summable models. Consequently, we are free to choose spanning trees and subsets of variables adaptively at each iteration. This leads to efficient methods for optimizing the next iteration step to achieve maximum reduction in error. Simulation results demonstrate that these nonstationary algorithms provide a significant speedup in convergence over traditional one-tree and two-tree iterations.
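A minimal numerical sketch of walk-summability (illustrative, and using a plain stationary iteration rather than the embedded-tree iterations analyzed in the paper): write the information matrix as J = I − R; when the spectral radius of the entrywise absolute value |R| is below 1, the walk-sums converge and the iteration x ← Rx + h solves Jx = h, with the k-th iterate accumulating all walks of length ≤ k:

```python
import numpy as np

# A small walk-summable Gaussian model in information form J x = h.
J = np.array([[1.0,  0.2,  0.0],
              [0.2,  1.0, -0.3],
              [0.0, -0.3,  1.0]])
h = np.array([1.0, 0.5, -1.0])

R = np.eye(3) - J
# Walk-summability: spectral radius of |R| (entrywise abs) below 1.
assert np.max(np.abs(np.linalg.eigvals(np.abs(R)))) < 1.0

# Stationary iteration: each step extends the accumulated walks by one.
x = np.zeros(3)
for _ in range(200):
    x = R @ x + h
# x now approximates the exact solution J^{-1} h.
```

The paper's stronger result is that convergence holds not just for this fixed splitting but for iterations that pick a different tractable subgraph (e.g. a spanning tree) at every step, which enables the adaptive subgraph selection described above.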
On Learning Discrete Graphical Models Using Group-Sparse Regularization
Abstract

Cited by 19 (5 self)
We study the problem of learning the graph structure associated with a general discrete graphical model (each variable can take any of m > 1 values, the clique factors have maximum size c ≥ 2) from samples, under high-dimensional scaling where the number of variables p could be larger than the number of samples n. We provide a quantitative consistency analysis of a procedure based on node-wise multiclass logistic regression with group-sparse regularization. We first consider general m-ary pairwise models — where each factor depends on at most two variables. We show that when ...
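The group-sparse penalty in such procedures is typically a sum of Euclidean norms over groups of coefficients (here, the block of parameters tying one node pair together), and its key computational primitive is group soft-thresholding, which zeroes whole groups at once. A standalone sketch of that operator (illustrative; the paper analyzes the full node-wise regularized logistic regression, not this operator in isolation):

```python
import numpy as np

def group_soft_threshold(w, lam):
    # Proximal operator of lam * ||w||_2 for one coefficient group:
    # shrink the whole group toward zero jointly; if its norm is
    # below lam, the entire group is set exactly to zero.
    norm = np.linalg.norm(w)
    if norm <= lam:
        return np.zeros_like(w)
    return (1.0 - lam / norm) * w

strong = np.array([3.0, 4.0])   # group norm 5: survives, shrunk
weak = np.array([0.3, 0.4])     # group norm 0.5: zeroed entirely
```

Zeroing a whole group corresponds to deleting an edge from the estimated graph, which is why group (rather than elementwise) sparsity is the natural regularizer when each edge carries a block of m × m parameters.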
Convex relaxation methods for graphical models: Lagrangian and maximum entropy approaches
, 2008
Abstract

Cited by 17 (2 self)
Graphical models provide compact representations of complex probability distributions of many random variables through a collection of potential functions defined on small subsets of these variables. This representation is defined with respect to a graph in which nodes represent random variables and edges represent the interactions among those random variables. Graphical models provide a powerful and flexible approach to many problems in science and engineering, but also present serious challenges owing to the intractability of optimal inference and estimation over general graphs. In this thesis, we consider convex optimization methods to address two central problems that commonly arise for graphical models. First, we consider the problem of determining the most probable configuration—also known as the maximum a posteriori (MAP) estimate—of all variables in a graphical model, conditioned on (possibly noisy) measurements of some variables. This general problem is intractable, so we consider a Lagrangian relaxation (LR) approach to obtain a tractable dual problem. This involves using the Lagrangian decomposition technique ...
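One common flavour of Lagrangian relaxation for MAP is dual decomposition, which this toy example sketches (a hypothetical 3-node chain, not a model from the thesis): the chain is split into two edge subproblems that share node 2, a multiplier prices their disagreement on that node, and a subgradient step adjusts the price until the subproblems agree, at which point the relaxation is tight:

```python
import itertools
import numpy as np

# Node and edge potentials of a tiny binary chain x1 - x2 - x3.
theta1 = np.array([0.0, 1.0])
theta2 = np.array([0.5, 0.0])
theta3 = np.array([0.0, 0.8])
theta12 = np.array([[1.0, 0.0], [0.0, 1.0]])   # edges favour agreement
theta23 = np.array([[1.0, 0.0], [0.0, 1.0]])

def solve_sub(theta_a, theta_edge, theta_b):
    # Exact MAP over one edge subproblem by enumeration.
    best, arg = -np.inf, None
    for xa, xb in itertools.product((0, 1), repeat=2):
        v = theta_a[xa] + theta_edge[xa, xb] + theta_b[xb]
        if v > best:
            best, arg = v, (xa, xb)
    return best, arg

lam = 0.0
for step in range(1, 100):
    # Split theta2 across the subproblems; lam prices x2-disagreement.
    _, (x1, x2a) = solve_sub(theta1, theta12, theta2 / 2 + np.array([0.0, lam]))
    _, (x2b, x3) = solve_sub(theta2 / 2 - np.array([0.0, lam]), theta23, theta3)
    if x2a == x2b:
        break                     # copies agree: relaxation is tight
    lam -= (x2a - x2b) / step     # diminishing subgradient step

map_config = (x1, x2a, x3)
```

When the copies of the shared variable agree, the dual bound is attained and the recovered configuration is an exact MAP estimate; on harder (e.g. frustrated) models the subproblems may never agree, which is the integrality gap that the thesis's relaxation machinery is designed to analyze.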
A recursive model-reduction method for approximate inference in Gaussian Markov random fields
 IEEE TRANS. IMAG. PROC
, 2008
Abstract

Cited by 6 (3 self)
This paper presents recursive cavity modeling—a principled, tractable approach to approximate, near-optimal inference for large Gauss–Markov random fields. The main idea is to subdivide the random field into smaller subfields, constructing cavity models which approximate these subfields. Each cavity model is a concise, yet faithful, model for the surface of one subfield sufficient for near-optimal inference in adjacent subfields. This basic idea leads to a tree-structured algorithm which recursively builds a hierarchy of cavity models during an “upward pass” and then builds a complementary set of blanket models during a reverse “downward pass.” The marginal statistics of individual variables can then be approximated using their blanket models. Model thinning plays an important role, allowing us to develop thinned cavity and blanket models, thereby providing tractable approximate inference. We develop a maximum-entropy approach that exploits certain tractable representations of Fisher information on thin chordal graphs. Given the resulting set of thinned cavity models, we also develop a fast preconditioner, which provides a simple iterative method to compute optimal estimates. Thus, our overall approach combines recursive inference, variational learning and iterative estimation. We demonstrate the accuracy and scalability of this approach in several challenging, large-scale remote sensing problems.
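The elementary operation beneath a cavity model is exact Gaussian marginalization in information form: eliminating the interior variables of a subfield leaves a Schur complement on its surface variables. A sketch of just that step (the paper additionally thins the resulting model to keep it tractable, which this sketch does not do):

```python
import numpy as np

def marginalize_information(J, h, keep, drop):
    # Marginalize a Gaussian in information form (J, h) onto the
    # variables in `keep` by eliminating those in `drop`:
    #   J_marg = J_aa - J_ab J_bb^{-1} J_ba   (Schur complement)
    #   h_marg = h_a  - J_ab J_bb^{-1} h_b
    J_aa = J[np.ix_(keep, keep)]
    J_ab = J[np.ix_(keep, drop)]
    J_bb = J[np.ix_(drop, drop)]
    J_bb_inv = np.linalg.inv(J_bb)
    J_marg = J_aa - J_ab @ J_bb_inv @ J_ab.T
    h_marg = h[keep] - J_ab @ J_bb_inv @ h[drop]
    return J_marg, h_marg

J = np.array([[2.0, 0.5, 0.0],
              [0.5, 2.0, 0.5],
              [0.0, 0.5, 2.0]])
h = np.array([1.0, 0.0, -1.0])
J_m, h_m = marginalize_information(J, h, keep=[0, 1], drop=[2])
```

The resulting surface model is generally denser than the original subgraph, which is exactly why model thinning is needed before it can serve as a compact cavity model for inference in adjacent subfields.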
An Oracle Approach for Interaction Neighborhood Estimation in Random Fields
Elementary Estimators for Graphical Models
Abstract

Cited by 1 (1 self)
We propose a class of closed-form estimators for sparsity-structured graphical models, expressed as exponential family distributions, under high-dimensional settings. Our approach builds on observing the precise manner in which the classical graphical model MLE “breaks down” under high-dimensional settings. Our estimator uses a carefully constructed, well-defined and closed-form backward map, and then performs thresholding operations to ensure the desired sparsity structure. We provide a rigorous statistical analysis that shows that, surprisingly, our simple class of estimators recovers the same asymptotic convergence rates as those of the ℓ1-regularized MLEs that are much more difficult to compute. We corroborate this statistical performance, as well as significant computational advantages, via simulations of both discrete and Gaussian graphical models.
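A hedged sketch of the "plug in a closed-form backward map, then threshold" idea in the Gaussian case (details differ from the paper's actual estimators): the backward map from moment to canonical parameters is matrix inversion, so invert a regularized sample covariance and soft-threshold the off-diagonal entries of the result, with no iterative optimization:

```python
import numpy as np

def soft_threshold(A, lam):
    # Elementwise soft-thresholding: shrink toward zero by lam.
    return np.sign(A) * np.maximum(np.abs(A) - lam, 0.0)

def elementary_gaussian_estimator(X, lam, eps=1e-2):
    # Regularize the sample covariance so the backward map (matrix
    # inversion) is well defined even when p is comparable to n.
    S = np.cov(X, rowvar=False) + eps * np.eye(X.shape[1])
    Theta = np.linalg.inv(S)              # closed-form backward map
    off = soft_threshold(Theta - np.diag(np.diag(Theta)), lam)
    return off + np.diag(np.diag(Theta))  # keep the diagonal intact

rng = np.random.default_rng(1)
# Hypothetical ground truth: a chain-graph precision on 4 variables.
Theta_true = np.eye(4) + np.diag([0.4] * 3, 1) + np.diag([0.4] * 3, -1)
X = rng.multivariate_normal(np.zeros(4), np.linalg.inv(Theta_true), size=2000)
Theta_hat = elementary_gaussian_estimator(X, lam=0.1)
```

The entire estimate costs one covariance, one inversion, and one thresholding pass, which is the computational advantage the abstract contrasts against iterative ℓ1-regularized MLE solvers.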
Modeling and estimation in Gaussian graphical models: Maximum-entropy relaxation and walk-sum analysis
 MASTER’S THESIS, MASSACHUSETTS INSTITUTE OF TECHNOLOGY
, 2007