Structured learning and prediction in computer vision
In Foundations and Trends in Computer Graphics and Vision, 2010
A bundle approach to efficient MAP-inference by Lagrangian relaxation
In CVPR, 2012
Cited by 17 (8 self)
Abstract
Approximate inference by decomposition of discrete graphical models and Lagrangian relaxation has become a key technique in computer vision. The resulting dual objective function is convenient from the optimization point of view, in principle. Due to its inherent non-smoothness, however, it is not directly amenable to efficient convex optimization. Related work either weakens the relaxation by smoothing or applies variations of the inefficient projected subgradient method. In either case, heuristic choices of tuning parameters influence the performance and depend significantly on the specific problem at hand. In this paper, we introduce a novel approach based on bundle methods from the field of combinatorial optimization. It is directly based on the non-smooth dual objective function, requires no tuning parameters, and showed markedly improved efficiency uniformly over a large variety of problem instances, including benchmark experiments.
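To make the non-smooth dual concrete: the sketch below is a hypothetical toy, not the paper's bundle method. It decomposes a one-variable energy E(x) = f(x) + g(x) into two copies coupled by a Lagrange multiplier and maximizes the resulting concave, non-smooth dual with the plain subgradient ascent baseline that the abstract contrasts with.

```python
import numpy as np

# Toy dual decomposition (illustrative, not the paper's method):
# duplicate x into x_f, x_g, relax the coupling x_f = x_g with a
# multiplier lam, and maximize the nonsmooth dual
#   d(lam) = min_x [f(x) + lam*x] + min_x [g(x) - lam*x]
# by subgradient ascent with a diminishing step size.
f = np.array([2.0, 1.0, 3.0])
g = np.array([2.0, 3.0, 0.0])
s = np.arange(3.0)                      # state values 0, 1, 2

lam, best_dual = 0.0, -np.inf
for t in range(1, 201):
    xf = int(np.argmin(f + lam * s))    # subproblem 1
    xg = int(np.argmin(g - lam * s))    # subproblem 2
    dual = (f[xf] + lam * xf) + (g[xg] - lam * xg)
    best_dual = max(best_dual, dual)    # dual value is a lower bound
    lam += (1.0 / t) * (xf - xg)        # subgradient of d at lam

primal = float((f + g).min())           # brute-force MAP energy
print(f"best dual bound {best_dual:.3f} <= primal optimum {primal:.3f}")
```

Note how progress depends on the step-size schedule: this is exactly the tuning-parameter sensitivity that the bundle method described in the abstract avoids.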
Markov Random Field Modeling, Inference & Learning in Computer Vision & Image Understanding: A Survey
2013
Fast Planar Correlation Clustering for Image Segmentation
2012
Cited by 13 (6 self)
Abstract
We describe a new optimization scheme for finding high-quality clusterings in planar graphs that uses weighted perfect matching as a subroutine. Our method provides lower bounds on the energy of the optimal correlation clustering that are typically fast to compute and tight in practice. We demonstrate our algorithm on the problem of image segmentation, where this approach outperforms existing global optimization techniques in minimizing the objective and is competitive with the state of the art in producing high-quality segmentations.
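The correlation clustering objective can be stated in a few lines. The following brute-force sketch uses a hypothetical 4-node graph with made-up signed affinities; the paper's contribution is solving this at scale on planar graphs via weighted perfect matching, not exhaustively.

```python
import itertools

# Correlation clustering, brute force: edges carry signed affinities
# w(u, v) (positive = "prefer same cluster", negative = "prefer apart"),
# and the energy of a partition is the total affinity of the cut edges.
edges = {(0, 1): 2.0, (1, 2): -1.5, (2, 3): 2.0, (0, 3): -1.0, (0, 2): 0.5}

def energy(labels):
    # sum of affinities over edges whose endpoints land in different clusters
    return sum(w for (u, v), w in edges.items() if labels[u] != labels[v])

# exhaustive search over all labelings of 4 nodes (feasible only for toys)
best = min(itertools.product(range(4), repeat=4), key=energy)
print("best partition:", best, "energy:", energy(best))
```

Unlike k-means-style objectives, the number of clusters is not fixed in advance; here the optimum cuts both negative edges by splitting the nodes into the two groups {0, 1} and {2, 3}.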
Submodular decomposition framework for inference in associative Markov networks with global constraints
In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2011
Cited by 6 (0 self)
Abstract
In this paper we address the problem of finding the most probable state of a discrete Markov random field (MRF) with associative pairwise terms. Although of practical importance, this problem is known to be NP-hard in general. We propose a new type of MRF decomposition, submodular decomposition (SMD). Unlike existing decomposition approaches, SMD decomposes the initial problem into subproblems corresponding to a specific class label while preserving the graph structure of each subproblem. Such decomposition enables us to take into account several types of global constraints in an efficient manner. We study theoretical properties of the proposed approach and demonstrate its applicability on a number of problems.
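The problem class SMD targets can be illustrated on a toy instance. The sketch below uses a hypothetical 4-node chain with Potts (associative) pairwise terms and a global cardinality constraint, solved by brute force purely for illustration; SMD's point is to handle such constraints without enumeration.

```python
import itertools

# Tiny associative (Potts) MRF with a global constraint: unary costs plus
# a penalty lam whenever neighbouring nodes disagree, subject to the
# global constraint "at most 2 nodes may take label 1".
unary = [[0.0, 1.0], [1.0, 0.0], [0.4, 0.6], [1.0, 0.2]]  # 4 nodes, 2 labels
edges = [(0, 1), (1, 2), (2, 3)]                          # chain structure
lam = 0.6                                                 # Potts strength

def energy(x):
    e = sum(unary[i][xi] for i, xi in enumerate(x))       # data terms
    e += lam * sum(x[u] != x[v] for u, v in edges)        # smoothness terms
    return e

feasible = (x for x in itertools.product((0, 1), repeat=4)
            if sum(x) <= 2)                               # global constraint
best = min(feasible, key=energy)
print("MAP labelling:", best, "energy:", round(energy(best), 3))
```

Without the cardinality constraint the decomposition into per-label subproblems is standard; the abstract's claim is that SMD keeps each subproblem's graph structure intact, which is what makes such global terms tractable.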
Sampling GMRFs by Subgraph Correction
In NIPS 2012 Workshop: Perturbations, Optimization, and Statistics, 2012
Cited by 1 (1 self)
Abstract
The problem of efficiently drawing samples from a Gaussian Markov random field is studied. We introduce the subgraph correction sampling algorithm, which makes use of any pre-existing tractable sampling algorithm for a subgraph by perturbing this algorithm so as to yield asymptotically exact samples for the intended distribution. The subgraph can have any structure for which efficient sampling algorithms exist: for example, tree-structured, with low treewidth, or with a small feedback vertex set. Preliminary experimental results demonstrate that the subgraph correction algorithm yields accurate samples much faster than many traditional sampling methods, such as Gibbs sampling, for many graph topologies.
Sampling from Gaussian Graphical Models Using Subgraph Perturbations
Cited by 1 (1 self)
Abstract
The problem of efficiently drawing samples from a Gaussian graphical model or Gaussian Markov random field is studied. We introduce the subgraph perturbation sampling algorithm, which makes use of any pre-existing tractable inference algorithm for a subgraph by perturbing this algorithm so as to yield asymptotically exact samples for the intended distribution. The subgraph can have any structure for which efficient inference algorithms exist: for example, tree-structured, low treewidth, or having a small feedback vertex set. The experimental results demonstrate that this subgraph perturbation algorithm efficiently yields accurate samples for many graph topologies.
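A minimal sketch of the perturbation idea behind these two papers, assuming the simplest possible "subgraph" (the diagonal of the precision matrix, i.e. a subgraph with no edges; the papers use richer subgraphs such as trees). Splitting the precision J = M - N and injecting noise with covariance M + N is one known sufficient choice from matrix-splitting sampler theory; the exact noise construction used in the papers may differ.

```python
import numpy as np

# Perturbed fixed-point iteration for sampling N(J^-1 h, J^-1):
# with the splitting J = M - N, M = diag(J), iterate
#   x_{k+1} = M^-1 (N x_k + h + e_k),   e_k ~ N(0, M + N),
# which converges in distribution when rho(M^-1 N) < 1 (here 0.5).
rng = np.random.default_rng(0)
J = np.array([[4.0, -1.0, -1.0],
              [-1.0, 4.0, -1.0],
              [-1.0, -1.0, 4.0]])       # 3-cycle GMRF precision matrix
h = np.array([1.0, 0.0, -1.0])

M = np.diag(np.diag(J))
N = M - J                               # off-diagonal ("correction") part
L = np.linalg.cholesky(M + N)           # noise factor: Var(e) = M + N
Minv = np.diag(1.0 / np.diag(J))

x = np.zeros(3)
samples = []
for k in range(30000):
    e = L @ rng.standard_normal(3)
    x = Minv @ (N @ x + h + e)
    if k >= 1000:                       # discard burn-in
        samples.append(x)

emp_cov = np.cov(np.array(samples).T)
print(np.round(emp_cov, 2))             # compare against inv(J)
```

The per-iteration cost is that of solving with M, so a tractable subgraph (tree, low treewidth, small feedback vertex set) directly translates into a cheap sampler.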
Blending Learning and Inference in Conditional Random Fields
2016
Abstract
Conditional random fields maximize the log-likelihood of training labels given the training data, e.g., objects given images. In many cases the training labels are structures that consist of a set of variables, and the computational complexity of estimating their likelihood is exponential in the number of variables. Learning algorithms relax this computational burden using approximate inference that is nested as a subprocedure. In this paper we describe the objective function for nested learning and inference in conditional random fields. The devised objective maximizes the log-beliefs, probability distributions over subsets of training variables, that agree on their marginal probabilities. This objective is concave and consists of two types of variables, related to the learning and inference tasks respectively. Importantly, we then show how to blend the learning and inference procedures and effectively reach the identical optimum much faster. The proposed algorithm currently achieves the state of the art in various computer vision applications.
Approximating the Sum Operation for Marginal-MAP Inference
Abstract
We study the marginal-MAP problem on graphical models, and present a novel approximation method based on direct approximation of the sum operation. A primary difficulty of marginal-MAP problems lies in the non-commutativity of the sum and max operations, so that even in highly structured models, marginalization may produce a densely connected graph over the variables to be maximized, resulting in an intractable potential function of exponential size. We propose a chain decomposition approach for summing over the marginalized variables, in which we produce a structured approximation to the MAP component of the problem consisting of only pairwise potentials. We show that this approach is equivalent to the maximization of a specific variational free energy, and that it provides an upper bound on the optimal probability. Finally, experimental results demonstrate that our method performs favorably compared to previous methods.
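The non-commutativity of sum and max is easy to see on a hypothetical two-variable table: maximizing over x after marginalizing y can pick a different state than maximizing over the joint.

```python
import numpy as np

# A made-up joint p(x, y) on two binary variables, illustrating why
# max and sum do not commute: the marginal-MAP answer
# argmax_x sum_y p(x, y) differs from the x-coordinate of the joint MAP
# argmax_{x,y} p(x, y).
p = np.array([[0.30, 0.25],   # rows: x = 0, 1
              [0.40, 0.05]])  # cols: y = 0, 1

marginal_map_x = int(np.argmax(p.sum(axis=1)))            # max_x sum_y
joint_map_x, _ = np.unravel_index(np.argmax(p), p.shape)  # max_{x,y}

print("marginal-MAP x:", marginal_map_x, " joint-MAP x:", int(joint_map_x))
```

Here x = 0 wins on total mass (0.55 vs 0.45) while x = 1 holds the single largest entry (0.40); in structured models this interleaving of sum and max is what densifies the graph over the maximized variables and motivates the chain decomposition above.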