Results 1–10 of 48
Markov Random Field Modeling, Inference & Learning in Computer Vision & Image Understanding: A Survey
, 2013
Higher-order segmentation via multicuts
, CoRR abs/1305.6387
Cited by 7 (1 self)
Abstract
Multicuts make it convenient to represent discrete graphical models for unsupervised and supervised image segmentation, based on local energy functions that exhibit symmetries. The basic Potts model and natural extensions thereof to higher-order models provide a prominent class of representatives that cover a broad range of segmentation problems relevant to image analysis and computer vision. We show how to take such higher-order terms into account systematically in view of computational inference, and present results of a comprehensive and competitive numerical evaluation of a variety of dedicated cutting-plane algorithms. Our results reveal ways to solve a significant subset of models to global optimality without compromising runtime. Polynomially solvable relaxations are studied as well, along with advanced rounding schemes for postprocessing.
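The Potts energy underlying this family of models has a simple closed form: per-node label costs plus a constant penalty on every edge whose endpoints disagree. A minimal sketch in Python (the toy costs and graph are illustrative, not from the paper; exhaustive search stands in for the cutting-plane solvers):

```python
import itertools

def potts_energy(labels, unary, edges, weight):
    """Potts model: per-node label costs plus a constant penalty for every
    edge whose two endpoints take different labels."""
    e = sum(unary[i][lab] for i, lab in enumerate(labels))
    e += sum(weight for i, j in edges if labels[i] != labels[j])
    return e

# Exhaustive minimization on a 3-node chain with 2 labels (feasible only at
# toy sizes; the surveyed cutting-plane methods handle realistic instances).
unary = [[0.0, 2.0], [1.5, 0.5], [2.0, 0.0]]   # unary[node][label]
edges = [(0, 1), (1, 2)]
best = min(itertools.product(range(2), repeat=3),
           key=lambda lab: potts_energy(lab, unary, edges, weight=1.0))
# best == (0, 1, 1) with energy 1.5: the middle node takes its cheaper
# label and pays one boundary penalty.
```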
Partial Optimality via Iterative Pruning for the Potts Model
Cited by 7 (4 self)
Abstract
We propose a novel method to obtain a part of an optimal non-relaxed integral solution for energy minimization problems with Potts interactions, also known as the minimal partition problem. The method empirically outperforms previous approaches such as MQPBO and Kovtun's method on most of our test instances, and especially on hard ones. As a starting point, our approach uses the solution of a commonly accepted convex relaxation of the problem. This solution is then iteratively pruned until our criterion for partial optimality is satisfied. Due to its generality, our method can employ any solver for the considered relaxed problem.
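The iterate-and-prune control flow described in the abstract can be sketched generically. Everything below the skeleton (`toy_certain`, the margins, the neighbor structure) is a made-up stand-in for the paper's actual partial-optimality criterion, which the abstract does not spell out:

```python
def prune_until_stable(assignment, is_certain):
    """Skeleton of iterative pruning: start from a full (relaxation-derived)
    assignment and repeatedly drop variables that fail the certainty check,
    until the kept set is stable. What remains is the claimed partially
    optimal assignment."""
    kept = dict(assignment)
    changed = True
    while changed:
        changed = False
        for var, val in list(kept.items()):
            if not is_certain(var, val, kept):
                del kept[var]      # no persistency claim for this variable
                changed = True
    return kept

# Toy instance: a 3-variable chain with invented confidence margins.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
margins = {0: 0.9, 1: 0.45, 2: 0.58}

def toy_certain(var, val, kept):
    # Hypothetical criterion: confidence must exceed 0.5, and each pruned
    # neighbor raises the bar by 0.1 (uncertainty propagates).
    pruned = sum(1 for n in neighbors[var] if n not in kept)
    return margins[var] > 0.5 + 0.1 * pruned

kept = prune_until_stable({0: 0, 1: 1, 2: 1}, toy_certain)
# Variable 1 falls first; its removal then disqualifies variable 2,
# leaving only variable 0 with a persistency claim.
```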
Playing with duality: An overview of recent primal-dual approaches for ...
, 2014
Cited by 5 (1 self)
Abstract
Optimization methods are at the core of many problems in signal/image processing, computer vision, and machine learning. For a long time, it has been recognized that looking at the dual of an optimization problem may drastically simplify its solution. Deriving efficient strategies that jointly bring the primal and the dual problems into play is, however, a more recent idea which has generated many important new contributions in recent years. These novel developments are grounded in recent advances in convex analysis, discrete optimization, parallel processing, and nonsmooth optimization, with an emphasis on sparsity issues. In this paper, we aim at presenting the principles of primal-dual approaches while giving an overview of numerical methods which have been proposed in different contexts. We show the benefits which can be drawn from primal-dual algorithms, both for solving large-scale convex optimization problems and discrete ones, and we provide various application examples to illustrate their usefulness.
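As a concrete instance of the primal-dual schemes this overview covers, here is a minimal Chambolle-Pock-style solver for 1-D total-variation denoising, min_x 0.5*||x - b||^2 + lam*||Dx||_1. The problem choice, step sizes, and test signal are illustrative assumptions, not taken from the paper:

```python
def tv_denoise_pdhg(b, lam, tau=0.4, sigma=0.4, iters=5000):
    """Primal-dual (Chambolle-Pock style) iterations for 1-D TV denoising,
    with D the forward-difference operator. Step sizes satisfy the standard
    condition tau*sigma*||D||^2 <= 1 (here ||D||^2 <= 4)."""
    n = len(b)
    x = list(b)
    xbar = list(b)
    y = [0.0] * (n - 1)                  # one dual variable per edge
    for _ in range(iters):
        # Dual ascent + projection onto [-lam, lam]: the prox of the
        # conjugate of lam*|.|_1.
        for i in range(n - 1):
            y[i] = min(lam, max(-lam, y[i] + sigma * (xbar[i + 1] - xbar[i])))
        # Primal descent: gradient step on D^T y, then prox of 0.5*||x-b||^2.
        x_old = list(x)
        for i in range(n):
            div = (y[i] if i < n - 1 else 0.0) - (y[i - 1] if i > 0 else 0.0)
            z = x[i] + tau * div         # x - tau * D^T y, since (D^T y)_i = -div
            x[i] = (z + tau * b[i]) / (1.0 + tau)
        # Over-relaxation (extrapolation) of the primal variable.
        xbar = [2.0 * x[i] - x_old[i] for i in range(n)]
    return x

x = tv_denoise_pdhg([0.0, 0.0, 0.0, 1.0, 1.0, 1.0], lam=0.3)
# The two plateaus contract toward each other:
# x is close to [0.1, 0.1, 0.1, 0.9, 0.9, 0.9], the analytic TV optimum.
```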
Pseudo-Bound Optimization for Binary Energies
Cited by 5 (2 self)
Abstract
High-order and non-submodular pairwise energies are important for image segmentation, surface matching, deconvolution, tracking, and other computer vision problems. Minimization of such energies is generally NP-hard. One standard approximation approach is to optimize an auxiliary function, an upper bound of the original energy across the entire solution space. This bound must be amenable to fast global solvers. Ideally, it should also closely approximate the original functional, but it is very difficult to find such upper bounds in practice. Our main idea is to relax the upper-bound condition for an auxiliary function and to replace it with a family of pseudo-bounds, which can better approximate the original energy. We use a fast polynomial parametric max-flow approach to explore all global minima for our family of submodular pseudo-bounds. The best solution is guaranteed to decrease the original energy because the family includes at least one auxiliary function. Our Pseudo-Bound Cuts algorithm improves the state of the art in many applications: appearance entropy minimization, target distribution matching, curvature regularization, image deconvolution, and interactive segmentation.
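The "best of a bound family" guarantee in the abstract is easy to demonstrate in miniature. In this sketch, brute force stands in for the paper's parametric max-flow solver, and both the toy energy and the surrogate family are invented for illustration:

```python
import itertools

def brute_min(f, n):
    # Exact minimization over {0,1}^n; at toy sizes this replaces the
    # parametric max-flow solver used on submodular surrogates.
    return min(itertools.product((0, 1), repeat=n), key=f)

def pseudo_bound_move(x0, f, family, n):
    """One move: minimize every surrogate in the family exactly, score each
    minimizer under the ORIGINAL energy f, and keep the best. Including x0
    among the candidates makes the move monotone: the original energy never
    increases, matching the guarantee stated in the abstract."""
    candidates = [x0] + [brute_min(g, n) for g in family]
    return min(candidates, key=f)

# Toy non-submodular binary energy (coefficients invented for illustration).
def f(x):
    return 2*x[0] - x[1] + x[2] - 3*(x[0] ^ x[1]) + 2*(x[1] ^ x[2])

# A family of modular (unary-only) surrogates indexed by a parameter lam.
x0 = (0, 0, 0)
family = [lambda x, lam=lam: 2*x[0] - x[1] + x[2] + lam*(x[0] + x[1] + x[2])
          for lam in (-2.0, -1.0, 0.0, 1.0)]
x1 = pseudo_bound_move(x0, f, family, 3)
# x1 == (0, 1, 1), the lam = -2 surrogate's minimizer, which happens to be
# the global optimum of f here (f(x1) == -3 versus f(x0) == 0).
```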
Cut, Glue & Cut: A Fast, Approximate Solver for Multicut Partitioning
Cited by 5 (2 self)
Abstract
Recently, unsupervised image segmentation has become increasingly popular. Starting from a superpixel segmentation, an edge-weighted region adjacency graph is constructed. Amongst all segmentations of the graph, the one which best conforms to the given image evidence, as measured by the sum of cut edge weights, is chosen. Since this problem is NP-hard, we propose a new approximate solver based on the move-making paradigm: first, the graph is recursively partitioned into small regions (cut phase). Then, for any two adjacent regions, we consider alternative cuts of these two regions defining possible moves (glue & cut phase). For planar problems, the optimal move can be found, whereas for non-planar problems, efficient approximations exist. We evaluate our algorithm on published and new benchmark datasets, which we make available here. The proposed algorithm finds segmentations that, as measured by a loss function, are as close to the ground truth as the global optimum found by exact solvers. It does so significantly faster than existing approximate methods, which is important for large-scale problems.
Optimal Decisions from Probabilistic Models: the Intersection-over-Union Case
Cited by 4 (0 self)
Abstract
A probabilistic model allows us to reason about the world and make statistically optimal decisions using Bayesian decision theory. However, in practice the intractability of the decision problem forces us to adopt simplistic loss functions such as the 0/1 loss or Hamming loss, and as a result we make poor decisions through MAP estimates or through low-order marginal statistics. In this work we investigate optimal decision making for more realistic loss functions. Specifically, we consider the popular intersection-over-union (IoU) score used in image segmentation benchmarks and show that it results in a hard combinatorial decision problem. To make this problem tractable, we propose a statistical approximation to the objective function, as well as an approximate algorithm based on parametric linear programming. We apply the algorithm on three benchmark datasets and obtain improved intersection-over-union scores compared to maximum-posterior-marginal decisions. Our work points out the difficulties of using realistic loss functions with probabilistic computer vision models.
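The gap between marginal-thresholding and IoU-optimal decisions that the abstract describes can be reproduced on a 3-pixel toy posterior. The distribution below is invented, and brute-force enumeration replaces the paper's parametric-LP approximation:

```python
import itertools

def iou(a, b):
    """Intersection-over-union of two binary masks given as 0/1 tuples."""
    inter = sum(x & y for x, y in zip(a, b))
    union = sum(x | y for x, y in zip(a, b))
    return 1.0 if union == 0 else inter / union

def expected_iou_decision(posterior):
    """Bayes-optimal mask under the IoU utility, by enumerating all 2^n
    candidate masks (tractable only for tiny n, which is the paper's point)."""
    n = len(next(iter(posterior)))
    return max(itertools.product((0, 1), repeat=n),
               key=lambda c: sum(p * iou(c, s) for s, p in posterior.items()))

# Toy posterior: three disjoint one-pixel masks plus the full mask.
posterior = {(1, 0, 0): 0.3, (0, 1, 0): 0.3, (0, 0, 1): 0.3, (1, 1, 1): 0.1}

# Every per-pixel marginal is 0.4, so thresholding at 0.5 outputs the empty
# mask, whose expected IoU is 0; the IoU-optimal decision is the full mask.
marginals = [sum(p for s, p in posterior.items() if s[i]) for i in range(3)]
decision = expected_iou_decision(posterior)
# decision == (1, 1, 1), with expected IoU 0.4.
```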
Maximum Persistency in Energy Minimization
Cited by 4 (1 self)
Abstract
We consider the discrete pairwise energy minimization problem (weighted constraint satisfaction, max-sum labeling) and methods that identify a globally optimal partial assignment of variables. When finding a complete optimal assignment is intractable, determining optimal values for a part of the variables is an interesting possibility. Existing methods are based on different sufficient conditions. We propose a new sufficient condition for partial optimality which is: (1) verifiable in polynomial time, (2) invariant to reparametrization of the problem and permutation of labels, and (3) includes many existing sufficient conditions as special cases. We pose the problem of finding the maximum optimal partial assignment identifiable by the new sufficient condition. A polynomial method is proposed which is guaranteed to ...
Partial optimality by pruning for MAP-inference with general graphical models
 In CVPR
, 2014
Cited by 4 (1 self)
Abstract
We consider the energy minimization problem for undirected graphical models, also known as the MAP-inference problem for Markov random fields, which is NP-hard in general. We propose a novel polynomial-time algorithm to obtain a part of its optimal non-relaxed integral solution. Our algorithm is initialized with the variables taking integral values in the solution of a convex relaxation of the MAP-inference problem, and it iteratively prunes those that do not satisfy our criterion for partial optimality. We show that our pruning strategy is, in a certain sense, theoretically optimal. Empirically, our method also outperforms previous approaches in terms of the number of persistently labelled variables. The method is very general, as it is applicable to models with arbitrary factors of arbitrary order and can employ any solver for the considered relaxed problem. Our method's runtime is determined by the runtime of the convex relaxation solver for the MAP-inference problem.
Global MAP-optimality by shrinking the combinatorial search area with convex relaxation
, 2013
Cited by 4 (2 self)
Abstract
We consider energy minimization for undirected graphical models, also known as the MAP-inference problem for Markov random fields. Although combinatorial methods, which return a provably optimal integral solution of the problem, have made significant progress in the past decade, they are still typically unable to cope with large-scale datasets. On the other hand, large-scale datasets are often defined on sparse graphs, and convex relaxation methods, such as linear programming relaxations, then provide good approximations to integral solutions. We propose a novel method of combining combinatorial and convex programming techniques to obtain a global solution of the initial combinatorial problem. Based on the information obtained from the solution of the convex relaxation, our method confines the application of the combinatorial solver to a small fraction of the initial graphical model, which allows much larger problems to be solved optimally. We demonstrate the efficacy of our approach on a computer vision energy minimization benchmark.