Results 1–10 of 257
Learning Real-Time MRF Inference for Image Denoising
In IEEE Conference on Computer Vision and Pattern Recognition, 2009
"... Many computer vision problems can be formulated in a Bayesian framework with Markov Random Field (MRF) or Conditional Random Field (CRF) priors. Usually, the model assumes that a full Maximum A Posteriori (MAP) estimation will be performed for inference, which can be really slow in practice. In this ..."
Cited by 9 (1 self)
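The full MAP estimation this abstract refers to is, in general, expensive; on chain-structured MRFs, however, it is exact and fast via max-product dynamic programming (Viterbi decoding). As context, a minimal sketch with made-up toy potentials (this is not the paper's learned real-time method):

```python
import numpy as np

def chain_map(unary, pairwise):
    """Exact MAP on a chain MRF by max-product dynamic programming.

    unary:    list of length-K score vectors, one per variable
    pairwise: list of K x K score matrices; pairwise[t] couples x_t and x_{t+1}
    Returns the maximizing assignment and its total score.
    """
    n = len(unary)
    msg = np.asarray(unary[0], dtype=float)  # msg[k] = best prefix score with x_t = k
    back = []
    for t in range(1, n):
        # scores[j, k] = msg[j] + pairwise[t-1][j, k] + unary[t][k]
        scores = (msg[:, None] + np.asarray(pairwise[t - 1], float)
                  + np.asarray(unary[t], float)[None, :])
        back.append(scores.argmax(axis=0))   # best predecessor for each state
        msg = scores.max(axis=0)
    # backtrack from the best final state
    x = [int(msg.argmax())]
    for bp in reversed(back):
        x.append(int(bp[x[-1]]))
    x.reverse()
    return x, float(msg.max())
```

The cost is O(nK²) rather than the O(Kⁿ) of exhaustive search, which is why tree-structured subclasses recur throughout these results.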
Approximate MRF Inference Using Bounded Treewidth Subgraphs
"... Graph cut algorithms [9], commonly used in computer vision, solve a first-order MRF over binary variables. The state of the art for this NP-hard problem is QPBO [1, 2], which finds the values for a subset of the variables in the global minimum. While QPBO is very effective overall there are still ..."
Cited by 3 (0 self)
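To make concrete the objective this abstract describes, here is the energy of a first-order binary MRF on a small pixel grid, minimized by exhaustive search. This is purely illustrative (the grid layout and cost tables are made up); graph cuts and QPBO solve or partially solve the same minimization in polynomial time instead of enumerating all 2ⁿ labelings:

```python
import itertools

def grid_energy(x, unary, horiz, vert):
    """Energy of a first-order binary MRF on a small grid.
    x: dict (row, col) -> {0, 1}; unary[(r, c)][v] is the unary cost;
    horiz/vert map an edge's left/top pixel to a 2x2 pairwise cost table."""
    e = sum(unary[p][x[p]] for p in unary)
    e += sum(t[x[(r, c)]][x[(r, c + 1)]] for (r, c), t in horiz.items())
    e += sum(t[x[(r, c)]][x[(r + 1, c)]] for (r, c), t in vert.items())
    return e

def brute_force_min(unary, horiz, vert):
    """Exhaustive minimization -- exponential, for illustration only.
    Graph cuts / QPBO recover (part of) this minimum in polynomial time."""
    pixels = sorted(unary)
    best = min(itertools.product((0, 1), repeat=len(pixels)),
               key=lambda a: grid_energy(dict(zip(pixels, a)), unary, horiz, vert))
    return dict(zip(pixels, best))
```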
Efficient Semidefinite Branch-and-Cut for MAP-MRF Inference
"... We propose a new Branch-and-Cut (B&C) method for solving general MAP-MRF inference problems. The core of our method is a very efficient bounding procedure, which combines scalable semidefinite programming (SDP) and a cutting-plane method for seeking violated constraints. We analyze the performan ..."
Efficient and Exact MAP-MRF Inference using Branch and Bound
"... We propose two novel Branch-and-Bound (BB) methods to efficiently solve exact MAP-MRF inference on problems with a large number of states (per variable) H. By organizing the data in a suitable structure, the time complexity of our best method for evaluating the bound at each branch is reduced from O ..."
Cited by 9 (3 self)
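The branch-and-bound idea behind this line of work can be sketched generically: fix variables one at a time and prune any partial assignment whose optimistic upper bound cannot beat the incumbent. The bound below (each remaining term maximized independently) is a simple relaxation chosen for illustration, not the paper's bound:

```python
def bb_map(unary, pairwise):
    """Branch-and-bound MAP (maximization) for a small pairwise model.
    unary: {i: [score per state]}; pairwise: {(i, j): K x K table}, i < j."""
    var_order = sorted(unary)
    K = len(unary[var_order[0]])
    best = {"score": float("-inf"), "x": None}

    def score_assigned(x):
        s = sum(unary[i][x[i]] for i in x)
        s += sum(t[x[i]][x[j]] for (i, j), t in pairwise.items()
                 if i in x and j in x)
        return s

    def upper_bound(x):
        # optimistic completion: every unassigned term maximized independently
        s = score_assigned(x)
        s += sum(max(unary[i]) for i in var_order if i not in x)
        for (i, j), t in pairwise.items():
            if i in x and j in x:
                continue  # already counted exactly
            if i in x:
                s += max(t[x[i]])
            elif j in x:
                s += max(row[x[j]] for row in t)
            else:
                s += max(map(max, t))
        return s

    def branch(x, depth):
        if depth == len(var_order):
            s = score_assigned(x)
            if s > best["score"]:
                best["score"], best["x"] = s, dict(x)
            return
        i = var_order[depth]
        for v in range(K):
            x[i] = v
            if upper_bound(x) > best["score"]:  # otherwise prune this subtree
                branch(x, depth + 1)
            del x[i]

    branch({}, 0)
    return best["x"], best["score"]
```

Tighter bounds prune more; the abstract's point is that clever data organization makes evaluating the bound at each branch cheap even when K is large.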
Beyond Trees: MRF Inference via Outer-Planar Decomposition
2010
"... Maximum a posteriori (MAP) inference in Markov Random Fields (MRFs) is an NP-hard problem, and thus research has focussed on either finding efficiently solvable subclasses (e.g. trees), or approximate algorithms (e.g. Loopy Belief Propagation (BP) and Tree-reweighted (TRW) methods). This paper prese ..."
Cited by 17 (1 self)
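The Loopy BP mentioned in this abstract is ordinary max-product message passing run on a graph with cycles: exact on trees, a heuristic elsewhere. A minimal synchronous implementation, with toy score tables assumed for illustration:

```python
def loopy_max_product(unary, pairwise, iters=50):
    """Synchronous max-product belief propagation: exact on trees,
    approximate ('loopy') on graphs with cycles.
    unary: {i: [score per state]}; pairwise: {(i, j): K x K score table}."""
    K = len(next(iter(unary.values())))
    edges = list(pairwise) + [(j, i) for (i, j) in pairwise]  # both directions
    msgs = {e: [0.0] * K for e in edges}  # log-domain messages

    def pair(i, j, xi, xj):
        return (pairwise[(i, j)][xi][xj] if (i, j) in pairwise
                else pairwise[(j, i)][xj][xi])

    for _ in range(iters):
        new = {}
        for (i, j) in edges:
            vals = []
            for xj in range(K):
                vals.append(max(
                    unary[i][xi] + pair(i, j, xi, xj)
                    + sum(msgs[(k, t)][xi] for (k, t) in edges
                          if t == i and k != j)
                    for xi in range(K)))
            top = max(vals)                    # normalize for stability
            new[(i, j)] = [v - top for v in vals]
        msgs = new

    # decode each variable from its max-product belief
    decoded = {}
    for i in unary:
        belief = [unary[i][xi] + sum(msgs[(k, t)][xi]
                                     for (k, t) in edges if t == i)
                  for xi in range(K)]
        decoded[i] = max(range(K), key=belief.__getitem__)
    return decoded
```

Decompositions like the one this paper proposes replace the tree subclass with richer tractable subgraphs (outer-planar ones) while keeping inference on each piece exact.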
Discrete MRF Inference of Marginal Densities for Non-uniformly Discretized Variable Space
"... This paper is concerned with the inference of marginal densities based on MRF models. The optimization algorithms for continuous variables are only applicable to a limited number of problems, whereas those for discrete variables are versatile. Thus, it is quite common to convert the continuous va ..."
MRF Inference by k-Fan Decomposition and Tight Lagrangian Relaxation
"... We present a novel dual decomposition approach to MAP inference with highly connected discrete graphical models. Decompositions into cyclic k-fan structured subproblems are shown to significantly tighten the Lagrangian relaxation relative to the standard local polytope relaxation, while enabling ef ..."
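The dual decomposition scheme referenced here can be sketched generically (this uses simple chain subproblems and brute-force subproblem solvers for clarity, not the paper's k-fan subproblems): split the edges across subproblems, share each unary equally, and run subgradient steps on Lagrange multipliers until the duplicated variables agree or the bound gap closes.

```python
import itertools

def dual_decomposition(unary, chains, alpha=0.5, iters=200, tol=1e-4):
    """Dual decomposition for MAP (maximization) on a binary pairwise model.
    unary: {i: [s0, s1]}; chains: list of {edge: 2x2 table} covering all edges.
    Returns (assignment, lower_bound, upper_bound)."""
    varz = sorted(unary)
    cover = {i: [s for s, c in enumerate(chains) if any(i in e for e in c)]
             for i in varz}
    lam = [{i: [0.0, 0.0] for i in varz} for _ in chains]

    def solve_sub(s):
        c = chains[s]
        sub_vars = sorted({i for e in c for i in e})
        def score(x):
            t = sum(c[e][x[e[0]]][x[e[1]]] for e in c)
            t += sum(unary[i][x[i]] / len(cover[i]) + lam[s][i][x[i]]
                     for i in sub_vars)
            return t
        best = max((dict(zip(sub_vars, a))
                    for a in itertools.product((0, 1), repeat=len(sub_vars))),
                   key=score)
        return best, score(best)

    def full_score(x):
        t = sum(unary[i][x[i]] for i in varz)
        t += sum(c[e][x[e[0]]][x[e[1]]] for c in chains for e in c)
        return t

    best_lb, best_x, ub = float("-inf"), None, float("inf")
    for _ in range(iters):
        sols = [solve_sub(s) for s in range(len(chains))]
        ub = min(ub, sum(v for _, v in sols))          # dual upper bound
        for x, _ in sols:                              # primal candidates
            full = {i: x.get(i, 0) for i in varz}
            if full_score(full) > best_lb:
                best_lb, best_x = full_score(full), full
        if ub - best_lb < tol:
            break
        # subgradient step pushing duplicated variables toward agreement
        for i in varz:
            if len(cover[i]) < 2:
                continue
            avg = [sum(1.0 for s in cover[i] if sols[s][0][i] == v)
                   / len(cover[i]) for v in (0, 1)]
            for s in cover[i]:
                for v in (0, 1):
                    ind = 1.0 if sols[s][0][i] == v else 0.0
                    lam[s][i][v] -= alpha * (ind - avg[v])
    return best_x, best_lb, ub
```

The sum of subproblem maxima is always a valid upper bound on the true MAP value; the paper's contribution is choosing subproblems (cyclic k-fans) that make this bound tighter than the local polytope relaxation gives.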
Supplementary Material: Efficient and Exact MAP-MRF Inference using Branch and Bound
By default, the edge-consistent LPR is solved using the Message Passing (MP) algorithm to initialize β, running until convergence or for at most 1000 iterations, whichever comes first. If the gap between the upper and lower bounds is not already smaller than 10⁻⁴ (the stopping criterion), we further apply our BB method or the MPLP-CP method [4]. Both methods stop when the same stopping criterion (gap < 10⁻⁴) is reached. For the MPLP-CP method [4], by default, we alternate between adding 20 clusters at a time and running MPLP for 100 more iterations. In the human pose estimation experiment, since the problems can be solved most of the time without cluster pursuit, we allow the MP algorithm to try harder to solve the edge-consistent LPR. We follow the suggestions
Technical Report: Efficient and Exact MAP-MRF Inference using Branch and Bound
By default, the edge-consistent LPR is solved using the Message Passing (MP) algorithm to initialize β, running until convergence or for at most 1000 iterations, whichever comes first. If the gap between the upper and lower bounds is not already smaller than 10⁻⁴ (the stopping criterion), we further apply our BB method or the MPLP-CP method [3]. Both methods stop when the same stopping criterion (gap < 10⁻⁴) is reached. For the MPLP-CP method [3], by default, we alternate between adding 20 clusters at a time and running MPLP for 100 more iterations. In the human pose estimation experiment, since the problems can be solved most of the time without cluster pursuit, we allow the MP algorithm to try harder to solve the edge-consistent LPR. We follow the suggestions
Tighter Relaxations for MAP-MRF Inference: A Local Primal-Dual Gap based Separation Algorithm
"... We propose an efficient and adaptive method for MAP-MRF inference that provides increasingly tighter upper and lower bounds on the optimal objective. Similar to Sontag et al. (2008b), our method starts by solving the first-order LOCAL(G) linear programming relaxation. This is followed by an adaptive ..."
Cited by 18 (1 self)
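The first-order LOCAL(G) relaxation that this abstract starts from is a concrete linear program: optimize over node and edge pseudo-marginals tied by normalization and marginalization constraints. A small sketch for binary variables, using SciPy's generic LP solver rather than the specialized message-passing solvers these papers use:

```python
import numpy as np
from scipy.optimize import linprog

def local_lp_bound(unary, edges):
    """Upper bound on the MAP value from the first-order LOCAL(G) relaxation.
    unary: {i: [score per state]}; edges: {(i, j): K x K table}.
    LP variables: node marginals mu_i(x_i) and edge marginals mu_ij(x_i, x_j)."""
    varz = sorted(unary)
    K = len(unary[varz[0]])
    idx = {}                                    # flat indexing of LP variables
    for i in varz:
        for v in range(K):
            idx[(i, v)] = len(idx)
    for e in edges:
        for a in range(K):
            for b in range(K):
                idx[(e, a, b)] = len(idx)
    n = len(idx)
    c = np.zeros(n)                             # linprog minimizes, so negate
    for i in varz:
        for v in range(K):
            c[idx[(i, v)]] = -unary[i][v]
    for e, t in edges.items():
        for a in range(K):
            for b in range(K):
                c[idx[(e, a, b)]] = -t[a][b]
    A_eq, b_eq = [], []
    for i in varz:                              # normalization: sum mu_i = 1
        row = np.zeros(n)
        for v in range(K):
            row[idx[(i, v)]] = 1.0
        A_eq.append(row); b_eq.append(1.0)
    for (i, j) in edges:                        # marginalization consistency
        for v in range(K):
            row = np.zeros(n)
            for b in range(K):
                row[idx[((i, j), v, b)]] = 1.0
            row[idx[(i, v)]] = -1.0
            A_eq.append(row); b_eq.append(0.0)
            row = np.zeros(n)
            for a in range(K):
                row[idx[((i, j), a, v)]] = 1.0
            row[idx[(j, v)]] = -1.0
            A_eq.append(row); b_eq.append(0.0)
    res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=[(0, 1)] * n, method="highs")
    return -res.fun
```

On tree-structured graphs this relaxation is tight (its optimum equals the MAP value); on loopy graphs it can be fractional, which is what cluster- and separation-based tightening methods like this paper's address.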