Results 1–10 of 68
FusionFlow: Discrete-Continuous Optimization for Optical Flow Estimation
, 2008
"... Accurate estimation of optical flow is a challenging task, which often requires addressing difficult energy optimization problems. To solve them, most topperforming methods rely on continuous optimization algorithms. The modeling accuracy of the energy in this case is often traded for its tractabil ..."
Abstract

Cited by 57 (7 self)
Accurate estimation of optical flow is a challenging task, which often requires addressing difficult energy optimization problems. To solve them, most top-performing methods rely on continuous optimization algorithms. The modeling accuracy of the energy in this case is often traded for its tractability. This is in contrast to the related problem of narrow-baseline stereo matching, where the top-performing methods employ powerful discrete optimization algorithms such as graph cuts and message-passing to optimize highly non-convex energies. In this paper, we demonstrate how similar non-convex energies can be formulated and optimized discretely in the context of optical flow estimation. Starting with a set of candidate solutions that are produced by fast continuous flow estimation algorithms, the proposed method iteratively fuses these candidate solutions by the computation of minimum cuts on graphs. The obtained continuous-valued fusion result is then further improved using local gradient descent. Experimentally, we demonstrate that the proposed energy is an accurate model and that the proposed discrete-continuous optimization scheme not only finds lower energy solutions than traditional discrete or continuous optimization techniques, but also leads to flow estimates that outperform the current state-of-the-art.
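As a toy illustration of the fusion idea, the sketch below performs a single fusion move on a 1-D flow signal: each pixel keeps one of two candidate flow values so that a data term plus pairwise smoothness is minimized. The paper solves this binary labeling on 2-D grids with minimum cuts; on a chain, exact dynamic programming suffices, so this dependency-free sketch (function names and cost functions are invented for illustration) captures the move without a graph-cut library.

```python
def fuse(cand_a, cand_b, data_cost, smooth):
    """One fusion move: per pixel p, keep cand_a[p] or cand_b[p] so that
    sum_p data_cost(p, f_p) + sum_p smooth(f_p, f_{p+1}) is minimal.
    Exact on a 1-D chain via dynamic programming over the binary choice."""
    n = len(cand_a)
    cands = list(zip(cand_a, cand_b))
    # dp[p][b]: best energy of a prefix ending with pixel p taking candidate b
    dp = [[data_cost(0, cands[0][0]), data_cost(0, cands[0][1])]]
    back = [[0, 0]]
    for p in range(1, n):
        row, brow = [], []
        for b in (0, 1):
            best, arg = min(
                (dp[p - 1][a] + smooth(cands[p - 1][a], cands[p][b]), a)
                for a in (0, 1))
            row.append(best + data_cost(p, cands[p][b]))
            brow.append(arg)
        dp.append(row)
        back.append(brow)
    # backtrack the minimizing choice sequence
    b = 0 if dp[-1][0] <= dp[-1][1] else 1
    fused = [0.0] * n
    for p in range(n - 1, -1, -1):
        fused[p] = cands[p][b]
        b = back[p][b]
    return fused
```

With an observed signal that jumps from 0 to 5 and the two constant candidates 0 and 5, the move stitches the candidates together at the discontinuity, paying one truncated smoothness penalty.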
A Comparative Study of Modern Inference Techniques for Discrete Energy Minimization Problems
"... Seven years ago, Szeliski et al. published an influential study on energy minimization methods for Markov random fields (MRF). This study provided valuable insights in choosing the best optimization technique for certain classes of problems. While these insights remain generally useful today, the ph ..."
Abstract

Cited by 48 (13 self)
Seven years ago, Szeliski et al. published an influential study on energy minimization methods for Markov random fields (MRF). This study provided valuable insights in choosing the best optimization technique for certain classes of problems. While these insights remain generally useful today, the phenomenal success of random field models means that the kinds of inference problems we solve have changed significantly. Specifically, the models today often include higher-order interactions, flexible connectivity structures, large label spaces of different cardinalities, or learned energy tables. To reflect these changes, we provide a modernized and enlarged study. We present an empirical comparison of 24 state-of-the-art techniques on a corpus of 2,300 energy minimization instances from 20 diverse computer vision applications. To ensure reproducibility, we evaluate all methods in the OpenGM2 framework and report extensive results regarding runtime and solution quality. Key insights from our study agree with the results of Szeliski et al. for the types of models they studied. However, on new and challenging types of models our findings disagree and suggest that polyhedral methods and integer programming solvers are competitive in terms of runtime and solution quality over a large range of model types.
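To make "energy minimization instance" concrete, here is a minimal, hypothetical instance in the unary-plus-pairwise table form such benchmarks use, minimized by brute force. Enumeration is feasible only at toy scale; the surveyed solvers exist precisely because it does not scale.

```python
from itertools import product

def minimize(num_vars, num_labels, unary, pairwise):
    """unary[i][l]: cost of variable i taking label l.
    pairwise[(i, j)][li][lj]: cost table for the pair (i, j).
    Returns (energy, labeling) by exhaustive enumeration."""
    best = (float("inf"), None)
    for x in product(range(num_labels), repeat=num_vars):
        e = sum(unary[i][x[i]] for i in range(num_vars))
        e += sum(tab[x[i]][x[j]] for (i, j), tab in pairwise.items())
        if e < best[0]:
            best = (e, x)
    return best
```

For example, a 3-variable binary chain with a Potts penalty of 0.5 between neighbors trades one label disagreement against the unary preferences.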
Parallel and Distributed Graph Cuts by Dual Decomposition
, 2010
"... Graph cuts methods are at the core of many stateoftheart algorithms in computer vision due to their efficiency in computing globally optimal solutions. In this paper, we solve the maximum flow/minimum cut problem in parallel by splitting the graph into multiple parts and hence, further increase th ..."
Abstract

Cited by 25 (3 self)
Graph cuts methods are at the core of many state-of-the-art algorithms in computer vision due to their efficiency in computing globally optimal solutions. In this paper, we solve the maximum flow/minimum cut problem in parallel by splitting the graph into multiple parts and hence further increase the computational efficiency of graph cuts. Optimality of the solution is guaranteed by dual decomposition, or more specifically, the solutions to the subproblems are constrained to be equal on the overlap with dual variables. We demonstrate that our approach both allows (i) faster processing on multi-core computers and (ii) the capability to handle larger problems by splitting the graph across multiple computers on a distributed network. Even though our approach does not give a theoretical guarantee of speedup, an extensive empirical evaluation on several applications with many different data sets consistently shows good performance. An open source implementation of the dual decomposition method is also made publicly available.
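A stripped-down sketch of the dual decomposition principle, assuming a tiny binary chain rather than a max-flow graph: the chain is split into two subchains sharing one node, each copy takes half of the shared node's unary cost plus or minus a dual term, and a subgradient step on the dual variable pushes the two copies toward agreement. All names here are illustrative; the paper applies the same idea with min-cut subproblems on graph parts.

```python
from itertools import product

def solve_chain(unary, pairwise):
    """Exact brute-force sub-solver for a tiny binary chain with a
    Potts-style penalty `pairwise` on neighboring disagreements."""
    n = len(unary)
    best = (float("inf"), None)
    for x in product((0, 1), repeat=n):
        e = sum(unary[i][x[i]] for i in range(n))
        e += sum(pairwise * (x[i] != x[i + 1]) for i in range(n - 1))
        if e < best[0]:
            best = (e, x)
    return best[1]

def dual_decompose(unary, pairwise, split, steps=50, t=0.5):
    """Split the chain at node `split` into two overlapping subchains
    and coordinate them with a scalar dual variable on the shared node."""
    lam = 0.0
    for _ in range(steps):
        u_left = [list(u) for u in unary[:split + 1]]
        u_right = [list(u) for u in unary[split:]]
        # each copy gets half the shared unary, +/- the dual term
        u_left[-1] = [unary[split][l] / 2 + lam * l for l in (0, 1)]
        u_right[0] = [unary[split][l] / 2 - lam * l for l in (0, 1)]
        xl = solve_chain(u_left, pairwise)
        xr = solve_chain(u_right, pairwise)
        if xl[-1] == xr[0]:          # copies agree: consistent labeling
            return xl + xr[1:]
        lam += t * (xl[-1] - xr[0])  # subgradient step on the dual
    return None                      # duality gap not closed in `steps`
```

On a 4-node chain whose first half prefers label 0 and second half label 1, the two subproblems agree and the stitched labeling matches the global optimum.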
A Quantitative Analysis of Current Practices in Optical Flow Estimation and the Principles Behind Them
 INT J COMPUT VIS
, 2013
"... The accuracy of optical flow estimation algorithms has been improving steadily as evidenced by results on the Middlebury optical flow benchmark. The typical formulation, however, has changed little since the work of Horn and Schunck. We attempt to uncover what has made recent advances possible throu ..."
Abstract

Cited by 25 (6 self)
The accuracy of optical flow estimation algorithms has been improving steadily, as evidenced by results on the Middlebury optical flow benchmark. The typical formulation, however, has changed little since the work of Horn and Schunck. We attempt to uncover what has made recent advances possible through a thorough analysis of how the objective function, the optimization method, and modern implementation practices influence accuracy. We discover that "classical" flow formulations perform surprisingly well when combined with modern optimization and implementation techniques. One key implementation detail is the median filtering of intermediate flow fields during optimization. While this improves the robustness of classical methods, it actually leads to higher energy solutions, meaning that these methods are not optimizing the original objective function. To understand the principles behind this phenomenon, we derive a new objective function that formalizes the median filtering heuristic. This objective function includes a non-local smoothness term that robustly integrates flow estimates over large spatial neighborhoods.
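The median filtering heuristic itself is simple. A 1-D stand-in (real implementations filter the 2-D u and v flow fields, typically with a 5x5 window, between warping iterations) might look like:

```python
from statistics import median

def median_filter_flow(flow, radius=1):
    """Median-filter a 1-D flow field with clamped borders: each value
    is replaced by the median of its neighborhood, suppressing outliers
    at the cost of no longer descending the original energy."""
    n = len(flow)
    return [median(flow[max(0, i - radius):min(n, i + radius + 1)])
            for i in range(n)]
```

A single outlier in an otherwise constant field is removed entirely, which is exactly the robustness the paper observes.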
Progressive Graph Matching: Making a Move of Graphs via Probabilistic Voting
"... Graph matching is widely used in a variety of scientific fields, including computer vision, due to its powerful performance, robustness, and generality. Its computational complexity, however, limits the permissible size of input graphs in practice. Therefore, in realworld applications, the initial ..."
Abstract

Cited by 21 (2 self)
Graph matching is widely used in a variety of scientific fields, including computer vision, due to its powerful performance, robustness, and generality. Its computational complexity, however, limits the permissible size of input graphs in practice. Therefore, in real-world applications, the initial construction of graphs to match becomes a critical factor for the matching performance, and often leads to unsatisfactory results. In this paper, to resolve the issue, we propose a novel progressive framework which combines probabilistic progression of graphs with matching of graphs. The algorithm efficiently re-estimates in a Bayesian manner the most plausible target graphs based on the current matching result, and guarantees to boost the matching objective at the subsequent graph matching. Experimental evaluation demonstrates that our approach effectively handles the limits of conventional graph matching and achieves significant improvement in challenging image matching problems.
TriangleFlow: Optical Flow with Triangulation-based Higher-Order Likelihoods
"... Abstract. We use a simple yet powerful higherorder conditional random field (CRF) to model optical flow. It consists of a standard photoconsistency cost and a prior on affine motions both modeled in terms of higherorder potential functions. Reasoning jointly over a large set of unknown variables p ..."
Abstract

Cited by 16 (2 self)
We use a simple yet powerful higher-order conditional random field (CRF) to model optical flow. It consists of a standard photo-consistency cost and a prior on affine motions, both modeled in terms of higher-order potential functions. Reasoning jointly over a large set of unknown variables provides more reliable motion estimates and a robust matching criterion. One of the main contributions is that, unlike previous region-based methods, we omit the assumption of constant flow. Instead, we consider local affine warps whose likelihood energy can be computed exactly without approximations. This results in a tractable, so-called higher-order likelihood function. We realize this idea by employing triangulation meshes, which immensely reduce the complexity of the problem. Optimization is performed by hierarchical QPBO moves and an adaptive mesh refinement strategy. Experiments show that we achieve high-quality motion fields on several data sets, including the Middlebury optical flow database.
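The local affine warps at the heart of the model are determined by how a triangle's three vertices move. A self-contained sketch (Cramer's rule is enough for the two 3x3 systems involved; function names are illustrative, not from the paper's code) that recovers the six affine parameters from one triangle correspondence:

```python
def _solve3(M, b):
    """Cramer's-rule solve of a 3x3 linear system (fine at this size)."""
    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))
    d = det(M)
    return [det([[b[i] if k == j else M[i][k] for k in range(3)]
                 for i in range(3)]) / d for j in range(3)]

def affine_from_triangle(src, dst):
    """Affine map (a, b, tx, c, d, ty) sending the 3 src vertices to dst:
    x' = a*x + b*y + tx,  y' = c*x + d*y + ty."""
    M = [[x, y, 1.0] for (x, y) in src]
    a, b_, tx = _solve3(M, [p[0] for p in dst])
    c, d, ty = _solve3(M, [p[1] for p in dst])
    return a, b_, tx, c, d, ty
```

A triangle translated rigidly by (2, 1) yields the identity linear part with translation (2, 1), as expected.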
Markov Random Field Modeling, Inference & Learning in Computer Vision & Image Understanding: A Survey
, 2013
"... ..."
Dynamic Programming and Graph Algorithms in Computer Vision
"... Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting, since by carefully exploiting problem structure they often provide nontrivial gua ..."
Abstract

Cited by 14 (0 self)
Optimization is a powerful paradigm for expressing and solving problems in a wide range of areas, and has been successfully applied to many vision problems. Discrete optimization techniques are especially interesting, since by carefully exploiting problem structure they often provide non-trivial guarantees concerning solution quality. In this paper we briefly review dynamic programming and graph algorithms, and discuss representative examples of how these discrete optimization techniques have been applied to some classical vision problems. We focus on the low-level vision problem of stereo, the mid-level problem of interactive object segmentation, and the high-level problem of model-based recognition.
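As a concrete instance of the low-level stereo case, here is a toy scanline stereo matcher by dynamic programming: one disparity per pixel, balancing an absolute-difference matching cost against a penalty on disparity changes between neighbors. This is illustrative only; practical systems add occlusion handling and more robust costs.

```python
def scanline_stereo(left, right, max_disp, smooth=1.0):
    """DP stereo on one scanline: choose disparity d per pixel x to
    minimize |left[x] - right[x - d]| plus smooth * |d - d_prev|."""
    n = len(left)
    INF = float("inf")
    def match(x, d):
        return abs(left[x] - right[x - d]) if x - d >= 0 else INF
    dp = [[match(0, d) for d in range(max_disp + 1)]] + \
         [[INF] * (max_disp + 1) for _ in range(n - 1)]
    back = [[0] * (max_disp + 1) for _ in range(n)]
    for x in range(1, n):
        for d in range(max_disp + 1):
            prev, arg = min((dp[x - 1][pd] + smooth * abs(d - pd), pd)
                            for pd in range(max_disp + 1))
            dp[x][d] = prev + match(x, d)
            back[x][d] = arg
    # backtrack from the best final disparity
    d = min(range(max_disp + 1), key=lambda dd: dp[-1][dd])
    disp = [0] * n
    for x in range(n - 1, -1, -1):
        disp[x] = d
        d = back[x][d]
    return disp
```

On a right scanline shifted by one pixel, the matcher recovers disparity 1 everywhere it is observable and falls back to 0 at the unmatched border pixel.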
Generalized roof duality and bisubmodular functions
, 2010
"... Consider a convex relaxation ˆ f of a pseudoboolean function f. We say that the relaxation is totally halfintegral if ˆ f(x) is a polyhedral function with halfintegral extreme points x, and this property is preserved after adding an arbitrary combination of constraints of the form xi = xj, xi = 1 ..."
Abstract

Cited by 12 (1 self)
Consider a convex relaxation f̂ of a pseudo-Boolean function f. We say that the relaxation is totally half-integral if f̂(x) is a polyhedral function with half-integral extreme points x, and this property is preserved after adding an arbitrary combination of constraints of the form x_i = x_j, x_i = 1 − x_j, and x_i = γ, where γ ∈ {0, 1/2, 1} is a constant. A well-known example is the roof duality relaxation for quadratic pseudo-Boolean functions f. We argue that total half-integrality is a natural requirement for generalizations of roof duality to arbitrary pseudo-Boolean functions. Our contributions are as follows. First, we provide a complete characterization of totally half-integral relaxations f̂ by establishing a one-to-one correspondence with bisubmodular functions. Second, we give a new characterization of bisubmodular functions. Finally, we show some relationships between general totally half-integral relaxations and relaxations based on the roof duality.
Minimizing Energies with Hierarchical Costs
 INTERNATIONAL JOURNAL OF COMPUTER VISION
, 2012
"... Computer vision is full of problems elegantly expressed in terms of energy minimization. We characterize a class of energies with hierarchical costs and propose a novel hierarchical fusion algorithm. Hierarchical costs are natural for modeling an array of difficult problems. For example, in semantic ..."
Abstract

Cited by 10 (1 self)
Computer vision is full of problems elegantly expressed in terms of energy minimization. We characterize a class of energies with hierarchical costs and propose a novel hierarchical fusion algorithm. Hierarchical costs are natural for modeling an array of difficult problems. For example, in semantic segmentation one could rule out unlikely object combinations via hierarchical context. In geometric model estimation, one could penalize the number of unique model families in a solution, not just the number of models: a kind of hierarchical MDL criterion. Hierarchical fusion uses the well-known α-expansion algorithm as a subroutine, and offers a much better approximation bound in important cases.