Results 11–20 of 179
FusionFlow: Discrete-Continuous Optimization for Optical Flow Estimation
, 2008
Abstract

Cited by 58 (7 self)
Accurate estimation of optical flow is a challenging task, which often requires addressing difficult energy optimization problems. To solve them, most top-performing methods rely on continuous optimization algorithms. The modeling accuracy of the energy in this case is often traded for its tractability. This is in contrast to the related problem of narrow-baseline stereo matching, where the top-performing methods employ powerful discrete optimization algorithms such as graph cuts and message-passing to optimize highly non-convex energies. In this paper, we demonstrate how similar non-convex energies can be formulated and optimized discretely in the context of optical flow estimation. Starting with a set of candidate solutions that are produced by fast continuous flow estimation algorithms, the proposed method iteratively fuses these candidate solutions by the computation of minimum cuts on graphs. The obtained continuous-valued fusion result is then further improved using local gradient descent. Experimentally, we demonstrate that the proposed energy is an accurate model and that the proposed discrete-continuous optimization scheme not only finds lower energy solutions than traditional discrete or continuous optimization techniques, but also leads to flow estimates that outperform the current state-of-the-art.
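The fusion step the abstract describes can be illustrated on a toy 1-D problem (this is not the authors' code; the candidate fields, unary cost, and smoothness weight below are invented for illustration). Fusing two candidates is a binary labeling problem; on a chain it is solved exactly by dynamic programming, while FusionFlow solves the 2-D analogue with minimum cuts on graphs.

```python
# Toy "fusion move": given two candidate 1-D flow fields, pick per pixel
# which candidate to keep so that a unary + pairwise smoothness energy is
# minimized. On a chain this binary choice is exactly solvable by DP.

def fuse(cand_a, cand_b, unary, smooth_weight=1.0):
    """Return the minimum-energy per-pixel mix of the two candidates."""
    n = len(cand_a)
    cands = (cand_a, cand_b)
    INF = float("inf")
    # cost[i][s]: best energy of pixels 0..i with pixel i taking candidate s
    cost = [[INF, INF] for _ in range(n)]
    back = [[0, 0] for _ in range(n)]
    for s in (0, 1):
        cost[0][s] = unary(0, cands[s][0])
    for i in range(1, n):
        for s in (0, 1):
            u = unary(i, cands[s][i])
            for p in (0, 1):
                pair = smooth_weight * abs(cands[s][i] - cands[p][i - 1])
                c = cost[i - 1][p] + u + pair
                if c < cost[i][s]:
                    cost[i][s] = c
                    back[i][s] = p
    # backtrack the optimal per-pixel choice
    s = 0 if cost[n - 1][0] <= cost[n - 1][1] else 1
    labels = [s]
    for i in range(n - 1, 0, -1):
        s = back[i][s]
        labels.append(s)
    labels.reverse()
    return [cands[s][i] for i, s in enumerate(labels)]

# Two candidates: one fits the left half of the signal, one the right half.
observed = [0.0, 0.0, 0.0, 5.0, 5.0, 5.0]
fused = fuse([0.0] * 6, [5.0] * 6, lambda i, v: (v - observed[i]) ** 2)
print(fused)  # [0.0, 0.0, 0.0, 5.0, 5.0, 5.0] -- the best mix of both
```

Neither candidate alone achieves energy 5 here (each pays 75 in data cost); the fused result combines their good halves, which is exactly why iterating such moves over many candidate solutions is effective.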
Concise Integer Linear Programming Formulations for Dependency Parsing
, 2009
Abstract

Cited by 56 (9 self)
We formulate the problem of non-projective dependency parsing as a polynomial-sized integer linear program. Our formulation is able to handle non-local output features in an efficient manner; not only is it compatible with prior knowledge encoded as hard constraints, it can also learn soft constraints from data. In particular, our model is able to learn correlations among neighboring arcs (siblings and grandparents), word valency, and tendencies toward nearly-projective parses. The model parameters are learned in a max-margin framework by employing a linear programming relaxation. We evaluate the performance of our parser on data in several natural languages, achieving improvements over existing state-of-the-art methods.
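To make the search space concrete: a sketch of the arc-factored core of dependency parsing, with hypothetical arc scores for a three-word toy sentence (this is exhaustive search, not the paper's ILP, which encodes the same space with arc indicator variables and adds non-local sibling/grandparent features).

```python
# Arc-factored dependency parsing by exhaustive search over head
# assignments (toy scores, not the paper's ILP). Each word picks a head
# (0 = artificial root); a parse is valid if the assignment is a tree.
from itertools import product

def reaches_root(heads, i):
    seen = set()
    while i != 0:
        if i in seen:            # cycle detected
            return False
        seen.add(i)
        i = heads[i - 1]
    return True

def best_parse(n_words, score):
    """Return (best_score, heads) over all arborescences on 1..n_words."""
    best = (float("-inf"), None)
    for heads in product(range(n_words + 1), repeat=n_words):
        if any(heads[i - 1] == i for i in range(1, n_words + 1)):
            continue             # no self-loops
        if any(not reaches_root(heads, i) for i in range(1, n_words + 1)):
            continue             # every word must reach the root
        total = sum(score(heads[i - 1], i) for i in range(1, n_words + 1))
        if total > best[0]:
            best = (total, heads)
    return best

# Hypothetical scores for "John saw Mary" (1=John, 2=saw, 3=Mary):
# the verb attaches to the root and heads both nouns.
S = {(0, 2): 10, (2, 1): 8, (2, 3): 8, (0, 1): 2, (0, 3): 2}
score, heads = best_parse(3, lambda h, d: S.get((h, d), -100))
print(score, heads)  # 26 (2, 0, 2): "saw" heads "John" and "Mary"
```

Exhaustive search is exponential in sentence length; the contribution of the paper is precisely that the ILP stays polynomial-sized while also scoring non-local structures that arc-factored models cannot express.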
Track to the Future: Spatio-temporal Video Segmentation with Long-range Motion Cues
Abstract

Cited by 53 (2 self)
Video provides not only rich visual cues such as motion and appearance, but also much less explored long-range temporal interactions among objects. We aim to capture such interactions and to construct a powerful intermediate-level video representation for subsequent recognition. Motivated by this goal, we seek to obtain a spatio-temporal oversegmentation of a video into regions that respect object boundaries and, at the same time, associate object pixels over many video frames. The contributions of this paper are twofold. First, we develop an efficient spatio-temporal video segmentation algorithm, which naturally incorporates long-range motion cues from the past and future frames in the form of clusters of point tracks with coherent motion. Second, we devise a new track clustering cost function that includes occlusion reasoning, in the form of depth ordering constraints, as well as motion similarity along the tracks. We evaluate the proposed approach on a challenging set of video sequences of office scenes from feature-length movies.
Half-integrality based algorithms for cosegmentation of images
 In CVPR
, 2009
Abstract

Cited by 52 (4 self)
We study the cosegmentation problem, where the objective is to segment the same object (i.e., region) from a pair of images. The segmentation for each image can be cast using a partitioning/segmentation function with an additional constraint that seeks to make the histograms of the segmented regions (based on intensity and texture features) similar. Using Markov Random Field (MRF) energy terms for the simultaneous segmentation of the images together with histogram consistency requirements using the squared L2 (rather than L1) distance, after linearization and adjustments, yields an optimization model with some interesting combinatorial properties. We discuss these properties, which are closely related to certain relaxation strategies recently introduced in computer vision. Finally, we show experimental results of the proposed approach.
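The coupled energy the abstract describes can be written down directly for a tiny example (the images, unary costs, and weight below are illustrative; the paper's contribution is optimizing a linearized version of this coupling at scale, not the brute force used here): per-pixel unary terms for each image plus a squared-L2 penalty between the two foreground intensity histograms.

```python
# Brute-force evaluation of a heavily simplified cosegmentation energy:
# unary MRF terms for each image plus a squared-L2 penalty between the
# foreground histograms of the two images. Tiny 4-pixel binary images,
# so we can enumerate all 16 x 16 segmentation pairs exactly.
from itertools import product

def histogram(img, mask, bins=2):
    h = [0] * bins
    for v, m in zip(img, mask):
        if m:
            h[v] += 1
    return h

def energy(img1, img2, m1, m2, unary, lam=1.0):
    e = sum(unary(v, m) for v, m in zip(img1, m1))
    e += sum(unary(v, m) for v, m in zip(img2, m2))
    h1, h2 = histogram(img1, m1), histogram(img2, m2)
    return e + lam * sum((a - b) ** 2 for a, b in zip(h1, h2))

# Bright pixels (1) prefer foreground, dark pixels (0) prefer background.
unary = lambda v, m: 0 if m == v else 1
img1, img2 = [0, 1, 1, 0], [1, 1, 0, 0]
best = min(
    (energy(img1, img2, m1, m2, unary), m1, m2)
    for m1 in product((0, 1), repeat=4)
    for m2 in product((0, 1), repeat=4)
)
print(best)  # (0, (0, 1, 1, 0), (1, 1, 0, 0))
```

The optimum segments the bright pixels in both images, making the two foreground histograms identical so the consistency term vanishes; the squared L2 (rather than L1) choice is what gives the linearized model the half-integrality properties the paper exploits.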
Cosegmentation revisited: models and optimization
 ECCV
, 2010
Abstract

Cited by 50 (1 self)
The problem of cosegmentation consists of segmenting the same object (or objects of the same class) in two or more distinct images. Recently a number of different models have been proposed for this problem. However, no comparison of such models and corresponding optimization techniques has been done so far. We analyze three existing models: the L1 norm model of Rother et al. [1], the L2 norm model of Mukherjee et al. [2] and the “reward” model of Hochbaum and Singh [3]. We also study a new model, which is a straightforward extension of the Boykov-Jolly model for single image segmentation [4]. In terms of optimization, we use a Dual Decomposition (DD) technique in addition to the optimization methods in [1,2]. Experiments show a significant improvement of DD over published methods. Our main conclusion, however, is that the new model is the best overall because it (i) has the fewest parameters, (ii) is the most robust in practice, and (iii) can be optimized well with an efficient EM-style procedure.
Exact Inference in Multi-label CRFs with Higher Order Cliques
, 2008
Abstract

Cited by 49 (11 self)
This paper addresses the problem of exactly inferring the maximum a posteriori solutions of discrete multi-label MRFs or CRFs with higher order cliques. We present a framework to transform special classes of multi-label higher order functions to submodular second order boolean functions (referred to as F²s), which can be minimized exactly using graph cuts, and we characterize those classes. The basic idea is to use two or more boolean variables to encode the states of a single multi-label variable. There are many ways in which this can be done and much interesting research lies in finding ways which are optimal or minimal in some sense. We study the space of possible encodings and find the ones that can transform the most general class of functions to F²s. Our main contributions are twofold. First, we extend the subclass of submodular energy functions that can be minimized exactly using graph cuts. Second, we show how higher order potentials can be used to improve single view 3D reconstruction results. We believe that our work on exact minimization of higher order energy functions will lead to similar improvements in solutions of other labelling problems.
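The basic encoding idea can be shown in its simplest, widely used form (a sketch only; the paper studies the whole space of encodings and which ones keep the transformed energy submodular, not just this one): the "unary" encoding represents a k-label variable with k−1 booleans b[i] = [x > i] under a monotonicity constraint.

```python
# Sketch of encoding one k-label variable with boolean variables, using
# the simple "unary" encoding b[i] = [x > i] (k-1 booleans with the
# constraint that the bits are non-increasing: a run of 1s then 0s).

def encode(x, k):
    """Unary-encode label x in {0..k-1} as k-1 booleans b[i] = [x > i]."""
    return [1 if x > i else 0 for i in range(k - 1)]

def decode(bits):
    """The number of 1s recovers the label (assumes monotone bits)."""
    return sum(bits)

def is_valid(bits):
    """Encoding constraint: bits must be non-increasing."""
    return all(bits[i] >= bits[i + 1] for i in range(len(bits) - 1))

k = 4
for x in range(k):
    bits = encode(x, k)
    assert is_valid(bits) and decode(bits) == x
    print(x, bits)
# 0 [0, 0, 0]
# 1 [1, 0, 0]
# 2 [1, 1, 0]
# 3 [1, 1, 1]
```

A multi-label energy rewritten over these booleans becomes a pairwise boolean function; whether that function is submodular (and hence graph-cut minimizable) depends on both the original energy and the chosen encoding, which is the design space the paper explores.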
Dense Non-rigid Surface Registration Using High-Order Graph Matching
Abstract

Cited by 49 (11 self)
In this paper, we propose a high-order graph matching formulation to address non-rigid surface matching. The singleton terms capture the geometric and appearance similarities (e.g., curvature and texture) while the high-order terms model the intrinsic embedding energy. The novelty of this paper includes: 1) casting 3D surface registration into a graph matching problem that combines both geometric and appearance similarities and intrinsic embedding information, 2) the first implementation of a high-order graph matching algorithm that solves a non-convex optimization problem, and 3) an efficient two-stage optimization approach to constrain the search space for dense surface registration. Our method is validated through a series of experiments demonstrating its accuracy and efficiency, notably in challenging cases of large and/or non-isometric deformations, or meshes that are partially occluded.
The Complexity of Soft Constraint Satisfaction
, 2006
Abstract

Cited by 44 (13 self)
Over the past few years there has been considerable progress in methods to systematically analyse the complexity of constraint satisfaction problems with specified constraint types. One very powerful theoretical development in this area links the complexity of a set of constraints to a corresponding set of algebraic operations, known as polymorphisms. In this paper we extend the analysis of complexity to the more general framework of combinatorial optimisation problems expressed using various forms of soft constraints. We launch a systematic investigation of the complexity of these problems by extending the notion of a polymorphism to a more general algebraic operation, which we call a multimorphism. We show that many tractable sets of soft constraints, both established and novel, can be characterised by the presence of particular multimorphisms. We also show that a simple set of NP-hard constraints has very restricted multimorphisms. Finally, we use the notion of multimorphism to give a complete classification of complexity for the Boolean case which extends several earlier classification results for particular special cases.
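One concrete instance of the condition the abstract refers to: a cost function f admits the ⟨min, max⟩ multimorphism when f(min(x,y)) + f(max(x,y)) ≤ f(x) + f(y) for all pairs of assignments (min/max taken componentwise), which for Boolean domains is exactly submodularity, one of the tractable classes characterized this way. The two example functions below are chosen for illustration.

```python
# Brute-force check of the <min, max> multimorphism condition
#   f(min(x, y)) + f(max(x, y)) <= f(x) + f(y)
# over all pairs of boolean assignments (min/max componentwise).
# Holding for all pairs is equivalent to f being submodular.
from itertools import product

def has_min_max_multimorphism(f, n):
    for x in product((0, 1), repeat=n):
        for y in product((0, 1), repeat=n):
            lo = tuple(map(min, x, y))
            hi = tuple(map(max, x, y))
            if f(lo) + f(hi) > f(x) + f(y):
                return False
    return True

cut = lambda x: abs(x[0] - x[1])   # cut-type cost: submodular
conj = lambda x: x[0] * x[1]       # conjunction: supermodular, not submodular
print(has_min_max_multimorphism(cut, 2))   # True
print(has_min_max_multimorphism(conj, 2))  # False
```

The conjunction fails at x = (1, 0), y = (0, 1): f(0,0) + f(1,1) = 1 exceeds f(x) + f(y) = 0. The paper's point is that the presence or absence of such algebraic operations determines whether an entire set of soft constraints is tractable or NP-hard.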
New inference rules for Max-SAT
 JAIR
, 2007
Abstract

Cited by 42 (9 self)
Exact Max-SAT solvers, compared with SAT solvers, apply little inference at each node of the proof tree. Commonly used SAT inference rules like unit propagation produce a simplified formula that preserves satisfiability but, unfortunately, solving the Max-SAT problem for the simplified formula is not equivalent to solving it for the original formula. In this paper, we define a number of original inference rules that, besides being efficiently applicable, transform Max-SAT instances into equivalent Max-SAT instances which are easier to solve. The soundness of the rules, which can be seen as refinements of unit resolution adapted to Max-SAT, is proved in a novel and simple way via an integer programming transformation. Aiming to find out how powerful the inference rules are in practice, we have developed a new Max-SAT solver, called MaxSatz, which incorporates those rules, and performed an experimental investigation. The results obtained provide empirical evidence that MaxSatz is very competitive and greatly outperforms the best state-of-the-art Max-SAT solvers on random Max-2SAT, random Max-3SAT, Max-Cut, and Graph 3-coloring instances, as well as benchmarks submitted to the Max-SAT Evaluation 2006.
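The abstract's central caveat, that satisfiability-preserving inference can change the Max-SAT optimum, is easy to verify on a three-clause example (a minimal illustration, not taken from the paper): on {(x), (¬x), (¬x)}, unit propagation would fix x = true because of the unit clause (x), yet that assignment falsifies two clauses while the true optimum falsifies only one.

```python
# Counting falsified clauses to show that unit propagation is unsound
# for Max-SAT. Clauses are lists of literals; +i means variable i,
# -i its negation; an assignment maps variable index to 0 or 1.

def falsified(clauses, assignment):
    """Count clauses with no true literal under the assignment."""
    def lit_true(lit):
        value = assignment[abs(lit)]
        return value == 1 if lit > 0 else value == 0
    return sum(not any(lit_true(l) for l in clause) for clause in clauses)

# {(x1), (-x1), (-x1)}: unit propagation on (x1) would fix x1 = 1.
clauses = [[1], [-1], [-1]]
print(falsified(clauses, {1: 1}))  # 2 -- cost after unit propagation
print(falsified(clauses, {1: 0}))  # 1 -- the true Max-SAT optimum
```

This is why the paper's rules are designed to produce *equivalent* Max-SAT instances (same cost for every assignment, modulo recorded empty clauses) rather than merely equisatisfiable ones.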