Results 1-10 of 32
Geometric structure of degeneracy for multibody motion segmentation
 In Workshop on Statistical Methods in Video Processing
, 2004
Abstract

Cited by 31 (0 self)
Abstract. Many techniques have been proposed for segmenting feature point trajectories tracked through a video sequence into independent motions. It has been found, however, that methods that perform very well in simulations perform very poorly for real video sequences. This paper resolves this mystery by analyzing the geometric structure of the degeneracy of the motion model. This leads to a new segmentation algorithm: a multistage unsupervised learning scheme, first using the degenerate motion model and then using the general 3D motion model. We demonstrate by simulated and real video experiments that our method is superior to all existing methods in practical situations.
Registration with uncertainties and statistical modeling of shapes with variable metric kernels
 IEEE Transactions on Pattern Analysis and Machine Intelligence
Statistical optimization for geometric fitting: Theoretical accuracy analysis and high order error analysis
 Int. J. Comput. Vis
, 2008
Abstract

Cited by 18 (12 self)
A rigorous accuracy analysis is given to various techniques for estimating parameters of geometric models from noisy data for computer vision applications. First, it is pointed out that parameter estimation for vision applications is very different in nature from traditional statistical analysis, and hence a different mathematical framework is necessary in this domain. After general theories on estimation and accuracy are given, typical existing techniques are selected, and their accuracy is evaluated up to higher-order terms. This leads to a “hyperaccurate” method that outperforms existing methods.
Interacting multiple model monocular SLAM
 In IEEE International Conference on Robotics and Automation (ICRA)
, 2008
Abstract

Cited by 12 (2 self)
Abstract — Recent work has demonstrated the benefits of adopting a fully probabilistic SLAM approach in sequential motion and structure estimation from an image sequence. Unlike standard Structure from Motion (SFM) methods, this ‘monocular SLAM’ approach is able to achieve drift-free estimation with high frame-rate real-time operation, particularly benefiting from highly efficient active feature search, map management and mismatch rejection. A consistent thread in this research on real-time monocular SLAM has been to reduce the assumptions required. In this paper we move towards the logical conclusion of this direction by implementing a fully Bayesian Interacting Multiple Models (IMM) framework which can switch automatically between parameter sets in a dimensionless formulation of monocular SLAM. Remarkably, our approach of full sequential probability propagation means that there is no need for penalty terms to achieve the Occam property of favouring simpler models — this arises automatically. We successfully tackle the known stiffness of on-the-fly monocular SLAM start-up without known patterns in the scene. The search regions for matches are also reduced in size with respect to a single-model EKF, increasing the rejection of spurious matches. We demonstrate our method with results on a complex real image sequence with varied motion.
On the Representation of Shapes Using Implicit Functions
Abstract

Cited by 10 (1 self)
In this chapter, we explore shape representation, registration and modeling through implicit functions. To this end, we propose novel techniques for global and local registration of shapes through the alignment of the corresponding distance transforms, which consist of defining objective functions that minimize metrics between the implicit representations of shapes. Registration methods...
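The core objective this abstract describes — comparing shapes through their distance-transform (implicit-function) representations — can be sketched generically. The code below is an illustrative toy, not the authors' implementation; the function names and the brute-force transform are our own, and real systems would optimize this cost over a family of transformations:

```python
import numpy as np

def distance_transform(mask):
    """Brute-force Euclidean distance transform of a small binary mask:
    distance from each pixel to the nearest foreground pixel."""
    fg = np.argwhere(mask)                       # foreground pixel coordinates
    yy, xx = np.indices(mask.shape)
    pix = np.stack([yy.ravel(), xx.ravel()], axis=1)
    d = np.sqrt(((pix[:, None, :] - fg[None, :, :]) ** 2).sum(-1)).min(1)
    return d.reshape(mask.shape)

def alignment_cost(mask_a, mask_b):
    """Sum-of-squared-differences between the two implicit (distance
    transform) representations: the kind of objective one would
    minimize over a transformation during registration."""
    return ((distance_transform(mask_a) - distance_transform(mask_b)) ** 2).sum()

m1 = np.zeros((8, 8), dtype=bool); m1[2:4, 2:4] = True
m2 = np.zeros((8, 8), dtype=bool); m2[4:6, 4:6] = True
cost = alignment_cost(m1, m2)   # > 0; zero only for identical shapes
```

Because the distance transform varies smoothly away from the shape boundary, such a cost gives useful gradients even when the shapes do not initially overlap.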
Robust Factorization Methods Using a Gaussian/Uniform Mixture Model
Abstract

Cited by 10 (7 self)
In this paper we address the problem of building a class of robust factorization algorithms that solve for the shape and motion parameters with both affine (weak perspective) and perspective camera models. We introduce a Gaussian/uniform mixture model and its associated EM algorithm. This allows us to address parameter estimation within a data clustering approach. We propose a robust technique that works with any affine factorization method and makes it resilient to outliers. In addition, we show how such a framework can be further embedded into an iterative perspective factorization scheme. We carry out a large number of experiments to validate our algorithms and to compare them with existing ones. We also compare our approach with factorization methods that use M-estimators.
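The Gaussian/uniform mixture idea can be illustrated on scalar residuals: inliers follow a Gaussian, outliers a uniform density, and EM recovers soft inlier weights. This is a toy 1-D sketch of the general technique (our own illustrative code, not the paper's factorization algorithm):

```python
import numpy as np

def gaussian_uniform_em(residuals, span, n_iters=50):
    """EM for a two-component mixture on scalar residuals:
    a Gaussian (inliers) plus a uniform density over `span` (outliers).
    Returns per-point inlier responsibilities and the Gaussian scale."""
    r = np.asarray(residuals, dtype=float)
    pi, sigma = 0.5, np.std(r) + 1e-9        # initial mixing weight and scale
    u_density = 1.0 / span                   # constant outlier density
    for _ in range(n_iters):
        g = pi * np.exp(-0.5 * (r / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
        w = g / (g + (1.0 - pi) * u_density)          # E-step: responsibilities
        pi = w.mean()                                  # M-step: mixture weight
        sigma = np.sqrt((w * r ** 2).sum() / w.sum() + 1e-12)  # M-step: scale
    return w, sigma

rng = np.random.default_rng(0)
res = np.concatenate([rng.normal(0.0, 0.1, 90),     # inlier residuals
                      rng.uniform(-5.0, 5.0, 10)])  # gross outliers
w, sigma = gaussian_uniform_em(res, span=10.0)
```

In a factorization setting the weights `w` would then down-weight outlying feature tracks inside the shape-and-motion solve, which is what makes the clustering view of robustness attractive.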
Exploiting Uncertainty in Random Sample Consensus
Abstract

Cited by 10 (1 self)
In this work, we present a technique for robust estimation which, by explicitly incorporating the inherent uncertainty of the estimation procedure, results in a more efficient robust estimation algorithm. In addition, we build on recent work in randomized model verification, and use this to characterize the ‘non-randomness’ of a solution. The combination of these two strategies results in a robust estimation procedure that provides a significant speedup over existing RANSAC techniques, while requiring no prior information to guide the sampling process. In particular, our algorithm requires, on average, 3-10 times fewer samples than standard RANSAC, which is in close agreement with theoretical predictions. The efficiency of the algorithm is demonstrated on a selection of geometric estimation problems.
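For context, the baseline against which the reported sample reduction is measured is the standard RANSAC trial count, which follows from the usual stopping criterion: draw enough minimal samples that at least one is all-inlier with the desired confidence. A generic sketch (the function name is ours):

```python
import math

def ransac_trials(inlier_ratio, sample_size, confidence=0.99):
    """Smallest number of random minimal samples N such that
    1 - (1 - w^m)^N >= confidence, where w is the inlier ratio
    and m the sample size."""
    p_good = inlier_ratio ** sample_size     # P(one sample is all inliers)
    if p_good >= 1.0:
        return 1
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p_good))

# e.g. 7-point fundamental-matrix samples at a 50% inlier ratio
n = ransac_trials(0.5, 7)
```

The count grows rapidly with the minimal sample size and with contamination, which is why reducing the required number of samples matters in practice.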
Further improving geometric fitting
 Proc. 5th Int. Conf. 3D Digital Imaging and Modeling
, 2005
Abstract

Cited by 6 (4 self)
We give a formal definition of geometric fitting in a way that suits computer vision applications. We point out that the performance of geometric fitting should be evaluated in the limit of small noise rather than in the limit of a large number of data, as recommended in the statistical literature. Taking the KCR lower bound as an optimality requirement and focusing on the linearized constraint case, we compare the accuracy of Kanatani’s renormalization with maximum likelihood (ML) approaches including the FNS of Chojnacki et al. and the HEIV of Leedan and Meer. Our analysis reveals the existence of a method superior to all of these.
Modelling Shapes with Uncertainties: Higher Order Polynomials, Variable Bandwidth Kernels and Non-Parametric Density Estimation
Abstract

Cited by 3 (1 self)
In this paper, we introduce a new technique for shape modelling in the space of implicit polynomials. Registration consists of recovering an optimal one-to-one transformation of a higher-order polynomial, along with uncertainty measures that are determined according to the covariance matrix of the correspondences at the zero iso-surface. In the modelling phase, these measures are used to weight the importance of the training samples according to a variable-bandwidth non-parametric density estimation process. The selection of the most appropriate kernels to represent the training set is done through the maximum likelihood criterion. Excellent results on patterns of digits, for both the registration and the modelling aspects of our approach, demonstrate the potential of our method.
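The variable-bandwidth density estimation step can be illustrated in one dimension: each training sample carries its own kernel width, so reliable samples can be given narrow kernels and uncertain ones wide kernels. A generic sketch (the function name is ours; real shape models live in a much higher-dimensional space):

```python
import numpy as np

def variable_bandwidth_kde(x, samples, bandwidths):
    """1-D Gaussian kernel density estimate in which every training
    sample i carries its own bandwidth h_i (a generic sketch only)."""
    x = np.atleast_1d(np.asarray(x, dtype=float))[:, None]
    s = np.asarray(samples, dtype=float)[None, :]
    h = np.asarray(bandwidths, dtype=float)[None, :]
    k = np.exp(-0.5 * ((x - s) / h) ** 2) / (np.sqrt(2.0 * np.pi) * h)
    return k.mean(axis=1)    # average of the per-sample kernels

# density is high near the samples and falls off away from them
dens = variable_bandwidth_kde([0.0, 3.0], samples=[0.0, 0.2],
                              bandwidths=[0.3, 0.3])
```

In the paper's setting the per-sample bandwidths would come from the registration uncertainties, so that poorly registered training shapes diffuse their influence rather than distorting the model.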
Optimality of maximum likelihood estimation for geometric fitting and the KCR lower bound
 Memoirs Fac. Engin. Okayama Univ
, 2005
Abstract

Cited by 3 (2 self)
Geometric fitting is one of the most fundamental problems of computer vision. In [8], the author derived a theoretical accuracy bound (the KCR lower bound) for geometric fitting in general and proved that maximum likelihood (ML) estimation is statistically optimal. Recently, Chernov and Lesort [3] proved a similar result using a weaker assumption. In this paper, we compare their formulation with the author’s and describe the background of the problem. We also review recent topics, including semiparametric models, and discuss remaining issues.

1. What Is the Problem? By geometric fitting, we mean fitting a geometric constraint to observed data and discerning the underlying geometric structure from the coefficients of the fitted equation [8]. A large class of computer vision problems falls into this framework. The simplest example is fitting a parametric curve (e.g., a line, a circle, an ellipse, or a polynomial curve) of the form $F(\mathbf{x}; \mathbf{u}) = 0$ (1) to $N$ points $\{(x_\alpha, y_\alpha)\}$ in the image, where $\mathbf{x} = (x, y)^\top$ is the position vector and $\mathbf{u} = (u_1, \dots, u_p)^\top$ is the parameter vector. For noisy data $\{(x_\alpha, y_\alpha)\}$, no parameter $\mathbf{u}$ satisfies $F(\mathbf{x}_\alpha; \mathbf{u}) = 0$ for all $\alpha = 1, \dots, N$, so one often computes the $\mathbf{u}$ such that $J_{\mathrm{LS}} = \sum_{\alpha=1}^{N} F(\mathbf{x}_\alpha; \mathbf{u})^2 \to \min$ (2). This is called the least-squares (LS) method or algebraic distance minimization. However, it is widely known that the solution has a strong statistical bias. A better method, known to yield higher accuracy, is to regard the data $\{\mathbf{x}_\alpha\}$ as perturbed from their true positions $\{\bar{\mathbf{x}}_\alpha\}$, which lie exactly on the curve $F(\mathbf{x}; \mathbf{u}) = 0$, and to simultaneously estimate the true positions $\{\bar{\mathbf{x}}_\alpha\}$ and the parameter $\mathbf{u}$ that maximize the statistical likelihood. If the noise is subject to an isotropic, independent, and identical Gaussian distribution, this reduces to the minimization...
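The least-squares minimization of equation (2) is easy to make concrete for the simplest case, a line $F(\mathbf{x}; \mathbf{u}) = u_1 x + u_2 y + u_3$ under $\|\mathbf{u}\| = 1$, where it reduces to a small eigenvalue problem. A minimal numpy sketch (function name ours):

```python
import numpy as np

def fit_line_ls(points):
    """Algebraic least squares (eq. (2)) for a line u1*x + u2*y + u3 = 0:
    minimize sum_alpha F(x_alpha; u)^2 subject to ||u|| = 1, i.e. take
    the unit eigenvector of M = sum xi xi^T for the smallest eigenvalue,
    where xi = (x, y, 1)."""
    pts = np.asarray(points, dtype=float)
    xi = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    M = xi.T @ xi
    _, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    return eigvecs[:, 0]             # eigenvector of the smallest eigenvalue

pts = np.array([[0.0, 1.0], [1.0, 3.0], [2.0, 5.0], [3.0, 7.0]])  # on y = 2x + 1
u = fit_line_ls(pts)                 # proportional to (2, -1, 1)
```

As the abstract notes, for curved models this algebraic solution carries a strong statistical bias under noise, which is exactly what the ML formulation that follows is designed to reduce.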