Results 1–10 of 27
A generic camera model and calibration method for conventional, wide-angle, and fish-eye lenses
 IEEE Trans. Pattern Analysis and Machine Intelligence, 2006
Cited by 71 (2 self)
Fisheye lenses are convenient in applications where a very wide angle of view is needed, but their use for measurement purposes has been limited by the lack of an accurate, generic, and easy-to-use calibration procedure. We hence propose a generic camera model, which is suitable for fish-eye lens cameras as well as for conventional and wide-angle lens cameras, and a calibration method for estimating the parameters of the model. The achieved level of calibration accuracy is comparable to the previously reported state of the art.
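As an illustration of the kind of generic radially symmetric model this abstract describes, here is a minimal Python sketch that projects a 3D point through an odd-polynomial radial projection r(θ) = k1·θ + k2·θ³; the coefficients, focal length, and principal point are purely illustrative values, not the paper's calibrated parameters:

```python
import numpy as np

def project_generic(X, k=(1.0, 0.05), f=500.0, c=(320.0, 240.0)):
    """Project a 3D point with a generic radially symmetric model:
    r(theta) = k1*theta + k2*theta^3, an odd polynomial in the angle
    between the ray and the optical axis. All parameter values here
    are illustrative, not calibrated ones."""
    x, y, z = X
    theta = np.arctan2(np.hypot(x, y), z)   # incidence angle from the optical axis
    phi = np.arctan2(y, x)                  # azimuth in the image plane
    r = k[0] * theta + k[1] * theta ** 3    # generic radial projection function
    return (c[0] + f * r * np.cos(phi),
            c[1] + f * r * np.sin(phi))

u, v = project_generic((0.0, 0.0, 1.0))     # a point on the optical axis
# projects to the principal point, since theta = 0 implies r = 0
```

Because the radial function is a polynomial in the incidence angle rather than a perspective-plus-distortion composition, the same code form covers conventional, wide-angle, and fish-eye optics by changing only the coefficients.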
Parameter-free radial distortion correction with centre of distortion estimation
 In ICCV, 2005
Cited by 58 (0 self)
We propose a method of simultaneously calibrating the radial distortion function of a camera and the other internal calibration parameters. The method relies on the use of a planar (or, alternatively, non-planar) calibration grid which is captured in several images. In this way, the determination of the radial distortion is an easy add-on to the popular calibration method proposed by Zhang [24]. The method is entirely non-iterative and, hence, is extremely rapid and immune to the problem of local minima. Our method determines the radial distortion in a parameter-free way, not relying on any particular radial distortion model. This makes it applicable to a large range of cameras, from narrow-angle to fisheye lenses. The method also computes the center of radial distortion, which, we argue, is important in obtaining optimal results. Experiments show that this point may be significantly displaced from the center of the image or the principal point of the camera. Index Terms: radial distortion, camera calibration, fundamental matrix.
A minimal solution to the autocalibration of radial distortion
 2007
Cited by 39 (12 self)
Epipolar geometry and relative camera pose computation are examples of tasks which can be formulated as minimal problems and solved from a minimal number of image points. Finding the solution leads to solving systems of algebraic equations. Often, these systems are not trivial, and therefore special algorithms have to be designed to achieve numerical robustness and computational efficiency. In this paper we provide a solution to the problem of estimating radial distortion and epipolar geometry from eight correspondences in two images. Unlike previous algorithms, which were able to solve the problem from nine correspondences only, we enforce the determinant of the fundamental matrix to be zero. This leads to a system of eight quadratic equations and one cubic equation in nine variables. We simplify the system by eliminating six of these variables. Then, we solve the system by finding eigenvectors of an action matrix of a suitably chosen polynomial. We show how to construct the action matrix without computing a complete Gröbner basis, which provides an efficient and robust solver. The quality of the solver is demonstrated on synthetic and real data.
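The singularity constraint det(F) = 0 that this abstract builds into its solver is, in simpler estimation pipelines, commonly imposed after the fact by projecting an estimate onto the rank-2 manifold via SVD. A small sketch of that standard post-hoc step (not the paper's Gröbner-basis method):

```python
import numpy as np

def enforce_rank2(F):
    """Project a 3x3 matrix onto the set of singular (rank <= 2)
    matrices by zeroing its smallest singular value. This is the
    classical way to impose det(F) = 0 on a fundamental-matrix
    estimate after a linear solve."""
    U, s, Vt = np.linalg.svd(F)
    s[2] = 0.0                        # drop the smallest singular value
    return U @ np.diag(s) @ Vt

# illustrative input: a random full-rank matrix standing in for a
# linearly estimated fundamental matrix
F = np.random.default_rng(0).standard_normal((3, 3))
F2 = enforce_rank2(F)                 # det(F2) is (numerically) zero
```

The contrast with the paper's approach is that enforcing the determinant constraint inside the minimal solver, rather than projecting afterwards, is what lets it work from eight correspondences instead of nine.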
A Theory of Multi-Layer Flat Refractive Geometry
Cited by 13 (3 self)
Flat refractive geometry corresponds to a perspective camera looking through one or more parallel flat refractive media. We show that the underlying geometry of rays corresponds to an axial camera. This realization, while missing from previous works, leads us to develop a general theory for calibrating such systems using 2D-3D correspondences. The pose of the 3D points is assumed to be unknown and is also recovered. Calibration can be done even using a single image of a plane. We show that the unknown orientation of the refracting layers corresponds to the underlying axis and can be obtained independently of the number of layers, their distances from the camera, and their refractive indices. Interestingly, the axis estimation can be mapped to the classical essential matrix computation, and the 5-point algorithm [15] can be used. After computing the axis, the thicknesses of the layers can be obtained linearly when the refractive indices are known, and we derive analytical solutions when they are unknown. We also derive the analytical forward projection (AFP) equations to compute the projection of a 3D point via multiple flat refractions, which allows non-linear refinement by minimizing the reprojection error. For two refractions, the AFP is either a 4th- or a 12th-degree equation, depending on the refractive indices. We analyze ambiguities due to a small field of view and stability under noise, and show how a two-layer system can be well approximated as a single-layer system. Real experiments using a water tank validate our theory.
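The flat refraction analyzed in this abstract is governed by Snell's law applied at each parallel interface. A minimal sketch of tracing a ray through an assumed air-glass-water stack (indices and the incoming angle are illustrative, and total internal reflection is not handled):

```python
import numpy as np

def refract(d, n, mu1, mu2):
    """Refract unit direction d at a planar interface with unit normal n
    (pointing against the incoming ray) using the vector form of
    Snell's law; mu1, mu2 are the refractive indices on each side.
    Total internal reflection is not handled in this sketch."""
    r = mu1 / mu2
    c = -np.dot(n, d)
    disc = 1.0 - r * r * (1.0 - c * c)
    return r * d + (r * c - np.sqrt(disc)) * n

# Air -> glass -> water stack; because every interface is parallel
# (shared normal n), the quantity mu * sin(theta) is invariant across
# the whole stack, which is why the layers form an axial camera.
n = np.array([0.0, 0.0, -1.0])
d = np.array([np.sin(0.5), 0.0, np.cos(0.5)])   # ray tilted 0.5 rad off axis
d_glass = refract(d, n, 1.0, 1.5)               # air (1.0) into glass (1.5)
d_water = refract(d_glass, n, 1.5, 1.33)        # glass into water (1.33)
```

The invariance of mu·sin(θ) across the stack means the exit ray's direction depends only on the first and last media, while the layer thicknesses shift the ray laterally, which is exactly the structure the calibration theory above exploits.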
Single Image Calibration of Multi-Axial Imaging Systems
Cited by 6 (1 self)
Imaging systems consisting of a camera looking at multiple spherical mirrors (reflection) or multiple refractive spheres (refraction) have been used for wide-angle imaging applications. We describe such setups as multi-axial imaging systems, since a single sphere results in an axial system. Assuming an internally calibrated camera, calibration of such multi-axial systems involves estimating the sphere radii and locations in the camera coordinate system. However, previous calibration approaches require manual intervention or constrained setups. We present a fully automatic approach using a single photo of a 2D calibration grid. The pose of the calibration grid is assumed to be unknown and is also recovered. Our approach can handle unconstrained setups, where the mirrors/refractive balls can be arranged in any fashion, not necessarily on a grid. The axial nature of the rays allows us to compute the axis of each sphere separately. We then show that, by choosing rays from two or more spheres, the unknown pose of the calibration grid can be obtained linearly and independently of the sphere radii and locations. Knowing the pose, we derive analytical solutions for obtaining the sphere radius and location. This leads to an interesting result: 6-DOF pose estimation of a multi-axial camera can be done without knowledge of the full calibration. Simulations and real experiments demonstrate the applicability of our algorithm.
Plane-based self-calibration of radial distortion
Cited by 5 (1 self)
We present an algorithm for plane-based self-calibration of cameras with radially symmetric distortions, given a set of sparse feature matches in at least two views. The projection function of such cameras can be seen as a projection with a pinhole camera, followed by a non-parametric displacement of the image points in the direction of the distortion center. The displacement is a function of the points' distance to the center; thus, the generated distortion is radially symmetric. Regular cameras and fisheyes, as well as the most popular central catadioptric devices, can be described by such a model. Our approach recovers a distortion function consistent with all the views, or estimates one for each view if they are taken by different cameras. We consider a least-squares algebraic solution for computing the homography between two views that is valid for rectified (undistorted) point correspondences. We observe that the terms of the function are bilinear in the unknowns of the homography and the distortion coefficient associated with each point. Our contribution is to approximate this non-convex problem by a convex one. To do so, we replace the bilinear terms by a set of new variables and obtain a linear least-squares problem. We show that, like the distortion coefficients, these variables are subject to monotonicity constraints; thus, the approximate problem is a convex quadratic program. We show that solving it is sufficient for accurately estimating the distortion parameters. We validate our approach on simulated data as well as on fisheye and catadioptric cameras. We also compare our solution to three state-of-the-art algorithms and show similar performance.
Efficient generic calibration method for general cameras with single centre of projection
 In: Proceedings of the IEEE International Conference on Computer Vision, Rio de Janeiro, 2007
Cited by 3 (1 self)
Generic camera calibration is a non-parametric calibration technique that is applicable to any type of vision sensor. However, the standard generic calibration method was developed with the goal of generality, and it is therefore suboptimal for the common case of cameras with a single centre of projection (e.g. pinhole, fisheye, hyperboloidal catadioptric). This paper proposes novel improvements to the standard generic calibration method for central cameras that reduce its complexity and improve its accuracy and robustness. The improvements are achieved by taking advantage of the geometric constraints resulting from a single centre of projection. Input data for the algorithm are acquired using active grids, the performance of which is characterised. A new linear estimation stage for the generic algorithm is proposed, incorporating classical pinhole calibration techniques, and it is shown to be significantly more accurate than the linear estimation stage of the standard method. A linear method for pose estimation is also proposed and evaluated against the existing polynomial method. Distortion-correction and motion-reconstruction experiments are conducted with real data from a hyperboloidal catadioptric sensor for both the standard and proposed methods. Results show the accuracy and robustness of the proposed method to be superior to those of the standard method.
Multi-View Matching Tensors from Lines for General Camera Models
Cited by 2 (0 self)
General camera models relax the constraint of central projection and characterize cameras as mappings between each pixel and the corresponding projection ray. This makes it possible to describe most camera types, including classical pinhole cameras, cameras with various optical distortions, catadioptric cameras, and other acquisition devices. We deal with the structure-from-motion problem for such general models. We first consider a hierarchy of general cameras, first introduced in [28], where the cameras are described according to the number of points and lines that have a non-empty intersection with all the projection rays. Then we propose a study of the multi-view geometry of such cameras and a new formulation of multi-view matching tensors for projection rays crossing the same 3D line, the counterpart of the fundamental matrices and the multi-focal tensors of standard perspective cameras. We also delineate a method to estimate such tensors and recover the motion between the views.
Radial Distortion Self-Calibration
Cited by 2 (0 self)
In cameras with radial distortion, straight lines in space are in general mapped to curves in the image. Although the epipolar geometry also gets distorted, there is a set of special epipolar lines that remain straight, namely those that go through the distortion center. By finding these straight epipolar lines in camera pairs, we can obtain constraints on the distortion center(s) without any calibration object or plumb-line assumptions about the scene. Although this holds for all radial distortion models, we conceptually prove this idea using the division distortion model and the radial fundamental matrix, which allow for a very simple closed-form solution for the distortion center from two views (same distortion) or three views (different distortions). The non-iterative nature of our approach makes it immune to local minima and allows finding the distortion center also for cropped images or those where no good prior exists. Besides this, we give comprehensive relations between different undistortion models and discuss their advantages and drawbacks.
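The one-parameter division model referred to above moves a distorted point along the ray through the distortion center by a radius-dependent factor, p_u = c + (p_d − c) / (1 + λ r²) with r = |p_d − c|. Since this only rescales the radius, lines through the center stay straight under (un)distortion, which is the property the approach exploits. A sketch with an illustrative λ and center:

```python
import numpy as np

def undistort_division(p, lam, center=(0.0, 0.0)):
    """Undistort an image point with the one-parameter division model:
    p_u = c + (p_d - c) / (1 + lam * r^2), where r = |p_d - c|.
    The value of lam and the center used below are illustrative."""
    c = np.asarray(center, float)
    q = np.asarray(p, float) - c
    r2 = q @ q                         # squared distance to the center
    return c + q / (1.0 + lam * r2)    # radius-only rescaling

# A distorted point and its undistorted image lie on the same ray
# through the distortion center, so epipolar lines through the
# center remain straight.
p = np.array([0.4, 0.2])
u = undistort_division(p, lam=-0.3)
```

With the center at the origin, 1 + λr² = 1 − 0.3·0.2 = 0.94 here, so the point is simply pushed outward along its ray; this radial-only action is also what makes the division model compatible with a linear "radial fundamental matrix" formulation.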
Unknown radial distortion centers in multiple view geometry problems
 In Proceedings of the Asian Conference on Computer Vision, 2012
Cited by 2 (1 self)
The radial undistortion model proposed by Fitzgibbon and the radial fundamental matrix were early steps to extend classical epipolar geometry to distorted cameras. Later, minimal solvers were proposed to find the relative pose and radial distortion, given point correspondences between images. However, a big drawback of all these approaches is that they require the distortion center to be exactly known. In this paper we show how the distortion center can be absorbed into a new radial fundamental matrix. This new formulation is much more practical in reality, as it also allows for digital zoom, cropped images, and camera-lens systems where the distortion center does not exactly coincide with the image center. In particular, we start from the setting where only one of the two images contains radial distortion, analyze the structure of the particular radial fundamental matrix, and show that the technique also generalizes to other linear multi-view relationships like the trifocal tensor and the homography. For the new radial fundamental matrix we propose different estimation algorithms from 9, 10, and 11 points. We show how to extract the epipoles, and we prove the practical applicability on several epipolar-geometry image pairs with strong distortion that, to the best of our knowledge, no other existing algorithm can handle properly.