Results 11 – 20 of 31,596
Mean shift: A robust approach toward feature space analysis
In PAMI, 2002
"... A general nonparametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure, the mean shift. We prove for discrete data the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and thus its utility in detecting the modes of the density. The equivalence of the mean shift procedure to the Nadaraya–Watson estimator from kernel regression and the robust M ..."
Cited by 2395 (37 self)
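The recursive procedure this abstract describes fits in a few lines. A minimal 1-D sketch with a flat kernel (the function name, bandwidth, and data below are illustrative, not from the paper):

```python
def mean_shift(points, x, bandwidth, iters=50, tol=1e-6):
    """Move x to the mean of its neighbours until it stops moving.

    With a flat kernel this is exactly the mean shift step; the fixed
    point is a stationary point (mode) of the underlying density."""
    for _ in range(iters):
        neighbours = [p for p in points if abs(p - x) <= bandwidth]
        if not neighbours:
            break
        new_x = sum(neighbours) / len(neighbours)
        if abs(new_x - x) < tol:
            return new_x
        x = new_x
    return x

data = [0.1, -0.2, 0.0, 9.8, 10.1, 10.0]      # two clusters, modes near 0 and 10
print(mean_shift(data, 0.5, bandwidth=1.0))   # converges to the mode near 0
print(mean_shift(data, 9.5, bandwidth=1.0))   # converges to the mode near 10
```

Starting points in different basins of attraction converge to different modes, which is how mean shift delineates arbitrarily shaped clusters without assuming their number or shape.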
Robust wide baseline stereo from maximally stable extremal regions
In Proc. BMVC, 2002
"... The wide-baseline stereo problem, i.e. the problem of establishing correspondences between a pair of images taken from different viewpoints, is studied. A new set of image elements that are put into correspondence, the so-called extremal regions, is introduced. Extremal regions possess highly desirable properties: the set is closed under 1. continuous (and thus projective) transformation of image coordinates and 2. monotonic transformation of image intensities. An efficient (near linear complexity) and practically fast detection algorithm (near frame rate) is presented for an affinely-invariant stable ..."
Cited by 1016 (35 self)
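The extremal regions in question are the connected components of the image thresholded at each intensity level; the detector sweeps the threshold and keeps components whose area stays stable across many levels. A small sketch of one step of that sweep (the function name and the 4-connectivity choice are illustrative; the stability selection across thresholds is omitted):

```python
def extremal_regions(img, t):
    """Connected components of the level set {pixel <= t}: the extremal
    regions at one threshold of the sweep the MSER detector performs."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for sy in range(h):
        for sx in range(w):
            if seen[sy][sx] or img[sy][sx] > t:
                continue
            stack, comp = [(sy, sx)], []
            seen[sy][sx] = True
            while stack:  # flood fill one component (4-connectivity)
                y, x = stack.pop()
                comp.append((y, x))
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] and img[ny][nx] <= t:
                        seen[ny][nx] = True
                        stack.append((ny, nx))
            regions.append(comp)
    return regions
```

Because the components are defined purely by intensity ordering and connectivity, the resulting region set is unchanged by monotonic intensity transforms and continuous coordinate transforms, which is the closure property the abstract highlights.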
Robust Algorithms for Object Localization
International Journal of Computer Vision, 1998
"... Object localization using sensed data features and corresponding model features is a fundamental problem in machine vision. We reformulate object localization as a least squares problem: the optimal pose estimate minimizes the squared error (discrepancy) between the sensed and predicted data. The resulting problem is nonlinear, and previous attempts to estimate the optimal pose using local methods such as gradient descent suffer from local minima and, at times, return incorrect results. In this paper, we describe an exact, accurate and efficient algorithm based on resultants, linear algebra ..."
Cited by 5 (2 self)
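The least-squares pose objective the abstract sets up has a closed form in the 2-D rigid (rotation + translation) case. The sketch below solves that restricted case by the standard centroid-and-rotation (Procrustes) route, not by the paper's resultant-based method; all names are illustrative:

```python
import math

def fit_translation_rotation(model, sensed):
    """Closed-form 2-D least-squares pose between matched point sets:
    minimises sum ||R(theta) m_i + t - s_i||^2 over theta, t."""
    n = len(model)
    mx = sum(p[0] for p in model) / n
    my = sum(p[1] for p in model) / n
    sx = sum(p[0] for p in sensed) / n
    sy = sum(p[1] for p in sensed) / n
    # cross-covariance terms of the centred point sets
    a = b = 0.0
    for (px, py), (qx, qy) in zip(model, sensed):
        px -= mx; py -= my; qx -= sx; qy -= sy
        a += px * qx + py * qy   # cosine part
        b += px * qy - py * qx   # sine part
    theta = math.atan2(b, a)
    c, s = math.cos(theta), math.sin(theta)
    # translation maps the rotated model centroid onto the sensed centroid
    tx = sx - (c * mx - s * my)
    ty = sy - (s * mx + c * my)
    return theta, tx, ty
```

For the general nonlinear pose classes the paper targets, no such closed form exists, which is why local methods get trapped in local minima and the resultant-based algebraic approach pays off.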
CURE: An Efficient Clustering Algorithm for Large Data sets
In the Proceedings of the ACM SIGMOD Conference, 1998
"... Clustering, in data mining, is useful for discovering groups and identifying interesting distributions in the underlying data. Traditional clustering algorithms either favor clusters with spherical shapes and similar sizes, or are very fragile in the presence of outliers. We propose a new clustering algorithm called CURE that is more robust to outliers, and identifies clusters having non-spherical shapes and wide variances in size. CURE achieves this by representing each cluster by a certain fixed number of points that are generated by selecting well scattered points from the cluster ..."
Cited by 722 (5 self)
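The representative-point idea can be sketched directly: pick a few well-scattered points per cluster with a farthest-point heuristic, then shrink them toward the centroid, which is what damps the effect of outliers. A minimal 2-D sketch (the function name, shrink factor, and tie-breaking are illustrative, not CURE's exact choices):

```python
def cure_representatives(cluster, n_rep=4, shrink=0.2):
    """CURE-style cluster summary: well-scattered points shrunk toward the centroid."""
    cx = sum(p[0] for p in cluster) / len(cluster)
    cy = sum(p[1] for p in cluster) / len(cluster)
    reps = []
    for _ in range(min(n_rep, len(cluster))):
        def score(p):
            # farthest from the centroid first, then farthest from chosen reps
            if not reps:
                return (p[0] - cx) ** 2 + (p[1] - cy) ** 2
            return min((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 for q in reps)
        best = max((p for p in cluster if p not in reps), key=score)
        reps.append(best)
    # shrinking pulls the representatives inward, damping outliers
    return [(p[0] + shrink * (cx - p[0]), p[1] + shrink * (cy - p[1])) for p in reps]
```

Because several scattered representatives trace the cluster's extent, inter-cluster distances computed between representative sets can follow elongated, non-spherical shapes that a single centroid cannot.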
Robust Algorithms for Discrete Tomography
2012
"... Delft, the Netherlands. Preface: Since early in the 20th century, tomography has been of major interest, for it provides a means of non-invasive visualisation of the interior of objects such as the human body. Tomography methods concentrate on reconstructing objects from multiple projections that are obtained by sending, for example, X-rays through the object. Applications of these methods include radiology (CT, MRI and PET scans), geophysics and material science. The tomographic problems can be formulated as a system of linear equations. Unfortunately, these systems are not square, and thus not symmetric or positive (semi)definite, and in general rank deficient. In material science one is often presented with very small objects (like crystals or nanostructures) that consist of one or a small number of different materials, each with its own density. Scanning these small objects can cause damage to the structure, and thus one can only take a very limited number of projections. Fortunately, one can use prior knowledge about the object to arrive at a reconstruction of the original object. How to arrive at this reconstruction is studied by the field of discrete tomography (DT). With every kind of tomography, and thus also with DT, one is faced with noisy data. Because of ..."
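The non-square, rank-deficient linear systems described here are exactly what row-action methods such as the Kaczmarz iteration (the classical ART scheme in tomography) were designed for. A generic sketch of that iteration, not code from the thesis:

```python
def kaczmarz(A, b, x=None, sweeps=100):
    """Kaczmarz iteration: project the current estimate onto one row's
    hyperplane at a time. Works on non-square, rank-deficient systems;
    for consistent systems it converges to a solution of Ax = b."""
    n = len(A[0])
    x = [0.0] * n if x is None else list(x)
    for _ in range(sweeps):
        for ai, bi in zip(A, b):
            norm2 = sum(a * a for a in ai)
            if norm2 == 0:
                continue
            # residual of this row, scaled by the row norm
            r = (bi - sum(a * xi for a, xi in zip(ai, x))) / norm2
            x = [xi + r * a for xi, a in zip(x, ai)]
    return x
```

Each row of A corresponds to one projection ray, so the method touches one measurement at a time, which is why it scales to the huge sparse systems tomography produces.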
“GrabCut”: interactive foreground extraction using iterated graph cuts
ACM Trans. Graph., 2004
"... The problem of efficient, interactive foreground/background segmentation in still images is of great practical importance in image editing. Classical image segmentation tools use either texture (colour) information, e.g. Magic Wand, or edge (contrast) information, e.g. Intelligent Scissors. Recently ..."
"... of the iterative algorithm is used to simplify substantially the user interaction needed for a given quality of result. Thirdly, a robust algorithm for “border matting” has been developed to estimate simultaneously the alpha matte around an object boundary and the colours of foreground pixels. We show ..."
Cited by 1130 (36 self)
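The step that gets iterated in this style of segmentation is a min-cut on a graph whose terminal edges encode foreground/background likelihoods and whose inter-pixel edges encode smoothness. A self-contained Edmonds-Karp sketch of that one step (the node names and capacities are illustrative; colour-model fitting, which GrabCut alternates with the cut, is omitted). `cap` must list every node as a key, including the sink:

```python
from collections import deque

def max_flow_min_cut(cap, s, t):
    """Edmonds-Karp max-flow on a capacity dict-of-dicts; returns the
    source side of the min cut (the 'foreground' pixel set)."""
    flow = {u: dict(vs) for u, vs in cap.items()}  # residual capacities

    def bfs():
        parent = {s: None}
        q = deque([s])
        while q:
            u = q.popleft()
            for v, c in flow[u].items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    if v == t:
                        return parent
                    q.append(v)
        return None

    while True:
        parent = bfs()
        if parent is None:
            break
        path, v = [], t            # walk back from sink to source
        while v != s:
            u = parent[v]
            path.append((u, v))
            v = u
        aug = min(flow[u][v] for u, v in path)   # bottleneck capacity
        for u, v in path:
            flow[u][v] -= aug
            flow[v].setdefault(u, 0)
            flow[v][u] += aug                    # residual (reverse) edge
    # min cut: nodes still reachable from s in the residual graph
    reach, q = {s}, deque([s])
    while q:
        u = q.popleft()
        for v, c in flow[u].items():
            if c > 0 and v not in reach:
                reach.add(v)
                q.append(v)
    return reach
```

In the toy test below, pixel p1 has high source (foreground) affinity and p2 high sink affinity, so the cut assigns p1 to the source side despite the smoothness edge between them.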
Robust Algorithm for Impulse Noise Reduction
"... This paper presents a highly efficient two-phase scheme for removing impulse noise. In the first phase, a robust algorithm for noise detection is used to identify noisy pixels. In the second phase, the image is restored using a special noise-control algorithm. The efficiency of the algorithm is tested ..."
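The two-phase structure the abstract describes (detect, then restore) is commonly built on a local median: flag a pixel only when it deviates strongly from its neighbourhood median, then replace just the flagged pixels, leaving clean detail untouched. A generic sketch of that scheme (the threshold and 3x3 window are illustrative, not the paper's parameters):

```python
def remove_impulse_noise(img, threshold=60):
    """Two-phase impulse-noise removal: median-based detection, then
    selective median replacement of the flagged pixels only."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            # 3x3 window, clipped at the image border
            window = [img[ny][nx]
                      for ny in range(max(0, y - 1), min(h, y + 2))
                      for nx in range(max(0, x - 1), min(w, x + 2))]
            med = sorted(window)[len(window) // 2]
            if abs(img[y][x] - med) > threshold:  # phase 1: detection
                out[y][x] = med                   # phase 2: restoration
    return out
```

Restoring only the detected pixels is what distinguishes this from a plain median filter, which blurs every pixel whether noisy or not.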
Lucas-Kanade 20 Years On: A Unifying Framework: Part 3
International Journal of Computer Vision, 2002
"... Since the Lucas-Kanade algorithm was proposed in 1981, image alignment has become one of the most widely used techniques in computer vision. Applications range from optical flow, tracking, and layered motion, to mosaic construction, medical image registration, and face coding. Numerous algorithms hav ..."
"... appearance variation with the robust error functions described in Part 2 of this series. We first derive robust versions of the simultaneous and normalization algorithms. Since both of these algorithms are very inefficient, as in Part 2 we derive efficient approximations based on spatial coherence. We end ..."
Cited by 706 (30 self)
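The underlying Gauss-Newton alignment that all these variants extend fits in a few lines for a 1-D, translation-only warp; the robust error functions and efficient approximations discussed in the series layer on top of this core update. A sketch (linear interpolation, central-difference gradients, and the small-shift margin are illustrative choices):

```python
def lucas_kanade_1d(template, signal, p=0.0, iters=20):
    """Estimate the shift p minimising sum_x (signal(x + p) - template(x))^2
    by Gauss-Newton: the basic Lucas-Kanade update for a 1-D translation."""
    def sample(f, x):
        x = min(max(x, 0.0), len(f) - 1.001)  # clamp, then linear interpolation
        i = int(x)
        a = x - i
        return (1 - a) * f[i] + a * f[i + 1]

    for _ in range(iters):
        num = den = 0.0
        # interior margin keeps the samples in range for small shifts
        for x in range(1, len(template) - 4):
            err = sample(signal, x + p) - template[x]
            grad = sample(signal, x + p + 0.5) - sample(signal, x + p - 0.5)
            num += grad * err
            den += grad * grad
        if den == 0.0:
            break
        p -= num / den  # Gauss-Newton step
    return p
```

A robust version would replace the squared error with a robust rho-function, which turns the normal equations above into an iteratively reweighted sum; that is the extension Part 3 develops.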