
## On Spectral Clustering: Analysis and an algorithm (2001)

Venue: Advances in Neural Information Processing Systems

Citations: 1708 (13 self)

### Citations

1573 | Nonlinear component analysis as a kernel eigenvalue problem.
- Scholkopf, Smola, et al.
- 1998
Citation Context: …al. Spectral Algorithm I. (See text.) 5 Discussion There are some intriguing similarities between spectral clustering methods and Kernel PCA, which has been empirically observed to perform clustering [7, 2]. The main difference between the first steps of our algorithm and Kernel PCA with a Gaussian kernel is the normalization of A (to form L) and X. These normalizations do improve the performance of the a…
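The normalization of A to form L mentioned in this snippet can be sketched as follows. This is a minimal illustration assuming the standard symmetric normalization L = D^{-1/2} A D^{-1/2} with D the diagonal degree matrix; the function name and toy matrix are purely illustrative.

```python
import numpy as np

def normalize_affinity(A):
    """Return L = D^{-1/2} A D^{-1/2} for an affinity matrix A.

    Assumes every row sum (degree) is strictly positive.
    """
    d = A.sum(axis=1)                         # degrees (row sums)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    return A * np.outer(d_inv_sqrt, d_inv_sqrt)

# Tiny example: two points, affinity 0.5 to each other, 1.0 to themselves.
A = np.array([[1.0, 0.5],
              [0.5, 1.0]])
L = normalize_affinity(A)
```

The symmetric normalization keeps L symmetric (unlike the row-stochastic D^{-1} A), which is what lets the eigenvector-based analysis in the paper go through.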

907 | Matrix perturbation theory
- Stewart, Sun
- 1990
Citation Context: …ws of Y to cluster similarly to the rows of Ŷ? Specifically, when will the eigenvectors of L, which we now view as a perturbed version of L̂, be "close" to those of L̂? Matrix perturbation theory [10] indicates that the stability of the eigenvectors of a matrix is determined by the eigengap. More precisely, the subspace spanned by L̂'s first 3 eigenvectors will be stable to small changes to L̂ if …
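The eigengap invoked here can be computed directly. The sketch below (an illustration of the general perturbation idea, not the paper's specific bound) measures the gap δ = λ_k − λ_(k+1) between the k-th and (k+1)-th largest eigenvalues; a large gap means the top-k eigenspace moves little under small symmetric perturbations.

```python
import numpy as np

def eigengap(L, k):
    """Return lambda_k - lambda_{k+1} for a symmetric matrix L,
    with eigenvalues sorted in descending order."""
    w = np.sort(np.linalg.eigvalsh(L))[::-1]
    return w[k - 1] - w[k]

# A matrix with two well-separated blocks: eigenvalues 1.9, 1.0, 0.1,
# so the gap after k = 2 is large (0.9) and the top-2 subspace is stable.
L = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
delta = eigengap(L, 2)
```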

404 | Contour and texture analysis for image segmentation.
- Malik, Belongie, et al.
- 2001
Citation Context: …g. Here, one uses the top eigenvectors of a matrix derived from the distance between points. Such algorithms have been successfully used in many applications including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these …

380 | Segmentation using eigenvectors: A unifying view.
- Weiss
- 1999
Citation Context: …including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these algorithms, which we briefly review below, has tended to focus on simplified algorithms that only use one eigenvector at a time. One line of analysis makes th…

141 | Learning segmentation by random walks.
- Meila, Shi
- 2001
Citation Context: …perimentally it has been observed that using more eigenvectors and directly computing a k-way partitioning is better (e.g. [5, 1]). Here, we build upon the recent work of Weiss [11] and Meila and Shi [6], who analyzed algorithms that use k eigenvectors simultaneously in simple settings. We propose a particular manner to use the k eigenvectors simultaneously, and give conditions under which the algori…
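The "k eigenvectors simultaneously" recipe can be sketched as follows. This is an assumed reconstruction of the standard multi-eigenvector construction (normalize the affinity matrix, stack the top k eigenvectors as columns, renormalize each row to unit length), not a verbatim transcription of the paper's Algorithm I; the toy affinity matrix is illustrative.

```python
import numpy as np

def spectral_embed(A, k):
    """Embed points via the top-k eigenvectors of L = D^{-1/2} A D^{-1/2},
    with rows renormalized to unit length."""
    d = A.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    L = A * np.outer(d_inv_sqrt, d_inv_sqrt)
    w, V = np.linalg.eigh(L)                          # eigenvalues ascending
    X = V[:, -k:]                                     # top k eigenvectors
    Y = X / np.linalg.norm(X, axis=1, keepdims=True)  # unit-length rows
    return Y

# Two obvious clusters: points {0,1} tightly connected, {2,3} tightly connected.
A = np.array([[1.0, 0.9, 0.1, 0.1],
              [0.9, 1.0, 0.1, 0.1],
              [0.1, 0.1, 1.0, 0.9],
              [0.1, 0.1, 0.9, 1.0]])
Y = spectral_embed(A, k=2)
# Rows of Y from the same cluster are nearly identical; rows from different
# clusters are nearly orthogonal, so a simple k-means on the rows separates them.
```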

123 | Spectral Graph Theory. Number 92
- Chung
- 1997
Citation Context: …, in which the second eigenvector of a graph's Laplacian is used to define a semi-optimal cut. Here, the eigenvector is seen as solving a relaxation of an NP-hard discrete graph partitioning problem [3], and it can be shown that cuts based on the second eigenvector give a guaranteed approximation to the optimal cut [9, 3]. This analysis can be extended to clustering by building a weighted graph in w…
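The second-eigenvector cut described in this snippet can be sketched concretely. Below is a minimal illustration of standard spectral bisection (assumed, not this paper's own algorithm): compute the eigenvector of the second-smallest eigenvalue of the unnormalized Laplacian D − A (the Fiedler vector) and split the nodes by its sign.

```python
import numpy as np

def fiedler_cut(A):
    """Two-way cut from the sign of the Fiedler vector of the graph
    Laplacian D - A (A is a symmetric weighted adjacency matrix)."""
    D = np.diag(A.sum(axis=1))
    Lap = D - A                     # unnormalized graph Laplacian
    w, V = np.linalg.eigh(Lap)      # eigenvalues ascending; w[0] ~ 0
    fiedler = V[:, 1]               # eigenvector of 2nd-smallest eigenvalue
    return fiedler >= 0             # boolean partition of the nodes

# Path-like graph: strong edges 0-1 and 2-3, weak bridge 1-2.
A = np.array([[0.0, 1.0, 0.0, 0.0],
              [1.0, 0.0, 0.1, 0.0],
              [0.0, 0.1, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
part = fiedler_cut(A)
# The sign change falls across the weak bridge, separating {0,1} from {2,3}.
```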

51 | Feature grouping by relocalisation of eigenvectors of the proximity matrix.
- Scott, Longuet-Higgins
- 1990
Citation Context: …, they form tight clusters (Figure 1h) from which our method obtains the good clustering shown in Figure 1e. We note that the clusters in Figure 1h lie at 90° to each other relative to the origin (cf. [8]). 1 Readers familiar with spectral graph theory [3] may be more familiar with the Laplacian I − L. But as replacing L with I − L would complicate our later discussion, and only changes the eigenvalues (f…
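The footnote's point that swapping L for I − L leaves the eigenvectors unchanged (each eigenvalue λ becomes 1 − λ) is easy to verify numerically; the small symmetric matrix below is arbitrary, chosen only to have distinct eigenvalues.

```python
import numpy as np

# A generic symmetric 2x2 matrix standing in for L.
L = np.array([[0.8, 0.3],
              [0.3, 0.5]])

w_L, V_L = np.linalg.eigh(L)              # eigenvalues of L, ascending
w_M, V_M = np.linalg.eigh(np.eye(2) - L)  # eigenvalues of I - L, ascending

# Since lambda -> 1 - lambda reverses the ordering, the eigenvector columns
# match in reversed order (up to sign), and the eigenvalue sets agree.
```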

28 | On clusterings: good, bad and spectral.
- Kannan, Vempala, et al.
- 2000
Citation Context: …the performance of the algorithm, but it is also straightforward to extend our analysis to prove conditions under which Kernel PCA will indeed give clustering. While different in detail, Kannan et al. [4] give an analysis of spectral clustering that also makes use of matrix perturbation theory, for the case of an affinity matrix with row sums equal to one. They also present a clustering algorithm based o…

12 | Spectral partitioning: the more eigenvectors
- Alpert, Kahng, et al.
- 1999
Citation Context: …g. Here, one uses the top eigenvectors of a matrix derived from the distance between points. Such algorithms have been successfully used in many applications including computer vision and VLSI design [5, 1]. But despite their empirical successes, different authors still disagree on exactly which eigenvectors to use and how to derive clusters from them (see [11] for a review). Also, the analysis of these …

9 | Spectral partitioning works: planar graphs and element meshes
- Spielman, Teng
- 1996
Citation Context: … seen as solving a relaxation of an NP-hard discrete graph partitioning problem [3], and it can be shown that cuts based on the second eigenvector give a guaranteed approximation to the optimal cut [9, 3]. This analysis can be extended to clustering by building a weighted graph in which nodes correspond to datapoints and edges are related to the distance between the points. Since the majority of analy…

8 | Spectral kernel methods for clustering
- Christianini, Shawe-Taylor, et al.
- 2002
Citation Context: …al. Spectral Algorithm I. (See text.) 5 Discussion There are some intriguing similarities between spectral clustering methods and Kernel PCA, which has been empirically observed to perform clustering [7, 2]. The main difference between the first steps of our algorithm and Kernel PCA with a Gaussian kernel is the normalization of A (to form L) and X. These normalizations do improve the performance of the a…