
CiteSeerX

Results 1 - 10 of 409,782

Demystifying Information-Theoretic Clustering

by Greg Ver Steeg, Aram Galstyan, Fei Sha - Proceedings of The 31st International Conference on Machine Learning; arXiv:1310.4210 , 2014
"... We propose a novel method for clustering data which is grounded in information-theoretic principles and requires no parametric assumptions. Previous attempts to use information theory to define clusters in an assumption-free way are based on maximizing mutual information between data and cluster labels ..."
Abstract - Cited by 3 (1 self)
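The shared objective in this line of work is the mutual information I(X; C) between data and cluster labels. For discrete variables it admits a simple plug-in estimate, sketched below (an illustrative sketch only: the function name is mine, and real-valued data would first have to be discretized):

```python
import math
from collections import Counter

def mutual_information(xs, cs):
    # Plug-in estimate of I(X; C) in bits from paired discrete samples:
    # sum over observed pairs of p(x,c) * log2( p(x,c) / (p(x) p(c)) ).
    n = len(xs)
    px, pc, pxc = Counter(xs), Counter(cs), Counter(zip(xs, cs))
    return sum((nxc / n) * math.log2((nxc / n) / ((px[x] / n) * (pc[c] / n)))
               for (x, c), nxc in pxc.items())

labels = ['a', 'a', 'b', 'b']
print(mutual_information(labels, labels))        # 1.0 (equals H(X) for a perfect clustering)
print(mutual_information(labels, [0, 1, 0, 1]))  # 0.0 (independent assignment)
```

A clustering that maximizes this quantity preserves as much information about the data as a hard assignment can.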

Robust Information-Theoretic Clustering

by Christos Faloutsos, Jia-yu Pan, Claudia Plant - In: KDD , 2006
"... How do we find a natural clustering of a real world point set, which contains an unknown number of clusters with different shapes, and which may be contaminated by noise? Most clustering algorithms were designed with certain assumptions (Gaussianity), they often require the user to give input parameters, and they are sensitive to noise. In this paper, we propose a robust framework for determining a natural clustering of a given data set, based on the minimum description length (MDL) principle. The proposed framework, Robust Information-theoretic Clustering (RIC), is orthogonal to any known ..."
Abstract - Cited by 10 (4 self)

Robust Information-theoretic Clustering

by Christian Böhm, Christos Faloutsos, Jia-yu Pan, Claudia Plant
"... How do we find a natural clustering of a real world point set, which contains an unknown number of clusters with different shapes, and which may be contaminated by noise? Most clustering algorithms were designed with certain assumptions (Gaussianity), they often require the user to give input parameters, and they are sensitive to noise. In this paper, we propose a robust framework for determining a natural clustering of a given data set, based on the minimum description length (MDL) principle. The proposed framework, Robust Information-theoretic Clustering (RIC), is orthogonal to any known ..."
Abstract
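Both RIC entries above rest on the minimum description length (MDL) principle: a model (here, a clustering) is good if it yields a short two-part code, model cost plus data-given-model cost. A toy illustration of the two-part code for binary strings (my own example, not the RIC algorithm itself):

```python
import math

def entropy_bits(p):
    # Binary entropy H(p) in bits, with the 0*log(0) = 0 convention.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def two_part_code_bits(bits):
    # MDL two-part code length: ~log2(n+1) bits to state the estimated
    # parameter (the count of ones identifies it), plus n*H(p_hat) bits
    # to encode the data under that model.
    n = len(bits)
    p_hat = sum(bits) / n
    return math.log2(n + 1) + n * entropy_bits(p_hat)

biased = [1] * 90 + [0] * 10   # highly regular 100-bit string
mixed  = [0, 1] * 50           # maximally mixed 100-bit string
# Under MDL, only the regular string beats the 100-bit raw encoding.
print(two_part_code_bits(biased) < 100)  # True  (~53.6 bits)
print(two_part_code_bits(mixed) < 100)   # False (~106.7 bits)
```

RIC applies the same trade-off to clusterings: a candidate cluster structure wins if model bits plus data bits drop below those of the alternatives, which is what makes the method parameter-free and noise-robust.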

ITCH: information-theoretic cluster hierarchies

by Frank Fiedler, Annahita Oswald, Claudia Plant, Bianca Wackersreuther, Peter Wackersreuther - In Proceedings of the European conference on machine learning and knowledge discovery in databases (ECML PKDD , 2010
"... Hierarchical clustering methods are widely used in various scientific domains such as molecular biology, medicine, economy, etc. Despite the maturity of the research field of hierarchical clustering, we have identified the following four goals which are not yet fully satisfied by previous ... parameter settings. With ITCH, we propose a novel clustering method that is built on a hierarchical variant of the information-theoretic principle of Minimum Description Length (MDL), referred to as hMDL. Interpreting the hierarchical cluster structure as a statistical model of the data set, it can ..."
Abstract - Cited by 2 (0 self)

Information Theoretic Clustering using Minimum Spanning Trees

by Andreas C. Müller, Sebastian Nowozin, Christoph H. Lampert
"... In this work we propose a new information-theoretic clustering algorithm that infers cluster memberships by direct optimization of a non-parametric mutual information estimate between data distribution and cluster assignment. Although the optimization objective has a solid theoretical foundation ..."
Abstract - Cited by 5 (0 self)
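For intuition on MST-based clustering, a heavily simplified sketch: build the minimum spanning tree over pairwise distances, then delete the k-1 heaviest MST edges so the connected components become clusters. Note this is classic single-linkage clustering, not the authors' mutual-information objective, and the function below is my own illustration (1-D points for brevity):

```python
def mst_clusters(points, k):
    # Kruskal's algorithm over all pairwise 1-D distances, then removal
    # of the k-1 heaviest MST edges; components become cluster labels.
    n = len(points)
    parent = list(range(n))

    def find(i):
        # Union-find root lookup with path halving.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted((abs(points[i] - points[j]), i, j)
                   for i in range(n) for j in range(i + 1, n))
    mst = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:              # edge joins two components: keep it
            parent[ri] = rj
            mst.append((w, i, j))

    # Rebuild components using every MST edge except the k-1 heaviest.
    parent = list(range(n))
    for w, i, j in sorted(mst)[:len(mst) - (k - 1)]:
        parent[find(i)] = find(j)
    labels = {}
    return [labels.setdefault(find(i), len(labels)) for i in range(n)]

print(mst_clusters([0.0, 0.1, 0.2, 5.0, 5.1], 2))  # [0, 0, 0, 1, 1]
```

The MST is attractive here because it can be computed efficiently and, as the paper exploits, supports non-parametric estimates over the data's neighborhood structure.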

Information Theoretical Clustering via Semidefinite Programming

by Meihong Wang, Fei Sha
"... We propose techniques of convex optimization for information theoretical clustering. The clustering objective is to maximize the mutual information between data points and cluster assignments. We formulate this problem first as an instance of max k-cut on weighted graphs. We then apply the technique ..."
Abstract - Cited by 1 (1 self)

Efficient Information Theoretic Clustering on Discrete Lattices

by Christian Bauckhage, Kristian Kersting
"... We consider the problem of clustering data that reside on discrete, low dimensional lattices. Canonical examples for this setting are found in image segmentation and key point extraction. Our solution is based on a recent approach to information theoretic clustering where clusters result from an ite ..."
Abstract - Cited by 1 (1 self)

Information Theoretic Clustering of Sparse Co-Occurrence Data

by unknown authors
"... A novel approach to clustering co-occurrence data poses it as an optimization problem in information theory which minimizes the resulting loss in mutual information. A divisive clustering algorithm that monotonically reduces this loss function was recently proposed. In this paper we show that sparse ... local minima. Finally, we combine these solutions to get a robust algorithm that is computationally efficient. We present experimental results to show that the proposed method is effective in clustering document collections and outperforms previous information-theoretic clustering approaches."
Abstract

Sail: summation-based incremental learning for information-theoretic clustering

by Junjie Wu, Hui Xiong, Jian Chen - In KDD , 2008
"... Information-theoretic clustering aims to exploit information theoretic measures as the clustering criteria. A common practice on this topic is so-called INFO-K-means, which performs K-means clustering with the KL-divergence as the proximity function. While expert efforts on INFO-K-means have shown p ..."
Abstract - Cited by 4 (0 self)
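The INFO-K-means recipe named in the snippet, K-means over probability vectors with KL divergence as the proximity function, can be sketched as follows (a toy version with naive initialization, not the paper's summation-based incremental variant):

```python
import math

def kl(p, q):
    # KL divergence D(p || q) in bits for discrete distributions.
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def info_kmeans(dists, k, iters=20):
    # Toy INFO-K-means: assign each distribution to the centroid with the
    # smallest KL divergence, then recompute centroids as per-cluster means.
    # Assumption: the first k rows are distinct (naive initialization).
    centroids = dists[:k]
    assign = [0] * len(dists)
    for _ in range(iters):
        assign = [min(range(k), key=lambda c: kl(p, centroids[c]))
                  for p in dists]
        for c in range(k):
            members = [p for p, a in zip(dists, assign) if a == c]
            if members:  # keep the old centroid if a cluster empties out
                centroids[c] = [sum(col) / len(members)
                                for col in zip(*members)]
    return assign

docs = [[0.8, 0.1, 0.1], [0.7, 0.2, 0.1],   # word distributions near topic 1
        [0.1, 0.1, 0.8], [0.1, 0.2, 0.7]]   # word distributions near topic 3
print(info_kmeans(docs, 2))  # [0, 0, 1, 1]
```

Using the mean distribution as the centroid is what makes this the KL analogue of ordinary K-means; the paper's contribution is making the update incremental and scalable, which this sketch does not attempt.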

Information theoretic clustering of sparse co-occurrence data

by Inderjit S. Dhillon, Yuqiang Guan - In Proceedings of the Third IEEE International Conference on Data Mining (ICDM-03), 2003
Abstract - Cited by 34 (1 self)
Abstract not found

Developed at and hosted by The College of Information Sciences and Technology

© 2007-2019 The Pennsylvania State University