Results 1–10 of 169
CHAMELEON: A Hierarchical Clustering Algorithm Using Dynamic Modeling, 1999
"... Clustering in data mining is a discovery process that groups a set of data such that the intracluster similarity is maximized and the intercluster similarity is minimized. Existing clustering algorithms, such as Kmeans, PAM, CLARANS, DBSCAN, CURE, and ROCK are designed to find clusters that fit s ..."
Abstract

Cited by 268 (19 self)
 Add to MetaCart
Clustering in data mining is a discovery process that groups a set of data such that the intra-cluster similarity is maximized and the inter-cluster similarity is minimized. Existing clustering algorithms, such as K-means, PAM, CLARANS, DBSCAN, CURE, and ROCK, are designed to find clusters that fit some static models. These algorithms can break down if the choice of parameters in the static model is incorrect with respect to the data set being clustered, or if the model is not adequate to capture the characteristics of clusters. Furthermore, most of these algorithms break down when the data consists of clusters that are of diverse shapes, densities, and sizes. In this paper, we present a novel hierarchical clustering algorithm called CHAMELEON that measures the similarity of two clusters based on a dynamic model. In the clustering process, two clusters are merged only if the interconnectivity and closeness (proximity) between the two clusters are high relative to the internal intercon...
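The dynamic merge test described above can be sketched in a few lines. This is a simplified illustration, not CHAMELEON itself: internal connectivity of a cluster is approximated here by its total internal edge weight (the paper uses a min-bisection weight), and the threshold is a user parameter.

```python
# Hypothetical sketch of a CHAMELEON-style merge test: two clusters are
# merged only when their mutual connectivity is high *relative to* the
# clusters' internal connectivity, rather than in absolute terms.

def edge_weight_between(edges, c1, c2):
    """Sum of weights of edges with one endpoint in c1 and the other in c2."""
    a, b = set(c1), set(c2)
    return sum(w for u, v, w in edges
               if (u in a and v in b) or (u in b and v in a))

def internal_weight(edges, cluster):
    """Total weight of edges with both endpoints inside the cluster."""
    c = set(cluster)
    return sum(w for u, v, w in edges if u in c and v in c)

def should_merge(edges, c1, c2, threshold=0.5):
    """Merge if inter-cluster weight is large relative to internal weight."""
    inter = edge_weight_between(edges, c1, c2)
    internal = 0.5 * (internal_weight(edges, c1) + internal_weight(edges, c2))
    if internal == 0:
        return inter > 0
    return inter / internal >= threshold
```

Because the criterion is relative, a pair of sparse clusters joined by a few edges can still merge, while the same absolute connection between two dense clusters would not pass.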
Dragon2000: Standard-Cell Placement Tool for Large Industry Circuits
In Proc. Int. Conf. on Computer-Aided Design, 2000
"... In this paper, we develop a new standard cell placement tool, Dragon2000, to solve large scale placement problem effectively. A topdown hierarchical approach is used in Dragon2000. Stateoftheart partitioning tools are tightly integrated with wirelength minimization techniques to achieve superior ..."
Abstract

Cited by 90 (0 self)
 Add to MetaCart
(Show Context)
In this paper, we develop a new standard-cell placement tool, Dragon2000, to solve the large-scale placement problem effectively. A top-down hierarchical approach is used in Dragon2000. State-of-the-art partitioning tools are tightly integrated with wirelength minimization techniques to achieve superior performance. We argue that net-cut minimization is a good and important shortcut to solving the large-scale placement problem. Experimental results show that minimizing net-cut is more important than greedily obtaining a wirelength-optimal placement at intermediate hierarchical levels. We run Dragon2000 on the recently released large benchmark suite ISPD98 as well as on MCNC circuits. For circuits with more than 100k cells, compared to iTools1.4.0, Dragon2000 produces slightly better placement results (1.4%) while spending much less time (2× speedup). This is also the first published placement result on the publicly available large industrial circuits.
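The top-down hierarchical flow above can be sketched as recursive bisection: the cell set is split, each half is assigned to one half of the placement region, and the process recurses with alternating cut directions. The partitioner below is a deliberate placeholder (an even split); a tool like Dragon2000 would invoke a min-net-cut partitioner at this point.

```python
# Sketch of top-down recursive bisection placement. Names and the even-split
# partitioner are illustrative, not the actual Dragon2000 implementation.

def bisect(cells):
    """Placeholder partitioner: even split. A real tool minimizes net cut."""
    mid = len(cells) // 2
    return cells[:mid], cells[mid:]

def place(cells, x0, x1, y0, y1, vertical=True, min_cells=1):
    """Recursively assign each cell to a region; returns {cell: (cx, cy)}."""
    if len(cells) <= min_cells:
        cx, cy = (x0 + x1) / 2, (y0 + y1) / 2
        return {c: (cx, cy) for c in cells}
    left, right = bisect(cells)
    result = {}
    if vertical:  # split the region with a vertical cut line
        xm = (x0 + x1) / 2
        result.update(place(left, x0, xm, y0, y1, False, min_cells))
        result.update(place(right, xm, x1, y0, y1, False, min_cells))
    else:         # split with a horizontal cut line
        ym = (y0 + y1) / 2
        result.update(place(left, x0, x1, y0, ym, True, min_cells))
        result.update(place(right, x0, x1, ym, y1, True, min_cells))
    return result
```

The abstract's argument is that the quality of `bisect` (net-cut minimization) matters more at intermediate levels than greedy wirelength optimization within each region.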
Multilevel hypergraph partitioning: Applications in VLSI design
ACM/IEEE Design Automation Conference, 1997
"... Traditional hypergraph partitioning algorithms compute a bisection a graph such that the number of hyperedges that are cut by the partitioning is minimized and each partition has an equal number of vertices. The task of minimizing the cut can be considered as the objective and the requirement that t ..."
Abstract

Cited by 73 (3 self)
 Add to MetaCart
Traditional hypergraph partitioning algorithms compute a bisection of a graph such that the number of hyperedges cut by the partitioning is minimized and each partition has an equal number of vertices. The task of minimizing the cut can be considered the objective, and the requirement that the partitions be of the same size can be considered the constraint. In this paper we extend the partitioning problem by incorporating an arbitrary number of balancing constraints. In our formulation, a vector of weights is assigned to each vertex, and the goal is to produce a bisection such that the partitioning satisfies a balancing constraint associated with each weight, while attempting to minimize the cut. We present new multi-constraint hypergraph partitioning algorithms that are based on the multilevel partitioning paradigm. We experimentally evaluate the effectiveness of our multi-constraint partitioners on a variety of synthetically generated problems.
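The multi-constraint formulation can be made concrete with two small helpers: a feasibility check that every component of the weight vectors is balanced across the two parts within a tolerance, and the hyperedge-cut objective. This is an illustrative sketch of the problem statement, not of the paper's multilevel algorithms; the tolerance `eps` is an assumed parameter.

```python
# Multi-constraint bisection: feasibility (per-component balance) and
# objective (hyperedge cut). Names are illustrative.

def is_balanced(weights, part, eps=0.1):
    """weights: {v: [w1, w2, ...]}, part: {v: 0 or 1}.
    Every weight component must split roughly in half (within eps)."""
    k = len(next(iter(weights.values())))
    for i in range(k):
        total = sum(w[i] for w in weights.values())
        side0 = sum(weights[v][i] for v in weights if part[v] == 0)
        if abs(side0 - total / 2) > eps * total / 2:
            return False
    return True

def hyperedge_cut(hyperedges, part):
    """Count hyperedges whose vertices land in both parts."""
    return sum(1 for e in hyperedges if len({part[v] for v in e}) > 1)
```

With a single weight per vertex this reduces to the traditional problem; the extension is that a bisection must satisfy all k balance conditions simultaneously.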
Symbolic compositional verification by learning assumptions
In CAV, 2005
"... Abstract. The verification problem for a system consisting of components can be decomposed into simpler subproblems for the components using assumeguarantee reasoning. However, such compositional reasoning requires user guidance to identify appropriate assumptions for components. In this paper, we ..."
Abstract

Cited by 68 (7 self)
 Add to MetaCart
(Show Context)
The verification problem for a system consisting of components can be decomposed into simpler subproblems for the components using assume-guarantee reasoning. However, such compositional reasoning requires user guidance to identify appropriate assumptions for components. In this paper, we propose an automated solution for discovering assumptions based on the L* algorithm for active learning of regular languages. We present a symbolic implementation of the learning algorithm, and incorporate it in the model checker NuSMV. Our experiments demonstrate significant savings in the computational requirements of symbolic model checking.
A Graph Theoretic Approach to Software Watermarking, 2001
"... We present a graph theoretic approach for watermarking software in a robust fashion. While watermarking software that are small in size (e.g. a few kilobytes) may be infeasible through this approach, it seems to be a viable scheme for large applications. Our approach works with control/data flow gra ..."
Abstract

Cited by 64 (0 self)
 Add to MetaCart
We present a graph-theoretic approach for watermarking software in a robust fashion. While watermarking software that is small in size (e.g., a few kilobytes) may be infeasible through this approach, it seems to be a viable scheme for large applications. Our approach works with control/data flow graphs and uses abstractions, approximate k-partitions, and a random walk method to embed the watermark, with the goal of minimizing and controlling the additions to be made for embedding, while keeping the estimated effort to undo the watermark (WM) as high as possible. The watermarks are embedded so that small changes to the software or flow graph are unlikely to disable detection by a probabilistic algorithm that has a secret. This is done by using some relatively robust graph properties and error-correcting codes. Under some natural assumptions about the code added to embed the WM, locating the WM by an attacker is related to some graph approximation problems. Since little theoretical foundation exists for the hardness of typical instances of graph approximation problems, we present heuristics to generate such hard instances and, in a limited case, present a heuristic analysis of how hard it is to separate the WM in an information-theoretic model. We describe some related experimental work. The approach and methods described here are also suitable for solving the problem of software tamper resistance.
Beyond pairwise clustering
In IEEE Computer Society Conference on Computer Vision and Pattern Recognition
"... We consider the problem of clustering in domains where the affinity relations are not dyadic (pairwise), but rather triadic, tetradic or higher. The problem is an instance of the hypergraph partitioning problem. We propose a twostep algorithm for solving this problem. In the first step we use a nove ..."
Abstract

Cited by 61 (3 self)
 Add to MetaCart
(Show Context)
We consider the problem of clustering in domains where the affinity relations are not dyadic (pairwise), but rather triadic, tetradic, or higher. The problem is an instance of the hypergraph partitioning problem. We propose a two-step algorithm for solving this problem. In the first step we use a novel scheme to approximate the hypergraph using a weighted graph. In the second step a spectral partitioning algorithm is used to partition the vertices of this graph. The algorithm is capable of handling hyperedges of all orders including order two, thus incorporating information of all orders simultaneously. We present a theoretical analysis that relates our algorithm to an existing hypergraph partitioning algorithm and explain the reasons for its superior performance. We report the performance of our algorithm on a variety of computer vision problems and compare it to several existing hypergraph partitioning algorithms.
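One common way to carry out the first step above (approximating a hypergraph by a weighted graph) is clique expansion: every hyperedge contributes a pairwise edge between each pair of its vertices. The 1/(s-1) discount for hyperedges of size s used below is one standard weighting, chosen here for illustration; the paper's approximation scheme is more elaborate.

```python
# Clique expansion: approximate a hypergraph by a weighted pairwise graph.
from itertools import combinations
from collections import defaultdict

def clique_expansion(hyperedges):
    """Return {frozenset({u, v}): weight} for the pairwise approximation.
    A hyperedge of size s adds weight 1/(s-1) to each of its vertex pairs."""
    weights = defaultdict(float)
    for e in hyperedges:
        s = len(e)
        if s < 2:
            continue
        for u, v in combinations(e, 2):
            weights[frozenset((u, v))] += 1.0 / (s - 1)
    return dict(weights)
```

The resulting weighted graph can then be handed to any spectral bisection routine, which is the second step the abstract describes.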
Improved Algorithms for Hypergraph Bipartitioning
In Proceedings of the Asia and South Pacific Design Automation Conference, 2000
"... Multilevel FiducciaMattheyses (MLFM) hypergraph partitioning [3, 22, 24] is a fundamental optimization in VLSI CAD physical design. The leading implementation, hMetis [23], has since 1997 proved itself substantially superior in both runtime and solution quality to even very recent works (e.g., [13, ..."
Abstract

Cited by 59 (15 self)
 Add to MetaCart
(Show Context)
Multilevel Fiduccia-Mattheyses (MLFM) hypergraph partitioning [3, 22, 24] is a fundamental optimization in VLSI CAD physical design. The leading implementation, hMetis [23], has since 1997 proved itself substantially superior in both runtime and solution quality to even very recent works (e.g., [13, 17, 25]). In this work, we present two sets of results: (i) new techniques for flat FM-based hypergraph partitioning (which is the core of multilevel implementations), and (ii) a new multilevel implementation that offers leading-edge performance. Our new techniques for flat partitioning confirm the conjecture from [10], suggesting that specialized partitioning heuristics may be able to actively exploit fixed nodes in partitioning instances arising in the driving top-down placement context. Our FM variant is competitive with traditional FM on instances without terminals [1] and considerably superior on instances with fixed nodes (i.e., arising during top-down placement [8]). Our multilevel ...
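The move gain at the heart of flat FM partitioning can be sketched briefly. This is a simplification for ordinary graph edges (real FM works on hyperedges with gain buckets and tentative moves): the gain of moving v is the number of cut edges made internal minus the number of internal edges made cut, and fixed vertices, such as terminals propagated by top-down placement, are simply never movable.

```python
# Simplified Fiduccia-Mattheyses-style move gain on a graph, with support
# for fixed (unmovable) vertices. Illustrative, not a full FM pass.

def gain(adj, part, v):
    """adj: {v: [neighbors]}, part: {v: 0 or 1}. Gain of moving v
    to the other side: +1 per cut edge healed, -1 per edge newly cut."""
    g = 0
    for u in adj[v]:
        g += 1 if part[u] != part[v] else -1
    return g

def best_move(adj, part, fixed):
    """Highest-gain movable vertex, or None if no positive-gain move exists."""
    candidates = [(gain(adj, part, v), v) for v in adj if v not in fixed]
    if not candidates:
        return None
    g, v = max(candidates)
    return v if g > 0 else None
```

A full FM pass would also accept zero- or negative-gain moves tentatively and keep the best prefix of the move sequence, which is what lets FM escape local minima.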
Multilevel Refinement for Combinatorial Optimisation Problems, 2001
"... Abstract. We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (some ..."
Abstract

Cited by 57 (5 self)
 Add to MetaCart
(Show Context)
We consider the multilevel paradigm and its potential to aid the solution of combinatorial optimisation problems. The multilevel paradigm is a simple one, which involves recursive coarsening to create a hierarchy of approximations to the original problem. An initial solution is found (sometimes for the original problem, sometimes for the coarsest) and then iteratively refined at each level. As a general solution strategy, the multilevel paradigm has been in use for many years and has been applied to many problem areas (most notably in the form of multigrid techniques). However, with the exception of the graph partitioning problem, multilevel techniques have not been widely applied to combinatorial optimisation problems. In this paper we address the issue of multilevel refinement for such problems and, with the aid of examples and results in graph partitioning, graph colouring and the travelling salesman problem, make a case for its use as a metaheuristic. The results provide compelling evidence that, although the multilevel framework cannot be considered a panacea for combinatorial problems, it can provide an extremely useful addition to the combinatorial optimisation toolkit. We also give a possible explanation for the underlying process and extract some generic guidelines for its future use on other combinatorial problems.
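The coarsen/solve/refine cycle described above is problem-independent, which is exactly what makes it a candidate metaheuristic. Here is a generic recursive skeleton with the four problem-specific operations supplied as callables, plus a trivial toy instantiation (summing a list) that only exercises the control flow, not a real optimisation.

```python
# Generic multilevel skeleton: coarsen until small, solve the coarsest
# instance, then project the solution back up and refine at each level.

def multilevel(problem, coarsen, solve, project, refine, min_size):
    if problem.size <= min_size:
        return solve(problem)
    coarse = coarsen(problem)
    coarse_solution = multilevel(coarse, coarsen, solve, project, refine, min_size)
    solution = project(coarse_solution, problem)
    return refine(solution, problem)

# Toy instantiation (illustrative only): the "problem" is a list of numbers,
# coarsening sums adjacent pairs, solving returns the total, and
# projection/refinement are identity maps.
class ListProblem:
    def __init__(self, data):
        self.data = data
        self.size = len(data)

def coarsen(p):
    d = p.data
    pairs = [d[i] + d[i + 1] for i in range(0, len(d) - 1, 2)]
    return ListProblem(pairs + ([d[-1]] if len(d) % 2 else []))

def solve(p):
    return sum(p.data)

def project(sol, p):
    return sol

def refine(sol, p):
    return sol
```

In graph partitioning, `coarsen` would contract matched vertex pairs, `project` would expand a partition of the coarse graph to the fine graph, and `refine` would run a local search such as FM at each level.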
Edge Separability Based Circuit Clustering with Application to Circuit Partitioning
IEEE/ACM Asia South Pacific Design Automation Conference, 2000
"... In this paper, we introduce a new efficient O(n log n) graph search based bottomup clustering algorithm named ESC (Edge Separability based Clustering). Unlike existing bottomup algorithms that are based on local connectivity information of the netlist, ESC exploits more global connectivity inform ..."
Abstract

Cited by 45 (23 self)
 Add to MetaCart
(Show Context)
In this paper, we introduce a new efficient O(n log n) graph-search-based bottom-up clustering algorithm named ESC (Edge Separability based Clustering). Unlike existing bottom-up algorithms that are based on local connectivity information of the netlist, ESC exploits more global connectivity information, edge separability, to guide the clustering process while carefully monitoring cluster area balance. Computing the edge separability for a given edge e = (x, y) in an edge-weighted undirected graph G(V, E, s, w) is equivalent to finding the x-y min-cut. We then show that the simple and efficient algorithm CAPFOREST [14] can be used to provide a good estimate of edge separability for all edges in G without any network flow computation. Experiments based on the large-scale ISPD98 [1] benchmark circuits confirm that exploiting edge separability yields better-quality partitioning solutions than various bottom-up clustering algorithms proposed in the literature, including Absorpt...
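A closely related flow-free primitive, offered here as an illustration rather than the paper's CAPFOREST algorithm itself, is the maximum-adjacency ordering (Nagamochi-Ibaraki style): repeatedly pick the vertex most strongly connected to those already chosen. The connectivity of the last vertex to the rest is then an exact min-cut between the last two vertices in the ordering, and such values can serve as separability estimates without any flow computation.

```python
# Maximum-adjacency ordering on an edge-weighted undirected graph,
# yielding the "cut of the phase" (connectivity of the last vertex).
# Simplified sketch; CAPFOREST uses a related forest decomposition.

def ma_ordering(adj):
    """adj: {v: {u: weight}} (symmetric). Returns (ordering, cut_of_phase)."""
    start = next(iter(adj))
    order = [start]
    # attach[v] = total edge weight from v to the vertices chosen so far
    attach = {v: adj[start].get(v, 0.0) for v in adj if v != start}
    last_weight = 0.0
    while attach:
        v = max(attach, key=attach.get)   # most strongly attached vertex
        last_weight = attach.pop(v)
        order.append(v)
        for u, w in adj[v].items():
            if u in attach:
                attach[u] += w
    return order, last_weight
```

Running such phases with vertex contraction is the Stoer-Wagner global min-cut algorithm; using the per-phase values as cheap connectivity estimates for edges is the spirit of the separability-guided clustering described above.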