Clear and compress: Computing persistent homology in chunks
In TopoInVis, 2013
Cited by 10 (3 self)
We present a parallelizable algorithm for computing the persistent homology of a filtered chain complex. Our approach differs from the commonly used reduction algorithm by first computing persistence pairs within local chunks, then simplifying the unpaired columns, and finally applying standard reduction on the simplified matrix. The approach generalizes a technique by Günther et al., which uses discrete Morse theory to compute persistence; we derive the same worst-case complexity bound in a more general context. The algorithm employs several practical optimization techniques which are of independent interest. Our sequential implementation of the algorithm is competitive with state-of-the-art methods, and we improve the performance through parallelized computation.
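For reference, the standard reduction algorithm that the chunk approach builds on can be sketched as follows. This is a minimal sketch over Z/2 coefficients, with the boundary matrix stored as a list of column supports in filtration order; the data layout and function names are illustrative assumptions, not the paper's implementation:

```python
def reduce_boundary_matrix(columns):
    """Standard persistence reduction over Z/2.

    columns[j] is the set of row indices carrying a 1 in column j of
    the boundary matrix, columns ordered by filtration value.  Returns
    the persistence pairs (birth index, death index)."""
    reduced = []        # reduced columns processed so far
    pivot_owner = {}    # pivot (lowest nonzero row) -> owning column
    pairs = []
    for j, col in enumerate(columns):
        col = set(col)
        # add earlier columns until the pivot is unique or the column is zero
        while col and max(col) in pivot_owner:
            col ^= reduced[pivot_owner[max(col)]]   # column addition mod 2
        reduced.append(col)
        if col:                                     # column j kills a cycle
            pivot_owner[max(col)] = j
            pairs.append((max(col), j))
    return pairs

# Filtered triangle: vertices 0, 1, 2; edges 3={0,1}, 4={1,2}, 5={0,2};
# 2-cell 6 bounded by the three edges.
boundary = [set(), set(), set(), {0, 1}, {1, 2}, {0, 2}, {3, 4, 5}]
print(reduce_boundary_matrix(boundary))   # [(1, 3), (2, 4), (5, 6)]
```

The chunk algorithm of the paper applies this pairing within local blocks first, so that most columns are already zero or paired before the global pass over the simplified matrix.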
Parallel Computation of 2D Morse-Smale Complexes
Cited by 9 (3 self)
The Morse-Smale complex is a useful topological data structure for the analysis and visualization of scalar data. This paper describes an algorithm that processes all mesh elements of the domain in parallel to compute the Morse-Smale complex of large two-dimensional data sets at interactive speeds. We employ a reformulation of the Morse-Smale complex using Forman's discrete Morse theory and achieve scalability by computing the discrete gradient using local accesses only. We also introduce a novel approach to merge gradient paths that ensures accurate geometry of the computed complex. We demonstrate that our algorithm performs well on both multicore environments and on massively parallel architectures such as the GPU. Index Terms: topology-based methods, discrete Morse theory, large datasets, gradient pairs, multicore, 2D scalar functions.
Efficient Computation of 3D Morse-Smale Complexes and Persistent Homology using Discrete Morse Theory
Cited by 8 (3 self)
We propose an efficient algorithm that computes the Morse-Smale complex for 3D grayscale images. This complex allows for an efficient computation of persistent homology since it is, in general, much smaller than the input data but still contains all necessary information. Our method improves a recently proposed algorithm to extract the Morse-Smale complex in terms of memory consumption and running time. It also allows for a parallel computation of the complex. The computational complexity of the Morse-Smale complex extraction solely depends on the topological complexity of the input data. The persistence is then computed using the Morse-Smale complex by applying an existing algorithm with a good practical running time. We demonstrate that our method allows for the computation of persistent homology for large data on commodity hardware.
Efficient computation of a hierarchy of discrete 3D gradient vector fields
In Proc. TopoInVis, 2011
Cited by 6 (3 self)
This paper introduces a novel combinatorial algorithm to compute a hierarchy of discrete gradient vector fields for three-dimensional scalar fields. The hierarchy is defined by an importance measure and represents the combinatorial gradient flow at different levels of detail. The presented algorithm is based on Forman's discrete Morse theory, which guarantees topological consistency and algorithmic robustness. In contrast to previous work, our algorithm combines memory and runtime efficiency. It thereby lends itself to the analysis of large data sets. A discrete gradient vector field is also a compact representation of the underlying extremal structures: the critical points, separation lines, and surfaces. Given a certain level of detail, an explicit geometric representation of these structures can be extracted using simple and fast graph algorithms.
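To illustrate the basic building block shared by these discrete Morse methods, here is a hedged sketch of a lower-star discrete gradient on the simplest possible complex, a 1D path. The pairing rule (each vertex pairs with its steepest lower-star edge; leftover lower-star edges become critical) mirrors the construction such algorithms perform on 3D cell complexes; all names and the tie-breaking convention are illustrative assumptions:

```python
def discrete_gradient_1d(values):
    """Lower-star discrete gradient on a path complex.

    Vertices are 0..n-1 with scalar values; edge e connects vertices
    e and e+1.  Ties are broken by vertex index.  Returns the gradient
    pairs and the critical cells."""
    n = len(values)
    def higher(u, v):                    # is vertex u above vertex v?
        return (values[u], u) > (values[v], v)
    def other_end(e, v):                 # endpoint of edge e that is not v
        return e if e == v - 1 else e + 1
    pairs, critical = [], []
    for v in range(n):
        # edges incident to v whose other endpoint lies strictly below v
        lower_star = [e for e in (v - 1, v)
                      if 0 <= e < n - 1 and higher(v, other_end(e, v))]
        if not lower_star:
            critical.append(('vertex', v))           # local minimum
        else:
            steepest = min(lower_star, key=lambda e: values[other_end(e, v)])
            pairs.append((('vertex', v), ('edge', steepest)))
            for e in lower_star:
                if e != steepest:
                    critical.append(('edge', e))     # critical 1-cell
    return pairs, critical

# Values 0, 2, 1: two minima (vertices 0 and 2) and one critical edge.
print(discrete_gradient_1d([0, 2, 1]))
```

Every cell appears in at most one pair, so the result is a valid discrete vector field; the critical cells left over are exactly the extremal structures the hierarchy of the paper then simplifies.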
Distributed Merge Trees
Cited by 4 (2 self)
Improved simulations and sensors are producing datasets whose increasing complexity exhausts our ability to visualize and comprehend them directly. To cope with this problem, we can detect and extract significant features in the data and use them as the basis for subsequent analysis. Topological methods are valuable in this context because they provide robust and general feature definitions. As the growth of serial computational power has stalled, data analysis is becoming increasingly dependent on massively parallel machines. To satisfy the computational demand created by complex datasets, algorithms need to effectively utilize these computer architectures. The main strength of topological methods, their emphasis on global information, turns into an obstacle during parallelization. We present two approaches to alleviate this problem. We develop a distributed representation of the merge tree that avoids computing the global tree on a single processor and lets us parallelize subsequent queries. To account for the increasing number of cores per processor, we develop a new data structure that lets us take advantage of multiple shared-memory cores to parallelize the work on a single node. Finally, we present experiments that illustrate the strengths of our approach as well as help identify future challenges.
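The sequential core that such distributed schemes parallelize is a single union-find sweep. The sketch below computes an augmented merge (join) tree of a scalar field on an arbitrary graph; it is an illustration under assumed names, not the paper's distributed data structure:

```python
def merge_tree(values, edges):
    """Join tree of a scalar field given on graph vertices.

    values[v] is the scalar value at vertex v; edges is a list of
    (u, v) pairs.  Sweeps vertices from low to high and records an arc
    whenever a component grows or two components merge.  Returns the
    tree arcs as (lower node, upper node) pairs."""
    n = len(values)
    order = sorted(range(n), key=lambda v: (values[v], v))
    rank = {v: i for i, v in enumerate(order)}
    parent = list(range(n))             # union-find forest
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v
    top = list(range(n))                # highest swept vertex per component
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    arcs = []
    for v in order:
        for u in adj[v]:
            if rank[u] < rank[v]:       # u was swept before v
                ru, rv = find(u), find(v)
                if ru != rv:
                    arcs.append((top[ru], v))
                    parent[ru] = rv
                    top[rv] = v
    return arcs

# W-shaped signal on a path: minima at vertices 0 and 2 merge at
# vertex 1; the merged component continues up to vertex 3.
print(merge_tree([0, 2, 1, 3], [(0, 1), (1, 2), (2, 3)]))  # [(0, 1), (2, 1), (1, 3)]
```

The distributed version of the paper avoids exactly this global sweep by keeping per-block local trees plus enough boundary information to answer global queries.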
Generalized Topological Simplification of Scalar Fields on Surfaces
Cited by 4 (3 self)
Fig. 1. Given an input scalar field f (left), our combinatorial algorithm generates a simplified function g that provably admits only critical points from a constrained subset of the singularities of f. Our approach is completely oblivious to the employed feature selection strategy, while guaranteeing a small distance ‖f − g‖∞ for data-fitting purposes. Thus it supports application-dependent simplification scenarios such as the removal of singularities based on local geometrical measures, interactive user selection, or even random selection. The topology of the resulting field is summarized with the inset Reeb graphs for illustration purposes.
We present a combinatorial algorithm for the general topological simplification of scalar fields on surfaces. Given a scalar field f, our algorithm generates a simplified field g that provably admits only critical points from a constrained subset of the singularities of f, while guaranteeing a small distance ‖f − g‖∞ for data-fitting purposes. In contrast to previous algorithms, our approach is oblivious to the strategy used for selecting features of interest and allows critical points to be removed arbitrarily. When topological persistence is used to select the features of interest, our algorithm produces a standard ε-simplification. Our approach is based on a new iterative algorithm for the constrained reconstruction of sub- and sur-level sets. Extensive experiments show that the number of iterations required for our algorithm to converge is rarely greater than 2 and never greater than 5, yielding O(n log n) practical time performance. The algorithm handles triangulated surfaces with or without boundary and is robust to the presence of multi-saddles in the input. It is simple to implement, fast in practice, and more general than previous techniques. Practically, our approach allows a user to arbitrarily simplify the topology of an input function and robustly generate the corresponding simplified function.
Extraction of Dominant Extremal Structures in Volumetric Data using Separatrix Persistence
In Computer Graphics Forum, 2012
Cited by 4 (1 self)
Extremal lines and surfaces are features of a 3D scalar field where the scalar function becomes minimal or maximal with respect to a local neighborhood. These features are important in many applications, e.g., computed tomography, fluid dynamics, cell biology. We present a novel topological method to extract these features using discrete Morse theory. In particular, we extend the notion of Separatrix Persistence from 2D to 3D, which gives us a robust estimation of the feature strength for extremal lines and surfaces. Not only does it allow us to determine the most important (parts of) extremal lines and surfaces, it also serves as a robust filtering measure for noise-induced structures. Our purely combinatorial method does not require derivatives or any other numerical computations.
Combinatorial Gradient Fields for 2D Images with Empirically Convergent Separatrices
, 1208
Cited by 2 (0 self)
This paper proposes an efficient probabilistic method that computes combinatorial gradient fields for two-dimensional image data. In contrast to existing algorithms, this approach yields a geometric Morse-Smale complex that converges almost surely to its continuous counterpart when the image resolution is increased. This approach is motivated using basic ideas from probability theory and builds upon an algorithm from discrete Morse theory with a strong mathematical foundation. While a formal proof is only hinted at, we do provide a thorough numerical evaluation of our method and compare it to established algorithms.
Total Variation Meets Topological Persistence: A First Encounter
2010
Cited by 1 (0 self)
We present first insights into the relation between two popular yet apparently dissimilar approaches to denoising of one-dimensional signals, based on (i) total variation (TV) minimization and (ii) ideas from topological persistence. While a close relation between (i) and (ii) might phenomenologically not be unexpected, our work appears to be the first to make this connection precise for one-dimensional signals. We provide a link between (i) and (ii) that builds on the equivalence between TV-L2 regularization and taut strings and leads to a novel and efficient denoising algorithm that is contrast-preserving and operates in O(n log n) time, where n is the size of the input.
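On the persistence side of this connection, the 0-dimensional pairs of a one-dimensional signal can be computed with a single union-find sweep over the samples; a persistence-based denoiser then thresholds these pairs by their lifespan. The following is an illustrative sketch (names and conventions are my own, not the paper's):

```python
def persistence_pairs_1d(signal):
    """0-dimensional persistence of the sublevel sets of a 1D signal.

    Sweeps samples from low to high; when two components meet, the one
    with the higher (younger) minimum dies.  Returns (birth, death)
    value pairs; the global minimum's class never dies and is omitted."""
    n = len(signal)
    order = sorted(range(n), key=lambda i: (signal[i], i))
    root = [None] * n                 # None = sample not yet swept
    def find(i):
        while root[i] != i:
            root[i] = root[root[i]]   # path halving
            i = root[i]
        return i
    argmin = {}                       # component root -> index of its minimum
    pairs = []
    for i in order:
        root[i] = i
        argmin[i] = i
        for j in (i - 1, i + 1):
            if 0 <= j < n and root[j] is not None:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # the component with the higher minimum dies at sample i
                keep, die = ((ri, rj) if signal[argmin[ri]] <= signal[argmin[rj]]
                             else (rj, ri))
                if argmin[die] != i:          # skip trivial (f(i), f(i)) pairs
                    pairs.append((signal[argmin[die]], signal[i]))
                root[die] = keep
    return pairs

# One local minimum (value 1) dies at the local maximum (value 4);
# the global minimum (value 0) persists forever.
print(persistence_pairs_1d([3, 1, 4, 0, 2]))   # [(1, 4)]
```

Discarding the pairs whose lifespan death − birth falls below a threshold ε gives an ε-simplification of the signal, which is the topological quantity the paper relates to TV denoising.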