**Parallel Computation of Persistent Homology using the Blowup Complex**

Joint Work with Dmitriy Morozov

Full Manuscript, Proceedings of the Annual Symposium on Parallelism in Algorithms and Architectures, 2015.

We describe a parallel algorithm that computes persistent homology, an algebraic descriptor of a filtered topological space. Our algorithm is distinguished by operating on a spatial decomposition of the domain, as opposed to a decomposition with respect to the filtration. We rely on a classical construction, called the Mayer–Vietoris blowup complex, to glue global topological information about a space from its disjoint subsets. We introduce an efficient algorithm to perform this gluing operation, which may be of independent interest, and describe how to process the domain hierarchically. We report on a set of experiments that help assess the strengths and identify the limitations of our method.
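To make the gluing construction concrete, here is a minimal sketch of enumerating the cells of a Mayer–Vietoris blowup complex for a cover of a simplicial complex. The representation (simplices as frozensets of vertices, cover pieces as plain Python sets) and the function name are my own illustrative choices, not the paper's implementation:

```python
from itertools import combinations

def blowup_cells(cover):
    """Enumerate cells of the Mayer-Vietoris blowup complex.

    cover: list of subcomplexes, each a set of simplices (frozensets
    of vertices), whose union covers the full complex.
    Returns pairs (sigma, J): sigma is a simplex lying in every cover
    piece indexed by J, and the product cell sigma x Delta^J has
    dimension dim(sigma) + |J| - 1.
    """
    n = len(cover)
    cells = []
    for r in range(1, n + 1):
        for J in combinations(range(n), r):
            # Simplices common to all cover pieces indexed by J.
            common = set.intersection(*(cover[j] for j in J))
            for sigma in common:
                cells.append((sigma, J))
    return cells

# Tiny example: a path a-b-c split into two pieces overlapping at b.
K0 = {frozenset('a'), frozenset('b'), frozenset('ab')}
K1 = {frozenset('b'), frozenset('c'), frozenset('bc')}
cells = blowup_cells([K0, K1])
# 3 cells from K0, 3 from K1, and 1 from the overlap {b}: 7 in total.
```

The blowup complex is homotopy equivalent to the original space, which is what lets local computations on the disjoint pieces be glued into global topological information.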

We design and implement a framework for the parallel computation of homology of cellular spaces over field coefficients, by decomposing the space. Theoretically, we show that finding an optimal decomposition into local pieces is NP-hard. In practice, we achieve a speedup of homology computation on a 3-dimensional complex with about 10 million simplices using 11 cores.

Recently, a large number of graph separator problems have been shown to be NP-hard.

Surprisingly, we found that -Subgraph-Balanced-Vertex-Separator, an important variant, had been overlooked. In this work, “Yet Another Graph Partitioning Problem is NP-Hard,” we show that -Subgraph-Balanced-Vertex-Separator is in fact NP-hard, despite the constraints of our new problem being harder to satisfy than those of the original problem.
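For readers unfamiliar with the problem family, here is a brute-force sketch of the classical balanced vertex separator underlying these variants: find a smallest vertex set whose removal leaves every connected component small. The function name, graph encoding, and balance parameter are my own illustrative choices; the exponential search is consistent with the problem (and, per this work, its variant) being NP-hard in general:

```python
from itertools import combinations

def balanced_vertex_separator(adj, alpha):
    """Smallest vertex set S whose removal leaves every connected
    component with at most alpha * |V| vertices (brute force).

    adj: dict mapping each vertex to the set of its neighbors.
    """
    verts = list(adj)
    bound = alpha * len(verts)

    def components(removed):
        seen, comps = set(removed), []
        for v in verts:
            if v in seen:
                continue
            stack, comp = [v], set()
            while stack:                 # depth-first search
                u = stack.pop()
                if u in seen:
                    continue
                seen.add(u)
                comp.add(u)
                stack.extend(adj[u] - seen)
            comps.append(comp)
        return comps

    # Try separators in order of increasing size.
    for k in range(len(verts) + 1):
        for S in combinations(verts, k):
            if all(len(c) <= bound for c in components(S)):
                return set(S)

# Path on 5 vertices: removing the middle vertex splits it into
# two components of size 2, satisfying the bound 2/5 * 5 = 2.
path = {1: {2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 5}, 5: {4}}
S = balanced_vertex_separator(path, alpha=2/5)
```

The variant studied here additionally constrains the structure of the separated pieces, which is what makes the NP-hardness result non-obvious.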
