# Topological data analysis

Topological data analysis (TDA) is a relatively new area of study whose intended applications include data mining and computer vision. Its main problems are:

1. how one infers high-dimensional structure from low-dimensional representations; and
2. how one assembles discrete points into global structure.

The human brain easily extracts global structure from representations in a strictly lower dimension: for example, we infer a 3D environment from the 2D image formed on each retina. The inference of global structure also occurs when converting discrete data into continuous images; dot-matrix printers and televisions, for instance, communicate images via arrays of discrete points.

The main method of topological data analysis proceeds as follows:

1. Replace a set of data points with a family of simplicial complexes, indexed by a proximity parameter.
2. Analyse these simplicial complexes via algebraic topology, specifically via the theory of persistent homology.[1]
3. Encode the persistent homology of the data set as a parameterized version of a Betti number, called a barcode.[1]
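As a small illustration of this pipeline, consider degree 0: connected components of the Vietoris–Rips filtration are born at scale 0 and die when an edge merges them into an older component, so the degree-0 barcode can be computed with a union–find structure (the merge events are exactly the edges of the Euclidean minimum spanning tree). The following is a minimal self-contained sketch; the function name `h0_barcode` is illustrative and not from any particular library.

```python
import itertools
import math

def h0_barcode(points):
    """Degree-0 persistence barcode of a point cloud.

    Every component is born at scale 0; it dies at the length of the edge
    that merges it into another component in the Vietoris-Rips filtration.
    This is Kruskal's algorithm in disguise: merge events are MST edges.
    """
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    # All pairwise distances, sorted: the order in which edges appear.
    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )

    bars = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            bars.append((0.0, d))   # a component born at 0 dies at scale d
    bars.append((0.0, math.inf))    # one component persists at every scale
    return sorted(bars, key=lambda b: b[1])
```

For two well-separated pairs of points, the barcode shows two short bars (within-pair merges), one long bar (the gap between the pairs), and one infinite bar.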

## Point cloud data

Data is often represented as points in Euclidean $n$-dimensional space $\mathbb{E}^n$. The global shape of the data may provide information about the phenomena that the data represent.

One type of data set for which global features are certainly present is point cloud data coming from physical objects in 3D: a laser can scan an object at a set of discrete points, and the resulting cloud of points can be used in a computer representation of the object. More generally, point cloud data is any collection of points in $\mathbb{E}^n$, or a (perhaps noisy) sample of points on a lower-dimensional subset.

For point clouds in low-dimensional spaces there are numerous approaches for inferring features based on planar projections in the fields of computer graphics and statistics. Topological data analysis is needed when the spaces are high-dimensional or too twisted to allow planar projections to faithfully represent the features of the point cloud.

To convert a point cloud in a metric space into a global object, one uses the point cloud as the vertex set of a graph whose edges are determined by proximity, turns the graph into a simplicial complex, and studies the complex with algebraic topology. An alternative approach is the minimum spanning tree-based method used in geometric data clustering.[2] If the data points fall into clusters, the minimum spanning tree makes those clusters, and with them some of the geometry of the point cloud, explicit.
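The minimum-spanning-tree approach can be sketched as follows: build the Euclidean MST with Kruskal's algorithm and delete its longest edges, so that the surviving components are the clusters. The function name `mst_clusters` and the details below are illustrative, not taken from [2].

```python
import itertools
import math

def mst_clusters(points, k):
    """Cluster a point cloud into k groups by building the Euclidean
    minimum spanning tree (Kruskal) and discarding its k-1 longest edges.
    Returns one integer cluster label per input point.
    """
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(n), 2)
    )
    mst = []
    for d, i, j in edges:          # Kruskal: accept edges that join components
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            mst.append((d, i, j))

    # Keep the n-k shortest MST edges; the k-1 longest separate the clusters.
    parent = list(range(n))
    for d, i, j in mst[: n - k]:
        parent[find(i)] = find(j)

    labels = {}
    return [labels.setdefault(find(i), len(labels)) for i in range(n)]
```

On a point cloud with two well-separated groups, cutting the single longest MST edge recovers the two groups.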

## Background

Topological data analysis comprises different methods and representations whose purpose is to extract global structure from variegated data represented as a point cloud, as above. The following are various methods for doing so.

## Combinatorial representations

1. Čech complex. The Čech complex $C_\varepsilon$ is the nerve of the cover of balls of radius $\varepsilon$ around each point in a set. Since balls are convex and convex sets are contractible, the nerve theorem guarantees that the nerve captures the topology of the cover. The Čech complex is rarely computed in practice because of its computational complexity. The uniform ball radii imply an assumption of uniform sampling of the input, which rarely holds for real-world datasets. Non-uniform radii can also be used, as in the case of the alpha complex.
2. Alpha complex. The Voronoi diagram is the set of all Voronoi regions for the points in $S\subseteq Y$; this diagram is a closed cover of $Y$. The Delaunay complex is the nerve of the Voronoi diagram. The Voronoi cover and its nerve are fundamental geometric objects and have been studied extensively in computational geometry. Alpha complexes are constructed by first building the Delaunay complex. For each simplex of the Delaunay complex, we compute the minimum scale at which it enters the alpha complex; sorting the simplices by this minimum scale gives a partial order of simplices, from which the alpha complex at any scale $\varepsilon$ can then be formed. Efficient algorithms and software exist for computing Delaunay complexes, and in turn alpha complexes, in 2 and 3 dimensions. However, the construction of the Delaunay complex is difficult in higher dimensions.
3. Vietoris–Rips complex. The Vietoris–Rips complex $VR_\varepsilon$ contains a simplex for every finite set of points whose pairwise distances are all at most $\varepsilon$. Because this condition involves only pairs of points, the complex is determined by its 1-skeleton, which makes it easier to compute and store than the Čech complex.
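The Vietoris–Rips membership condition (all pairwise distances at most $\varepsilon$) can be checked directly from the distance matrix, so a naive construction is short. The following is a minimal sketch; the function name `vietoris_rips` is illustrative, and for simplicity it enumerates all candidate simplices, which is only feasible for small point clouds.

```python
import itertools
import math

def vietoris_rips(points, epsilon, max_dim=2):
    """Vietoris-Rips complex at scale epsilon, up to dimension max_dim.

    A k-simplex is included whenever all pairwise distances among its
    k+1 vertices are at most epsilon, so the complex is the flag
    (clique) complex of its 1-skeleton.
    """
    n = len(points)
    # Precompute which pairs are within distance epsilon.
    close = [[math.dist(points[i], points[j]) <= epsilon for j in range(n)]
             for i in range(n)]
    complex_ = {0: [(i,) for i in range(n)]}  # every point is a vertex
    for k in range(1, max_dim + 1):
        complex_[k] = [
            s for s in itertools.combinations(range(n), k + 1)
            if all(close[i][j] for i, j in itertools.combinations(s, 2))
        ]
    return complex_
```

On the four corners of a unit square, scale $\varepsilon = 1$ yields only the four side edges (the diagonals have length $\sqrt{2}$), while $\varepsilon = 1.5$ fills in all six edges and all four triangles.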

## Multiscale invariants

1. Multifiltration model. Morse theory enables one to analyze the topology of a manifold by studying differentiable functions on that manifold. According to the basic insights of Marston Morse, a typical differentiable function on a manifold will reflect the topology quite directly. Morse theory allows one to find CW structures and handle decompositions on manifolds and to obtain substantial information about their homology.
2. Persistent homology. See homology for an introduction to the notation.

Persistent homology essentially calculates homology groups at different spatial resolutions to see which features persist over a wide range of length scales. It is assumed that important features and structures are the ones that persist. We define persistent homology as follows: let $K^l$ be a filtration, and let $Z_k^l$ and $B_k^l$ denote the $k$th cycle and boundary groups of $K^l$. The $p$-persistent $k$th homology group of $K^l$ is $H_k^{l,p}=Z_k^l/(B_k^{l+p}\cap Z_k^l)$.

Let $z$ be a nonbounding $k$-cycle created at time $I$ by simplex $\sigma$, and let $z'\sim z$ be a homologous $k$-cycle that becomes a boundary cycle at time $J$ by simplex $\tau$. Then we can define the persistence interval associated to $z$ as $(I,J)$. We call $\sigma$ the creator of $z$ and $\tau$ the destroyer of $z$. If $z$ does not have a destroyer, its persistence is $\infty$.

Instead of an index-based filtration, we can use a time-based filtration. Let $K$ be a simplicial complex and $K^\rho=\{\sigma^i\in K\mid\rho(\sigma^i)\le \rho\}$ be a filtration defined via a map $\rho : S(K)\rightarrow \mathbb{R}$ that assigns a real number to each simplex of the final complex. Then for all real numbers $\pi \ge 0$, the $\pi$-persistent $k$th homology group of $K^\rho$ is $H_k^{\rho,\pi}=Z_k^\rho/(B_k^{\rho+\pi}\cap Z_k^\rho)$. The persistence of a $k$-cycle created at time $\rho_i$ and destroyed at time $\rho_j$ is $\rho_j-\rho_i$.[3]
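Creator–destroyer pairs can be computed by the standard column reduction of the boundary matrix over $\mathbb{Z}/2$: columns are processed in filtration order, and earlier columns are added until each column's lowest nonzero entry is unclaimed; a column that reduces to zero creates a cycle, and a column whose lowest entry is $i$ destroys the cycle created by simplex $i$. The following is a minimal sketch of that algorithm (the function name `persistence_pairs` is illustrative).

```python
import itertools
import math

def persistence_pairs(filtration):
    """Column reduction of the Z/2 boundary matrix.

    `filtration` is a list of (time, simplex) sorted so that every face
    precedes its cofaces; a simplex is a tuple of vertex ids. Returns
    (birth, death) intervals; cycles that are never destroyed get
    death = math.inf.
    """
    index = {s: j for j, (_, s) in enumerate(filtration)}
    # Each column holds the boundary of one simplex, as a set of row indices.
    cols = [set() if len(s) == 1 else
            {index[f] for f in itertools.combinations(s, len(s) - 1)}
            for _, s in filtration]
    pivot = {}        # lowest row index -> column that claims it
    creators = set()  # positive simplices: columns that reduced to zero
    pairs = []
    for j, col in enumerate(cols):
        # Add earlier columns (mod 2) while the lowest entry is claimed.
        while col and max(col) in pivot:
            col ^= cols[pivot[max(col)]]
        if col:
            i = max(col)          # simplex i created the cycle; j destroys it
            pivot[i] = j
            creators.discard(i)
            pairs.append((filtration[i][0], filtration[j][0]))
        else:
            creators.add(j)
    pairs.extend((filtration[j][0], math.inf) for j in sorted(creators))
    return pairs
```

For the filtration of a hollow triangle that is later filled in, the algorithm reports two finite degree-0 bars, one infinite degree-0 bar, and one finite degree-1 bar for the loop that exists between the appearance of the last edge and the filling triangle.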

There are various software packages for computing persistence intervals of a finite filtration, such as javaPlex, Dionysus, Perseus, and PHAT.