My master's thesis

Scalable solving of boundary element methods utilizing the Green cross approximation method and GPUs

This project is maintained by Bennet Carstensen

May 22, 2017 by Bennet Carstensen
\(\mathcal{H}^2\)-Matrices

As we already mentioned, matrices resulting from integral equations are typically non-sparse, so we aim for data-sparsity: we are interested in an alternative representation or approximation of the matrix that requires only a small number of coefficients. Hierarchical matrix methods [8], [9] achieve this by dividing the matrix into submatrices, approximating those that satisfy an admissibility condition and storing the remaining ones in standard two-dimensional arrays if they are small enough.

Definition (Tree) [10] Let \(V\) be a finite set, let \(r \in V\) and \(E \subseteq V \times V\). Define \(T := (V, r, E)\). We call \(T\) a tree if for each \(v \in V\) there is precisely one sequence \(v_0, \dots, v_\ell \in V\), \(\ell \in \mathbb{N}_0\), with \(v_0 = r\) and \(v_\ell = v\) such that

\[(v_{i-1}, v_i) \in E \quad \text{for all } i \in \{1, \dots, \ell\}.\]

We call \(V\) the nodes, \(r\) the root and \(E\) the edges of \(T\). Furthermore, we call

\[\operatorname{sons}(v) := \{ w \in V \,:\, (v, w) \in E \}\]

the set of sons of \(v\) and \(v\) the father of \(w \in \operatorname{sons}(v)\). The father is unique for each node except the root, which has no father. Hereinafter we are using \(v \in T\) as a synonym for \(v \in V\), e.g., we write \(\operatorname{sons}(v) \subseteq T\).

One might interpret a tree as a directed graph where each node represents a subset of an index set, e.g., the one resulting from the discretization of the Galerkin approximation. To divide the index set, and thus the matrix, into subsets, we consider a clustering method.

Definition (Cluster tree) Let \(I\) be a finite index set, let \(T = (V, r, E)\) be a tree and let a subset \(\hat{t} \subseteq I\) be given for each \(t \in T\).
We call \(T\) a cluster tree for \(I\) if the tree fulfills the following requirements:

  • The index set corresponding to the root is \(I\), i.e., \(\hat{r} = I\).
  • Given a non-leaf node, the index set consists of the union of the index sets of its sons, i.e., \(\hat{t} = \bigcup_{t' \in \operatorname{sons}(t)} \hat{t}'\) for all \(t \in T\) with \(\operatorname{sons}(t) \neq \emptyset\).
  • Given two different sons of a node, their index sets are disjoint, i.e., \(\hat{t}_1 \cap \hat{t}_2 = \emptyset\) for all \(t \in T\) and \(t_1, t_2 \in \operatorname{sons}(t)\) with \(t_1 \neq t_2\).

We denote a cluster tree for \(I\) by \(T_I\) and call its nodes clusters.
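The clustering idea can be sketched in a few lines. The following is a minimal illustration, with names of our own choosing rather than the thesis implementation, that builds a binary cluster tree by bisecting an index set and checks that the leaf labels partition it:

```python
# Illustrative sketch (not the thesis implementation): a binary cluster tree
# over an index set, built by repeated bisection.

class Cluster:
    """A node of a cluster tree, labeled with a subset of the index set."""
    def __init__(self, indices):
        self.indices = indices          # the label t^ ⊆ I
        self.sons = []                  # empty list ⇔ leaf

def build_cluster_tree(indices, leaf_size):
    """Split `indices` in half until a cluster holds at most `leaf_size` indices."""
    t = Cluster(indices)
    if len(indices) > leaf_size:
        mid = len(indices) // 2
        t.sons = [build_cluster_tree(indices[:mid], leaf_size),
                  build_cluster_tree(indices[mid:], leaf_size)]
    return t

def leaves(t):
    """Collect the leaf labels; they form a disjoint partition of the index set."""
    if not t.sons:
        return [t.indices]
    return [l for son in t.sons for l in leaves(son)]

root = build_cluster_tree(list(range(16)), leaf_size=4)
partition = leaves(root)
# The leaf labels are pairwise disjoint and their union is the full index set.
assert sorted(i for part in partition for i in part) == list(range(16))
```

Each non-leaf label is the union of its sons' labels and different sons get disjoint labels, so the three requirements above hold by construction.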

This way, we can represent submatrices of a matrix \(G \in \mathbb{R}^{I \times J}\), with index sets \(I\) and \(J\), by pairs of clusters \((t, s)\) chosen from two cluster trees \(T_I\) and \(T_J\). To find suitable submatrices to approximate, we introduce another tree structure.

Definition (Block tree) Let \(T_I\) and \(T_J\) be cluster trees for index sets \(I\) and \(J\). We call a tree \(T\) a block tree for \(T_I\) and \(T_J\) if the tree fulfills the following requirements:

  • The nodes are pairs of clusters, i.e., for each \(b \in T\) there are \(t \in T_I\) and \(s \in T_J\) with \(b = (t, s)\).
  • The root consists of the roots of \(T_I\) and \(T_J\), i.e., \(r = (r_I, r_J)\).
  • If \(b = (t, s) \in T\) has sons, one can form them by the sons of \(t\) and \(s\), i.e., \(\operatorname{sons}(b) = \operatorname{sons}(t) \times \operatorname{sons}(s)\).

We call the nodes of a block tree \(T\) for cluster trees \(T_I\) and \(T_J\) blocks and denote the set of leaves by \(\mathcal{L}_{I \times J}\).

A block tree represents a cluster tree for the Cartesian product index set \(I \times J\): each block \(b = (t, s)\) carries the label \(\hat{b} = \hat{t} \times \hat{s}\), and we therefore denote a block tree by \(T_{I \times J}\).

For any cluster tree, the labels of the leaves form a disjoint partition of the index set, as one can show by induction. The leaves of a block tree in particular form the disjoint partition

\[\{ \hat{t} \times \hat{s} \,:\, (t, s) \in \mathcal{L}_{I \times J} \}\]

of the index set \(I \times J\), e.g., the one resulting from the discretization of the Galerkin approximation.

Since we won’t be able to approximate each block, we define an admissibility condition

\[\operatorname{adm} \colon T_I \times T_J \to \{\text{true}, \text{false}\},\]

which indicates whether we approximate a submatrix \(G|_{\hat{t} \times \hat{s}}\). We call a block \(b = (t, s)\) admissible if \(\operatorname{adm}(t, s) = \text{true}\) holds.

Definition (Admissibility condition) Let \(T_I\) and \(T_J\) be cluster trees for index sets \(I\) and \(J\). We call a mapping

\[\operatorname{adm} \colon T_I \times T_J \to \{\text{true}, \text{false}\}\]

an admissibility condition for \(T_I\) and \(T_J\) if admissibility is inherited by sons, i.e., \(\operatorname{adm}(t, s) = \text{true}\) implies \(\operatorname{adm}(t', s) = \text{true}\) and \(\operatorname{adm}(t, s') = \text{true}\) for all \(t' \in \operatorname{sons}(t)\) and \(s' \in \operatorname{sons}(s)\).
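A concrete choice used in many BEM applications is the standard admissibility condition \(\max\{\operatorname{diam}(t), \operatorname{diam}(s)\} \leq \eta \operatorname{dist}(t, s)\) on bounding boxes of the clusters. The following sketch, with hypothetical names of our own, illustrates it for one-dimensional boxes; note that shrinking a cluster can only shrink the diameter and grow the distance, so admissibility is inherited by sons:

```python
# Hedged sketch of the standard admissibility condition for clusters with
# one-dimensional bounding boxes [a, b] (our names, not the thesis code):
#   adm(t, s) = true  ⇔  max{diam(t), diam(s)} ≤ η · dist(t, s)

def diam(box):
    a, b = box
    return b - a

def dist(box_t, box_s):
    (a, b), (c, d) = box_t, box_s
    return max(0.0, max(a, c) - min(b, d))   # gap between the two intervals

def adm(box_t, box_s, eta=1.0):
    return max(diam(box_t), diam(box_s)) <= eta * dist(box_t, box_s)

# Two well-separated clusters are admissible ...
print(adm((0.0, 1.0), (3.0, 4.0)))   # True
# ... while touching clusters are not.
print(adm((0.0, 1.0), (1.0, 2.0)))   # False
```

The parameter \(\eta\) trades approximation quality against the number of admissible blocks; larger \(\eta\) admits more blocks but typically requires a higher rank.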

Since the leaves of a block tree form a disjoint partition of the index set \(I \times J\), they give rise to a decomposition of a matrix \(G \in \mathbb{R}^{I \times J}\) into submatrices: we expect the leaves to correspond either to the submatrices we approximate, i.e., the admissible ones, or to those we represent as arrays of coefficients since they are small enough.

Definition (Admissible and inadmissible leaves) Let \(T_{I \times J}\) be a block tree for cluster trees \(T_I\) and \(T_J\) and let \(\operatorname{adm}\) be an admissibility condition. We split the set of leaves \(\mathcal{L}_{I \times J}\) of \(T_{I \times J}\) into the sets

\[\mathcal{L}^+_{I \times J} := \{ b \in \mathcal{L}_{I \times J} \,:\, \operatorname{adm}(b) = \text{true} \}\]

of admissible and

\[\mathcal{L}^-_{I \times J} := \mathcal{L}_{I \times J} \setminus \mathcal{L}^+_{I \times J}\]

inadmissible leaves.

Definition (Admissible block tree) Let \(T_{I \times J}\) be a block tree, let \(\mathcal{L}^+_{I \times J}\) and \(\mathcal{L}^-_{I \times J}\) denote the admissible and inadmissible leaves of \(T_{I \times J}\).

We call a block tree admissible if

\[(t, s) \in \mathcal{L}^-_{I \times J} \implies \operatorname{sons}(t) = \emptyset \ \text{or} \ \operatorname{sons}(s) = \emptyset,\]

and we call it strictly admissible if

\[(t, s) \in \mathcal{L}^-_{I \times J} \implies \operatorname{sons}(t) = \emptyset \ \text{and} \ \operatorname{sons}(s) = \emptyset.\]

With cluster trees \(T_I\), \(T_J\) and an admissibility condition given, we can construct a (strictly) admissible block tree by checking the admissibility condition for the root \((r_I, r_J)\). We have finished our construction if the condition holds true. Otherwise, we check the sons and further descendants recursively.
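A minimal sketch of this recursion, with names and simplifications of our own rather than the thesis code, could look as follows; it returns the leaves of a strictly admissible block tree for two interval-based cluster trees:

```python
# Illustrative sketch (our names): constructing a strictly admissible block
# tree by checking the admissibility condition at the root and descending to
# the sons wherever it fails.

class Cluster:
    def __init__(self, box, sons=()):
        self.box = box                      # 1D bounding box (a, b)
        self.sons = list(sons)

def adm(bt, bs, eta=1.0):
    """Standard condition max{diam} ≤ η·dist for interval bounding boxes."""
    diam = max(bt[1] - bt[0], bs[1] - bs[0])
    dist = max(0.0, max(bt[0], bs[0]) - min(bt[1], bs[1]))
    return diam <= eta * dist

def build_block_tree(t, s):
    """Return the leaves of the block tree as (t, s, admissible) triples."""
    if adm(t.box, s.box):
        return [(t, s, True)]               # approximate this block later
    if not t.sons and not s.sons:
        return [(t, s, False)]              # store this small block densely
    return [leaf
            for t1 in (t.sons or [t])       # a leaf cluster stands in for itself
            for s1 in (s.sons or [s])
            for leaf in build_block_tree(t1, s1)]

# A two-level cluster tree over [0, 4), paired with itself:
tree = Cluster((0.0, 4.0),
               [Cluster((0.0, 2.0), [Cluster((0.0, 1.0)), Cluster((1.0, 2.0))]),
                Cluster((2.0, 4.0), [Cluster((2.0, 3.0)), Cluster((3.0, 4.0))])])
blocks = build_block_tree(tree, tree)
```

Inadmissible leaves are only emitted when both clusters are leaves, so the resulting block tree is strictly admissible in the sense defined above.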

Constructing an approximation of \(G \in \mathbb{R}^{I \times J}\) now means defining approximations of \(G|_{\hat{t} \times \hat{s}}\) for all \((t, s) \in \mathcal{L}^+_{I \times J}\).

Definition (Hierarchical matrix) Let \(T_{I \times J}\) be a block tree for cluster trees \(T_I\), \(T_J\), and let \(\mathcal{L}^+_{I \times J}\) denote the admissible leaves of \(T_{I \times J}\).

We call a matrix \(G \in \mathbb{R}^{I \times J}\) a hierarchical matrix (or \(\mathcal{H}\)-matrix) of local rank \(k\), if for each \(b = (t, s) \in \mathcal{L}^+_{I \times J}\) we find \(A_b \in \mathbb{R}^{\hat{t} \times k}\) and \(B_b \in \mathbb{R}^{\hat{s} \times k}\) such that

\[G|_{\hat{t} \times \hat{s}} = A_b B_b^*,\]

where \(B_b^*\) denotes the transpose of \(B_b\).

Approximating the admissible submatrices in typical applications reduces the storage requirements of a hierarchical matrix to \(\mathcal{O}(n k \log n)\), with \(n = \max\{|I|, |J|\}\) and the local rank \(k\) [9].
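The source of the saving is easy to check by counting coefficients: an admissible \(m \times m\) block of rank \(k\) is stored as two factors with \(mk\) entries each instead of \(m^2\) entries. A toy example with sizes we picked for illustration:

```python
# Counting coefficients for one admissible block (toy numbers, not thesis data).
m, k = 1024, 16
dense   = m * m                   # entries of the full block
lowrank = 2 * m * k               # entries of A ∈ R^{m×k} and B ∈ R^{m×k}
ratio   = dense // lowrank
print(dense, lowrank, ratio)      # 1048576 32768 32
```

Summing this saving over all admissible blocks of the partition yields the \(\mathcal{O}(n k \log n)\) bound, the logarithmic factor counting the levels of the block tree.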

We can avoid the logarithmic factor by refining the representation: we choose sets of basis vectors for all clusters \(t \in T_I\) and represent the admissible blocks by these basis vectors.

Definition (Cluster index) We call a family \(K = (K_t)_{t \in T_I}\) of finite index sets a cluster index for \(T_I\).

Definition (Cluster basis) Let \(K = (K_t)_{t \in T_I}\) be a cluster index for \(T_I\). We call a family \(V = (V_t)_{t \in T_I}\) of matrices a cluster basis of rank \(K\) if

  • \(V_t \in \mathbb{R}^{\hat{t} \times K_t}\) holds for all \(t \in T_I\) and
  • there is a matrix \(E_{t'} \in \mathbb{R}^{K_{t'} \times K_t}\) for all \(t \in T_I\) and all \(t' \in \operatorname{sons}(t)\) satisfying \(V_t|_{\hat{t}' \times K_t} = V_{t'} E_{t'}\).

In this case, we call the matrices \(V_t\) basis matrices and the matrices \(E_{t'}\) transfer matrices.

Definition (Nested representation) Let \(V = (V_t)_{t \in T_I}\) be a cluster basis of rank \(K\) and let \(E = (E_t)_{t \in T_I}\) be a family of transfer matrices satisfying \(V_t|_{\hat{t}' \times K_t} = V_{t'} E_{t'}\) for all \(t \in T_I\) and \(t' \in \operatorname{sons}(t)\).

We call \(((V_t)_{t \in \mathcal{L}_I}, (E_t)_{t \in T_I})\), where \(\mathcal{L}_I\) denotes the set of leaves of \(T_I\), a nested representation of the cluster basis \(V\).

In a nested representation, we give the basis matrices \(V_t\) of the leaves \(t \in \mathcal{L}_I\) explicitly and all other basis matrices implicitly by the transfer matrices.
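The following sketch, with toy matrices of our own choosing, illustrates the nested representation for a father with two leaf sons: the father's basis matrix is never stored, yet products with \(V_t^*\) can be evaluated from the sons' basis matrices and the transfer matrices alone:

```python
import numpy as np

# Toy nested representation (our sizes): a father t with two leaf sons t1, t2.
# The defining property is V_t|_{t1^ × K_t} = V_{t1} E_1 and likewise for t2.

rng = np.random.default_rng(0)
k = 3                                   # |K_t| = |K_{t1}| = |K_{t2}| = k
V1 = rng.standard_normal((8, k))        # explicit basis matrix of leaf son t1
V2 = rng.standard_normal((8, k))        # explicit basis matrix of leaf son t2
E1 = rng.standard_normal((k, k))        # transfer matrix of t1
E2 = rng.standard_normal((k, k))        # transfer matrix of t2

# The father's basis matrix is never stored — it can be recovered on demand:
Vt = np.vstack([V1 @ E1, V2 @ E2])

# Computing x^ = V_t^* x therefore needs only the sons' small coefficients:
x = rng.standard_normal(16)
xhat = E1.T @ (V1.T @ x[:8]) + E2.T @ (V2.T @ x[8:])
assert np.allclose(xhat, Vt.T @ x)
```

Applied recursively over the whole cluster tree, this bottom-up accumulation is exactly why the nested representation avoids storing a basis matrix per cluster.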

Definition (\(\mathcal{H}^2\)-matrix) Let \(G \in \mathbb{R}^{I \times J}\) be a hierarchical matrix for nested cluster bases \(V = (V_t)_{t \in T_I}\) of rank \(K\) and \(W = (W_s)_{s \in T_J}\) of rank \(L\). If for each \(b = (t, s) \in \mathcal{L}^+_{I \times J}\) we can find \(S_b \in \mathbb{R}^{K_t \times L_s}\) such that

\[G|_{\hat{t} \times \hat{s}} = V_t S_b W_s^*,\]

we call \(G\) an \(\mathcal{H}^2\)-matrix. We call the matrices \(S_b\) coupling matrices.

Representing all admissible submatrices and the cluster bases by the coupling and transfer matrices, respectively, we can reduce the storage requirements of an \(\mathcal{H}^2\)-matrix to \(\mathcal{O}(n k)\), where \(n = \max\{|I|, |J|\}\) and \(k = \max\{\max_{t \in T_I} |K_t|, \max_{s \in T_J} |L_s|\}\).
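To illustrate why the coupling matrices are cheap, again with toy matrices and sizes we assume for the example: an admissible block contributes only the small matrix \(S_b\), while \(V_t\) and \(W_s\) are shared with every other block using the same row or column cluster, and matrix-vector products with the block never form it explicitly:

```python
import numpy as np

# Toy H²-block (our sizes): G|_{t×s} = V_t S_b W_s^* stored via S_b only.

rng = np.random.default_rng(1)
kt, ls = 4, 5
Vt = rng.standard_normal((64, kt))      # row cluster basis (shared)
Ws = rng.standard_normal((48, ls))      # column cluster basis (shared)
Sb = rng.standard_normal((kt, ls))      # coupling matrix: only k_t·l_s numbers

x = rng.standard_normal(48)
# Evaluating G|_{t×s} x = V_t (S_b (W_s^* x)) costs three thin products ...
y = Vt @ (Sb @ (Ws.T @ x))
# ... and agrees with forming the 64×48 block explicitly:
assert np.allclose(y, (Vt @ Sb @ Ws.T) @ x)
```

In a full \(\mathcal{H}^2\) matrix-vector product, the coefficients \(W_s^* x\) and the partial results \(V_t \cdot (\dots)\) are computed once per cluster via the transfer matrices, which is what brings the total cost down to \(\mathcal{O}(n k)\).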