# Laplacian matrix

In the mathematical field of graph theory, the Laplacian matrix, sometimes called the admittance matrix, Kirchhoff matrix, or discrete Laplacian, is a matrix representation of a graph. Together with Kirchhoff's theorem, it can be used to calculate the number of spanning trees of a given graph. The Laplacian matrix can be used to find many other properties of the graph; see spectral graph theory. Cheeger's inequality from Riemannian geometry has a discrete analogue involving the Laplacian matrix; it is perhaps the most important theorem in spectral graph theory and one of the most useful facts in algorithmic applications, as it approximates the sparsest cut of a graph through the second eigenvalue of the Laplacian.

## Definition

Given a simple graph G with n vertices, its Laplacian matrix $L:=(\ell_{i,j})_{n \times n}$ is defined as:[1]

$L = D - A.$

That is, it is the difference of the degree matrix D and the adjacency matrix A of the graph. In the case of directed graphs, either the indegree or outdegree might be used, depending on the application.

It follows from the definition that:

$\ell_{i,j}:= \begin{cases} \deg(v_i) & \mbox{if}\ i = j \\ -1 & \mbox{if}\ i \neq j\ \mbox{and}\ v_i \mbox{ is adjacent to } v_j \\ 0 & \mbox{otherwise} \end{cases}$

where deg(v<sub>i</sub>) is the degree of the vertex v<sub>i</sub>.

The normalized Laplacian matrix is defined as:[1]

$\ell_{i,j}:= \begin{cases} 1 & \mbox{if}\ i = j\ \mbox{and}\ \deg(v_i) \neq 0\\ -\frac{1}{\sqrt{\deg(v_i)\deg(v_j)}} & \mbox{if}\ i \neq j\ \mbox{and}\ v_i \mbox{ is adjacent to } v_j \\ 0 & \mbox{otherwise}. \end{cases}$

## Example

Here is a simple example of a labeled graph and its Laplacian matrix.

The labeled-graph figure is not reproduced here; its edges (readable off the adjacency matrix, with vertices numbered 1 through 6) are {1,2}, {1,5}, {2,3}, {2,5}, {3,4}, {4,5}, {4,6}.

Degree matrix:

$\left(\begin{array}{rrrrrr} 2 & 0 & 0 & 0 & 0 & 0\\ 0 & 3 & 0 & 0 & 0 & 0\\ 0 & 0 & 2 & 0 & 0 & 0\\ 0 & 0 & 0 & 3 & 0 & 0\\ 0 & 0 & 0 & 0 & 3 & 0\\ 0 & 0 & 0 & 0 & 0 & 1\\ \end{array}\right)$

Adjacency matrix:

$\left(\begin{array}{rrrrrr} 0 & 1 & 0 & 0 & 1 & 0\\ 1 & 0 & 1 & 0 & 1 & 0\\ 0 & 1 & 0 & 1 & 0 & 0\\ 0 & 0 & 1 & 0 & 1 & 1\\ 1 & 1 & 0 & 1 & 0 & 0\\ 0 & 0 & 0 & 1 & 0 & 0\\ \end{array}\right)$

Laplacian matrix:

$\left(\begin{array}{rrrrrr} 2 & -1 & 0 & 0 & -1 & 0\\ -1 & 3 & -1 & 0 & -1 & 0\\ 0 & -1 & 2 & -1 & 0 & 0\\ 0 & 0 & -1 & 3 & -1 & -1\\ -1 & -1 & 0 & -1 & 3 & 0\\ 0 & 0 & 0 & -1 & 0 & 1\\ \end{array}\right)$
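The matrices in this example can be reproduced numerically. The sketch below (using NumPy, chosen here for illustration) builds the degree and Laplacian matrices from the adjacency matrix of the example graph:

```python
import numpy as np

# Adjacency matrix of the example graph (vertices 1..6 -> indices 0..5)
A = np.array([
    [0, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 0, 1, 0, 0],
])

D = np.diag(A.sum(axis=1))  # degree matrix: row sums of A on the diagonal
L = D - A                   # Laplacian matrix

print(L)
```

Note that every row of L sums to zero: the degree on the diagonal cancels the −1 entries for the neighbors.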

## Properties

For a graph G and its Laplacian matrix L with eigenvalues $\lambda_0 \le \lambda_1 \le \cdots \le \lambda_{n-1}$:

• L is always positive-semidefinite ($\forall i, \lambda_i \ge 0;\quad \lambda_0 = 0$).
• The number of times 0 appears as an eigenvalue in the Laplacian is the number of connected components in the graph.
• L is an M-matrix.
• $\lambda_0$ is always 0, because every Laplacian matrix has the eigenvector $\mathbf{v_0}=[1,1,\dots,1]$: in each row, the node's degree on the diagonal cancels the "-1" entries for its neighbors, so that $L \mathbf{v_0} = 0 .$
• The smallest non-zero eigenvalue of L is called the spectral gap.
• If we define an oriented incidence matrix M with element Mev for edge e (connecting vertices i and j, with i > j) and vertex v given by
$M_{ev} = \left\{ \begin{array}{rl}1, & \text{if}\,v=i\\-1, & \text{if}\,v=j\\0, & \text{otherwise},\end{array}\right.$

then the Laplacian matrix L satisfies

$L = M^\text{T} M\,,$

where $M^\text{T}$ is the matrix transpose of M.

• The Laplacian is an operator on the space of functions g defined on the vertices of G. When G is k-regular, the normalized Laplacian is
$\ell = I-\frac{1}{k} A,$

where A is the adjacency matrix of G and I is the identity matrix. All matrices here are n × n, where n is the number of vertices in G.
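These properties can be checked numerically on the example graph from the previous section. The following NumPy sketch verifies positive semidefiniteness, reads off the number of connected components from the multiplicity of the eigenvalue 0, and confirms $L = M^\text{T} M$ for an oriented incidence matrix (the edge orientation is the arbitrary i > j convention from above):

```python
import numpy as np

# Adjacency matrix of the example graph
A = np.array([
    [0, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 1, 0],
    [0, 1, 0, 1, 0, 0],
    [0, 0, 1, 0, 1, 1],
    [1, 1, 0, 1, 0, 0],
    [0, 0, 0, 1, 0, 0],
])
n = A.shape[0]
L = np.diag(A.sum(axis=1)) - A

# Positive semidefiniteness: eigvalsh returns eigenvalues in ascending order.
eigenvalues = np.linalg.eigvalsh(L)
assert eigenvalues[0] > -1e-9          # all eigenvalues are >= 0
assert abs(eigenvalues[0]) < 1e-9      # and the smallest is 0

# Multiplicity of the eigenvalue 0 = number of connected components
# (this graph is connected, so it is 1).
num_components = int(np.sum(np.abs(eigenvalues) < 1e-9))

# Oriented incidence matrix: one row per edge (i, j) with i > j.
edges = [(i, j) for i in range(n) for j in range(i) if A[i, j]]
M = np.zeros((len(edges), n))
for e, (i, j) in enumerate(edges):
    M[e, i], M[e, j] = 1, -1

assert np.allclose(M.T @ M, L)         # L = M^T M
```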

## Deformed Laplacian

The deformed Laplacian is commonly defined as

$\Delta(s)=I-sA+s^2(D-I)$

where I is the identity matrix, A is the adjacency matrix, D is the degree matrix, and s is a (complex-valued) number. Note that the standard Laplacian is just $\Delta(1)$.
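The reduction of $\Delta(s)$ to the standard Laplacian at s = 1 is easy to check numerically; a small NumPy sketch on a 4-cycle (the graph is chosen only for illustration):

```python
import numpy as np

# Adjacency matrix of the 4-cycle C4
A = np.array([
    [0, 1, 0, 1],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 0, 1, 0],
])
D = np.diag(A.sum(axis=1))
I = np.eye(A.shape[0])

def deformed_laplacian(s):
    """Delta(s) = I - s*A + s^2*(D - I)."""
    return I - s * A + s**2 * (D - I)

# At s = 1: I - A + D - I = D - A, the standard Laplacian.
assert np.allclose(deformed_laplacian(1), D - A)
# At s = 0 the deformed Laplacian is just the identity.
assert np.allclose(deformed_laplacian(0), I)
```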

## Symmetric normalized Laplacian

The symmetric normalized Laplacian (a.k.a. normalized Laplacian) is defined as

$L^{norm}:=I-D^{-1/2}AD^{-1/2}$
where A is the adjacency matrix and D is the degree matrix. Since the degree matrix D is diagonal, its reciprocal square root $D^{-1/2}$ is simply defined as a diagonal matrix, having diagonal entries which are the reciprocals of the positive square roots of the corresponding positive diagonal entries of D. The symmetric normalized Laplacian is a symmetric matrix.
Moreover,
$L^{norm} = S S^{*},$
where S is the matrix whose rows are indexed by the vertices and whose columns are indexed by the edges of G, such that each column corresponding to an edge e = {u, v} has an entry $\frac{1}{\sqrt{d_u}}$ in the row corresponding to u, an entry $-\frac{1}{\sqrt{d_v}}$ in the row corresponding to v, and 0 entries elsewhere. (Here $S^{*}$ denotes the transpose of S.)

The symmetric normalized Laplacian can also be written as

$L^{norm}=I-D^{-1/2}AD^{-1/2} = D^{-1/2} L D^{-1/2}.$

All eigenvalues of the normalized Laplacian are real and non-negative. In fact, if λ is an eigenvalue of $L^{norm}$, then 0 ≤ λ ≤ 2. These eigenvalues (known as the spectrum of the normalized Laplacian) relate well to other graph invariants for general graphs.[2]
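Both forms of the definition, and the eigenvalue bound 0 ≤ λ ≤ 2, can be verified numerically; a minimal NumPy sketch on a small path graph (chosen only for illustration, with all degrees positive):

```python
import numpy as np

# Adjacency matrix of the path graph 1-2-3-4 (degrees 1, 2, 2, 1)
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
deg = A.sum(axis=1)
I = np.eye(A.shape[0])
L = np.diag(deg) - A
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^{-1/2}, valid since deg > 0

# Symmetric normalized Laplacian, two equivalent forms:
L_norm = I - D_inv_sqrt @ A @ D_inv_sqrt
assert np.allclose(L_norm, D_inv_sqrt @ L @ D_inv_sqrt)
assert np.allclose(L_norm, L_norm.T)      # symmetric

# All eigenvalues lie in [0, 2].
eigs = np.linalg.eigvalsh(L_norm)
assert eigs[0] > -1e-9 and eigs[-1] < 2 + 1e-9
```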

Since $L^{norm}$ is symmetric, its eigenvalues are real and non-negative. They can be characterized in terms of the Rayleigh quotient of $L^{norm}$. Let g be an arbitrary function which assigns to each vertex v of G a real value g(v); we can treat g as a column vector. Then

$\frac{\langle g, L^{norm} g\rangle}{\langle g, g\rangle} = \frac{\langle g, D^{-1/2} L D^{-1/2} g\rangle}{\langle g, g\rangle} = \frac{\langle f, L f\rangle}{\langle D^{1/2} f, D^{1/2} f\rangle} = \frac{\sum_{u\sim v}(f(u) - f(v))^2}{\sum_{v} f(v)^2 d_v},$

where $g = D^{1/2} f$ and $\sum_{u\sim v}$ denotes the sum over all unordered pairs {u, v} for which u and v are adjacent. The quantity $\sum_{u\sim v}(f(u) - f(v))^2$ is called the Dirichlet sum of G, and the left-hand side of the equation is the Rayleigh quotient.

Denote the eigenvalues of $L^{norm}$ by $0 = \lambda_0\le\lambda_1\le\cdots\le\lambda_{n-1}$; the set of the $\lambda_i$ is usually called the spectrum of $L^{norm}$. Let $\mathbf{1}$ be the function which assumes the value 1 on each vertex. Then $D^{1/2}\mathbf{1}$ is an eigenfunction of $L^{norm}$ with eigenvalue 0.[3]
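The Rayleigh-quotient identity can be checked numerically for an arbitrary vector f. A NumPy sketch on a small path graph (writing D for the degree matrix; the graph and the random f are chosen only for illustration):

```python
import numpy as np

# Adjacency matrix of the path graph 1-2-3-4
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
deg = A.sum(axis=1)
n = A.shape[0]
L = np.diag(deg) - A
D_half = np.diag(np.sqrt(deg))
D_inv_half = np.diag(1.0 / np.sqrt(deg))
L_norm = D_inv_half @ L @ D_inv_half

rng = np.random.default_rng(0)
f = rng.standard_normal(n)   # arbitrary real function on the vertices
g = D_half @ f               # g = D^{1/2} f

# Left-hand side: Rayleigh quotient of L_norm at g.
rayleigh = (g @ L_norm @ g) / (g @ g)

# Right-hand side: Dirichlet sum over edges, divided by sum f(v)^2 d_v.
dirichlet = sum((f[u] - f[v]) ** 2
                for u in range(n) for v in range(u) if A[u, v])
rhs = dirichlet / (f ** 2 @ deg)

assert np.allclose(rayleigh, rhs)
```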

## Random walk normalized Laplacian

The random walk normalized Laplacian is defined as

$T:=D^{-1}A$

where A is the adjacency matrix and D is the degree matrix. Since the degree matrix D is diagonal, its inverse $D^{-1}$ is simply defined as a diagonal matrix, having diagonal entries which are the reciprocals of the corresponding positive diagonal entries of D. The name of this operator comes from the fact that, indeed, $T$ is the transition matrix of a standard random walk on the given graph.

One can check that

$T=D^{-\frac12} \left(I-L^{norm}\right) D^{\frac12}$,

i.e., $T$ is similar to a scalar shift of the normalized Laplacian $L^{norm}$. For this reason, even though $T$ is in general not Hermitian, it has real eigenvalues. Indeed, its eigenvalues agree with those of $L^{norm}$ (which is Hermitian) up to a reflection about the point $1/2$.
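Both facts, that $T$ is a stochastic transition matrix and that its eigenvalues mirror those of $L^{norm}$ about 1/2, can be verified numerically; a NumPy sketch on a small path graph (chosen only for illustration):

```python
import numpy as np

# Adjacency matrix of the path graph 1-2-3-4
A = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
])
deg = A.sum(axis=1)
n = A.shape[0]

# Random walk transition matrix T = D^{-1} A: each row is a
# probability distribution over the neighbors.
T = np.diag(1.0 / deg) @ A
assert np.allclose(T.sum(axis=1), 1.0)

# Symmetric normalized Laplacian.
D_inv_half = np.diag(1.0 / np.sqrt(deg))
L_norm = np.eye(n) - D_inv_half @ A @ D_inv_half

# Although T is not symmetric, its eigenvalues are real and satisfy
# mu = 1 - lambda for the eigenvalues lambda of L_norm.
mu = np.sort(np.real(np.linalg.eigvals(T)))
lam = np.sort(np.linalg.eigvalsh(L_norm))
assert np.allclose(mu, np.sort(1.0 - lam))
```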

## As a matrix representation of the negative discrete Laplace operator

The Laplacian matrix can be interpreted as a matrix representation of a particular case of the negative discrete Laplace operator. Such an interpretation allows one, e.g., to generalise the Laplacian matrix to the case of graphs with an infinite number of vertices and edges, leading to a Laplacian matrix of an infinite size.

## As an approximation to the negative continuous Laplacian

The graph Laplacian matrix can be further viewed as a matrix form of an approximation to the negative Laplacian operator obtained by the finite difference method[citation needed]. In this interpretation, every graph vertex is treated as a grid point; the local connectivity of the vertex determines the finite difference approximation stencil at this grid point, the grid size is always one for every edge, and there are no constraints on any grid points, which corresponds to the case of the homogeneous Neumann boundary condition, i.e., free boundary.
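For a path graph this correspondence is concrete: the graph Laplacian of a path coincides with the standard second-difference matrix for the negative 1-D Laplacian $-d^2/dx^2$ on a uniform grid of size 1, with the boundary rows modified as in the homogeneous Neumann (free-boundary) case. A small NumPy sketch (the path length is chosen only for illustration):

```python
import numpy as np

n = 5
# Adjacency matrix of the path graph on n vertices: 1-2-3-4-5
A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
L = np.diag(A.sum(axis=1)) - A

# Finite-difference matrix for -d^2/dx^2 with grid size h = 1:
# interior stencil (-1, 2, -1), Neumann (free) boundary rows (1, -1).
FD = np.zeros((n, n))
for i in range(n):
    FD[i, i] = 2.0
    if i > 0:
        FD[i, i - 1] = -1.0
    if i < n - 1:
        FD[i, i + 1] = -1.0
FD[0, 0] = FD[-1, -1] = 1.0  # Neumann boundary modification

assert np.allclose(L, FD)
```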