# Eigenvector centrality

In graph theory, eigenvector centrality (also called eigencentrality) is a measure of the influence of a node in a network. Relative scores are assigned to all nodes in the network based on the concept that connections to high-scoring nodes contribute more to the score of the node in question than equal connections to low-scoring nodes. A high eigenvector score means that a node is connected to many nodes that themselves have high scores.

Google's PageRank and the Katz centrality are variants of the eigenvector centrality.[1]

## Using the adjacency matrix to find eigenvector centrality

For a given graph ${\displaystyle G:=(V,E)}$ with ${\displaystyle |V|}$ vertices let ${\displaystyle A=(a_{v,t})}$ be the adjacency matrix, i.e. ${\displaystyle a_{v,t}=1}$ if vertex ${\displaystyle v}$ is linked to vertex ${\displaystyle t}$, and ${\displaystyle a_{v,t}=0}$ otherwise. The relative centrality score of vertex ${\displaystyle v}$ can be defined as:

${\displaystyle x_{v}={\frac {1}{\lambda }}\sum _{t\in M(v)}x_{t}={\frac {1}{\lambda }}\sum _{t\in G}a_{v,t}x_{t}}$

where ${\displaystyle M(v)}$ is the set of neighbors of ${\displaystyle v}$ and ${\displaystyle \lambda }$ is a constant. With a small rearrangement this can be rewritten in vector notation as the eigenvector equation

${\displaystyle \mathbf {Ax} =\lambda \mathbf {x} }$
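As a quick numerical illustration of this eigenvector equation (a sketch assuming NumPy; the 3-vertex path graph is an arbitrary example, not from the source), the defining sum over neighbors is exactly the matrix–vector product ${\displaystyle \mathbf {Ax}}$:

```python
import numpy as np

# Path graph 0-1-2: vertex 1 is adjacent to both other vertices.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)

# For this graph, lambda = sqrt(2) with eigenvector proportional
# to (1, sqrt(2), 1): each entry is 1/lambda times the sum of its
# neighbours' entries.
lam = np.sqrt(2)
x = np.array([1.0, np.sqrt(2), 1.0])
print(np.allclose(A @ x, lam * x))  # True
```

The middle vertex receives the largest score, matching the intuition that it is connected to every other vertex in the path.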

In general, there will be many different eigenvalues ${\displaystyle \lambda }$ for which a non-zero eigenvector solution exists. However, the additional requirement that all the entries in the eigenvector be non-negative implies (by the Perron–Frobenius theorem) that only the greatest eigenvalue results in the desired centrality measure.[2] The ${\displaystyle v^{\text{th}}}$ component of the associated eigenvector then gives the relative centrality score of the vertex ${\displaystyle v}$ in the network.

The eigenvector is only defined up to a common factor, so only the ratios of the centralities of the vertices are well defined. To define an absolute score, one must normalise the eigenvector, e.g. such that the sum over all vertices is 1 or the total number of vertices n.

Power iteration is one of many eigenvalue algorithms that may be used to find this dominant eigenvector.[1] Furthermore, this can be generalized so that the entries in A can be real numbers representing connection strengths, as in a stochastic matrix.
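A minimal sketch of computing eigenvector centrality by power iteration, assuming NumPy (the function name, tolerances, and the 4-vertex example graph are illustrative choices, not from the source):

```python
import numpy as np

def eigenvector_centrality(A, max_iter=1000, tol=1e-10):
    """Centrality scores from the dominant eigenvector of A, via power iteration."""
    n = A.shape[0]
    x = np.ones(n) / n                         # uniform, strictly positive start vector
    for _ in range(max_iter):
        x_new = A @ x                          # one power-iteration step: x <- Ax
        x_new = x_new / np.linalg.norm(x_new)  # renormalise to prevent overflow
        if np.linalg.norm(x_new - x) < tol:    # stop once the direction has converged
            break
        x = x_new
    return x_new / x_new.sum()                 # normalise so the scores sum to 1

# Undirected graph on vertices {0, 1, 2, 3} with edges 0-1, 0-2, 1-2, 2-3.
# Vertex 2 has the most neighbours and should score highest.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
scores = eigenvector_centrality(A)
```

Because the start vector is strictly positive, it has a non-zero component along the Perron vector, so the iteration converges to the dominant eigenvector rather than to one belonging to a smaller eigenvalue.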

## Applications

In neuroscience, the eigenvector centrality of a neuron in a model neural network has been found to correlate with its relative firing rate.[3]