**Algebraic Graph Theory: Adjacency Matrix and Spectrum**

This tutorial will introduce the adjacency matrix, as well as spectral graph theory. For those familiar with Linear Algebra, the spectrum of a matrix denotes its eigenvalues and their algebraic multiplicities. Spectral Graph Theory deals with the eigenvalues and eigenvectors associated with the Adjacency and Laplacian matrices. Note that the Laplacian matrix will not be explored in this tutorial.

Below is a typeset copy of this tutorial:

Adjacency_Matrix.pdf


**Adjacency Matrix**

The adjacency matrix describes the adjacency relation between pairs of vertices; that is, whether or not there exists an edge incident to both vertices. Let A denote the adjacency matrix of a graph. If A describes a simple graph, then A_{ij} = 1 if vertices v_{i} and v_{j} are adjacent; otherwise, A_{ij} = 0. Since a simple graph has no loops, the diagonal entries of A are all 0. If the simple graph is undirected, then the matrix is symmetric; that is, v_{i} is adjacent to v_{j} exactly when v_{j} is adjacent to v_{i}. If the graph is directed, then the matrix is symmetric only when every arc from v_{i} to v_{j} is matched by an arc from v_{j} back to v_{i}.

Weighted graphs can also be described using cost matrices. A cost matrix is defined in the same way as an adjacency matrix, except that instead of an indicator of adjacency, each cell holds a numeric value giving the weight of the corresponding edge. A cell containing 0 indicates that there is no edge connecting the two vertices.
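As a quick illustration, the definition above translates directly into code. Below is a minimal sketch in plain Python; the helper name `adjacency_matrix` is my own, not part of the tutorial.

```python
def adjacency_matrix(n, edges):
    """Return the n x n adjacency matrix of a simple undirected graph."""
    A = [[0] * n for _ in range(n)]
    for i, j in edges:
        A[i][j] = 1
        A[j][i] = 1  # undirected: the matrix is symmetric
    return A

# C_4, the cycle on four vertices: 0 - 1 - 2 - 3 - 0
A = adjacency_matrix(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
for row in A:
    print(row)
```

Note that every diagonal entry stays 0, since a simple graph has no loops, and the matrix comes out symmetric, matching the discussion above.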

Let's now consider some examples.

The adjacency matrix has an application in counting the number of walks of a given length between two vertices. A walk is a sequence alternating between vertices and edges, starting and ending with vertices, where each edge in the sequence is incident to the vertices on either side of it. Consider some examples based on the graph K_{5}.

Some examples of walks in K_{5} include:

- v_{1} - v_{2} - v_{5}, which is a walk of length 2
- v_{2} - v_{4} - v_{2}, which is also a walk of length 2; it is a closed walk, since it starts and ends at the same vertex

Now let's explore how the adjacency matrix can be used to count the number of walks of length n in a graph. This is done through matrix exponentiation. That is, let G be a graph and let A be the corresponding adjacency matrix. Then (A^{n})_{ij} counts the number of walks of length n between vertices v_{i} and v_{j} in G.

Consider again the example of K_{5}, counting the 2-walks. Let A be the adjacency matrix of K_{5}, with A^{2} shown below.

From A^{2}, it can be seen that between any two distinct vertices of K_{5} there are three walks of length 2, and at each vertex there are four closed walks of length 2.

Let's examine what exactly is happening. Consider the case when the 2-walk is not closed. For the purpose of illustrating the concept, consider the 2-walks between vertices v_{1} and v_{2}. Since K_{5} is a simple graph, there are no loops or multiple edges between two vertices. Thus, there are only three vertices (v_{3}, v_{4}, and v_{5}) that can serve as the middle vertex of a walk from v_{1} to v_{2} without violating the length requirement.
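These counts are easy to verify by squaring the adjacency matrix directly. The sketch below is plain Python; the helper names `mat_mult` and `mat_power` are my own, not from the tutorial.

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][x] * B[x][j] for x in range(n)) for j in range(n)]
            for i in range(n)]

def mat_power(A, n):
    """Compute A^n by repeated multiplication (n >= 1)."""
    result = A
    for _ in range(n - 1):
        result = mat_mult(result, A)
    return result

# K_5: every pair of distinct vertices is adjacent
K5 = [[0 if i == j else 1 for j in range(5)] for i in range(5)]
A2 = mat_power(K5, 2)
print(A2[0][1])  # 3: walks of length 2 between two distinct vertices
print(A2[0][0])  # 4: closed walks of length 2 at a vertex
```

The off-diagonal entries are all 3 (one walk through each of the three intermediate vertices), and the diagonal entries are all 4 (one out-and-back walk along each of the four incident edges).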

**Proof that A^{n} counts the number of n-walks in a graph**

The proof is by induction on n, a natural number. Consider the base case of n = 1, so A^{1} = A. By definition, if vertices v_{i} and v_{j} are adjacent, then A_{ij} = 1; otherwise, A_{ij} = 0. Thus, A counts the number of walks of length 1 in the graph.

The theorem will be assumed true up to an arbitrary integer k, and proven for the k+1 case. By the definition of matrix multiplication, A^{k+1} = A^{k} * A. By the inductive hypothesis, A^{k} counts the number of k-walks in the graph, and A counts the number of walks of length 1 in the graph. Consider (A^{k+1})_{ij} = \sum_{x} (A^{k})_{ix} * A_{xj}, where the sum runs over all vertices v_{x} of the graph. For each x, if (A^{k})_{ix} is non-zero, then there exists at least one walk of length k from vertex v_{i} to v_{x}. Similarly, if A_{xj} = 1, then there exists an edge connecting vertices v_{x} and v_{j}. If both values are non-zero, then each of the (A^{k})_{ix} walks of length k from v_{i} to v_{x} extends along that edge to a walk of length k+1 from v_{i} to v_{j}. Summing over all intermediate vertices v_{x} counts every walk of length k+1 exactly once. Thus, the k+1 case has been shown.

Thus, by the principle of mathematical induction, A^{n} counts the number of n-walks between two vertices in a graph.

**Spectral Graph Theory**

This section will briefly introduce the topic of Spectral Graph Theory. At this point, it is assumed that the reader is familiar with eigenvalues, eigenvectors, and matrix diagonalization. The spectrum of a graph provides numerous insights into graph properties that are otherwise difficult to ascertain and prove. This section will discuss some of those properties.

**Trace of a Graph**

The first property to explore deals with the trace of the adjacency matrix. The trace of a matrix M, denoted tr(M), is defined as the sum of the diagonal entries of the matrix. Consider the adjacency matrices shown above, all of which describe simple graphs. It is easy to see that their traces are 0, as the diagonal entries are all 0. In particular, tr(A(K_{5})) = 0.

The trace of a matrix can also be written as the sum of the eigenvalues of the matrix. Consider a diagonalizable matrix A, written as A = QDQ^{-1} (through diagonalization). The trace has the property that cyclically permuting a product of matrices does not change the result. So tr(A) = tr(QDQ^{-1}) = tr(Q^{-1}QD) = tr(DQ^{-1}Q). It follows, since QQ^{-1} = Q^{-1}Q = I, that tr(A) = tr(D), which means that tr(A) is equal to the sum of the eigenvalues of A.
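The cyclic-permutation argument can be checked on a small concrete example. The 2x2 matrices below are my own illustration, not from the tutorial: Q is invertible, D carries the eigenvalues 2 and 3, and A = QDQ^{-1} has the same trace as D.

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][x] * B[x][j] for x in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    """Sum of the diagonal entries of a square matrix."""
    return sum(M[i][i] for i in range(len(M)))

Q = [[1, 1], [0, 1]]
Q_inv = [[1, -1], [0, 1]]            # the inverse of Q
D = [[2, 0], [0, 3]]                 # eigenvalues 2 and 3 on the diagonal
A = mat_mult(mat_mult(Q, D), Q_inv)  # A = Q D Q^{-1}
print(trace(A), trace(D))  # both equal 5 = 2 + 3
```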

This property, that tr(A) = \sum_{i=1}^{n} \lambda_{i}, where the \lambda_{i} are the eigenvalues of A, is useful in counting edges and triangles in a graph. Consider the Handshake Lemma, which states that the sum of the vertex degrees is equal to twice the number of edges (\sum_{i} deg(v_{i}) = 2|E|). The trace-eigenvalue property provides a combinatorial proof of the Handshake Lemma. Consider A^{2}, the square of the adjacency matrix of a given simple graph. The diagonal entries of A^{2} count the closed 2-walks of the graph. The only way for a 2-walk on a simple graph to be closed is, starting at v_{i}, to go to an adjacent vertex v_{j} and then return to v_{i}. Since v_{i} is adjacent to v_{j} if and only if v_{j} is adjacent to v_{i}, each closed 2-walk starting at v_{i} corresponds to an edge incident to v_{i}; that is, (A^{2})_{ii} = deg(v_{i}). Consequently, the diagonal entry of A^{2} corresponding to v_{j} counts that same edge incident to v_{i} and v_{j}, so every edge is counted twice. Thus, tr(A^{2}) = tr(D^{2}) = \sum_{i=1}^{n} \lambda_{i}^{2} = 2|E|. In other words, if \lambda_{i} is an eigenvalue of A, then \lambda_{i}^{2} is an eigenvalue of A^{2}, and the sum of all the \lambda_{i}^{2} counts twice the number of edges in the graph.
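The identity tr(A^{2}) = 2|E| is easy to check numerically without computing any eigenvalues. A minimal plain-Python sketch (the helper names are my own):

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][x] * B[x][j] for x in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    """Sum of the diagonal entries of a square matrix."""
    return sum(M[i][i] for i in range(len(M)))

# K_5: every pair of distinct vertices is adjacent, so |E| = 10
K5 = [[0 if i == j else 1 for j in range(5)] for i in range(5)]
A2 = mat_mult(K5, K5)
print(trace(A2))  # 20 = 2 * |E|
```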

Through a similar argument, it is possible to count the number of triangles in a graph. Consider A^{3}, the cube of the adjacency matrix. Its diagonal entries count the closed 3-walks of the graph; on a simple graph, these are exactly the triangles (copies of C_{3}). Thus, tr(A^{3}) = \sum_{i=1}^{n} \lambda_{i}^{3} = 6t, where t is the number of triangles. Here, each triangle is counted six times: once for each of its three possible starting vertices, in each of the two directions of travel.
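The triangle count can be verified the same way as the edge count. In K_{5}, every set of three vertices forms a triangle, so there are C(5,3) = 10 triangles; the sketch below (helper names my own) recovers that from tr(A^{3}).

```python
def mat_mult(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][x] * B[x][j] for x in range(n)) for j in range(n)]
            for i in range(n)]

def trace(M):
    """Sum of the diagonal entries of a square matrix."""
    return sum(M[i][i] for i in range(len(M)))

# K_5: every pair of distinct vertices is adjacent
K5 = [[0 if i == j else 1 for j in range(5)] for i in range(5)]
A3 = mat_mult(mat_mult(K5, K5), K5)
print(trace(A3))       # 60 closed 3-walks
print(trace(A3) // 6)  # 10 triangles, since each is counted six times
```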

It should be noted that the trace-eigenvalue property does scale, but with a nuance. Certainly tr(A^{n}) for n > 3 counts the number of closed n-walks in the graph. However, there are multiple kinds of closed n-walks for n > 3 (a closed 4-walk, for instance, may traverse a 4-cycle or simply go back and forth along an edge), whereas a closed walk of length 2 or 3 on a simple graph arises from exactly one kind of structure.

**Characteristic Polynomial**

Just as the trace of the adjacency matrix provides tools for examining properties of the graph, the characteristic polynomial in and of itself provides the same kind of information. Consider the characteristic polynomial of a graph, f(x) = det(xI - A) = x^{n} + \sum_{i=1}^{n} c_{i} x^{n-i}. From the calculation of the determinant, the coefficients c_{i} are, up to sign, sums of principal minors of the matrix A. More exactingly, c_{i} = (-1)^{i} * \sum_{j} p_{j}, where p_{j} is the jth principal minor of A whose corresponding principal submatrix has dimension i x i. A principal minor is the determinant of a principal submatrix, which is the square submatrix obtained by deleting the same set of rows and columns from the parent matrix, so that every diagonal entry of the submatrix lies on the diagonal of the parent matrix.

Let's explore exactly how the principal minors behave with respect to adjacency matrices (of simple graphs). On the surface, it looks like a lot of linear algebra without a lot of graph theory. While the linear algebra is definitely there, the graph theory is underneath the surface. Consider an example with the graph C_{4}. Below is C_{4}, along with its adjacency matrix and distinct 2x2 principal submatrices.

It is easy to see the 1x1 principal minors are all 0, as the diagonal entries are 0. So clearly c_{1} = -tr(A) = 0 for simple graphs. Now consider the 2x2 principal minors of C_{4}. There are six 2x2 principal submatrices, with the distinct ones listed above. The first 2x2 principal submatrix indicates an adjacency. Let's examine why this is. This submatrix first appears when keeping rows and columns 1 and 2. Thus, from the image, it is easy to see that there is an edge between vertices v_{1} and v_{2}. The given principal submatrix is also the adjacency matrix of P_{2}, the path graph on two vertices, or an edge with its two endpoints; its determinant is -1. If two vertices are not adjacent, the corresponding 2x2 principal submatrix has 0's off the diagonal and its determinant is 0. Thus, the sum of the 2x2 principal minors is -|E(G)|, and applying the sign (-1)^{2} gives c_{2} = -|E(G)|, that is, -c_{2} = |E(G)|. Using C_{4} as an example, with the cycle labeled v_{1} - v_{2} - v_{3} - v_{4} - v_{1}, the four edges correspond to the principal submatrices on the index pairs {1,2}, {2,3}, {3,4}, and {1,4}.

The idea is the same when looking for triangles in a graph. Simply look for the 3x3 principal submatrices which equal the adjacency matrix of C_{3}, shown below. The determinant of the adjacency matrix of C_{3} is 2. All other 3x3 principal minors are 0, since any 3x3 principal submatrix missing an edge has a zero row of products in its determinant expansion. Thus, the sum of the 3x3 principal minors is twice the number of triangles in the graph, and applying the sign (-1)^{3} gives c_{3} = -2t, where t is the number of triangles. There are no triangles in C_{4}, so c_{3} = 0 in the characteristic polynomial of C_{4}.
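The coefficient formula can be verified by brute force: enumerate every i x i principal submatrix, take its determinant, and apply the sign. Below is a plain-Python sketch (the helper names `det` and `coefficient` are my own) checked against C_{4}.

```python
from itertools import combinations

def det(M):
    """Determinant by cofactor expansion along the first row
    (fine for small matrices)."""
    n = len(M)
    if n == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] *
               det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(n))

def coefficient(A, i):
    """c_i = (-1)^i * (sum of the i x i principal minors of A)."""
    n = len(A)
    total = sum(det([[A[r][c] for c in S] for r in S])
                for S in combinations(range(n), i))
    return (-1) ** i * total

# C_4: the cycle v1 - v2 - v3 - v4 - v1
C4 = [[0, 1, 0, 1],
      [1, 0, 1, 0],
      [0, 1, 0, 1],
      [1, 0, 1, 0]]
print(-coefficient(C4, 2))  # 4 edges, since -c_2 = |E|
print(coefficient(C4, 3))   # 0, since C_4 has no triangles
```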

**Conclusion**

This tutorial served as a brief introduction to the broad topic of spectral graph theory. Hopefully it has provided a better appreciation and understanding of the various graph properties that can be explored through the use of a graph's spectrum.