
Algebraic Graph Theory: Adjacency Matrix and Spectrum

This tutorial will introduce the adjacency matrix, as well as spectral graph theory. For those familiar with Linear Algebra, the spectrum of a matrix denotes its eigenvalues and their algebraic multiplicities. Spectral Graph Theory deals with the eigenvalues and eigenvectors associated with the Adjacency and Laplacian matrices. Note that the Laplacian matrix will not be explored in this tutorial.

Below is a typeset copy of this tutorial:
Attached File  Adjacency_Matrix.pdf (158.3K)

Adjacency Matrix
The Adjacency Matrix describes the adjacency relation between pairs of vertices; that is, whether or not there exists an edge incident to both vertices. Let A denote the adjacency matrix of a graph. If A describes a simple graph, then Aij = 1 if vertices vi and vj are adjacent; otherwise, Aij = 0. If the simple graph is undirected, then the matrix is symmetric: if vi is adjacent to vj, then vj is adjacent to vi. If the graph is directed, then the matrix is symmetric only when, for every pair of vertices vi and vj, the edge (vi, vj) exists exactly when the edge (vj, vi) does.

Weighted graphs can also be described using cost matrices. The cost matrix is defined in the same way as the adjacency matrix, except that rather than indicating an adjacency with a 1, the cell holds the weight of the edge. A 0 entry indicates that there is no edge connecting the two vertices.
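As a quick illustration, here is a minimal sketch in Python with NumPy; the graph, the vertex labeling, and the weights below are made up purely for the example. It builds the adjacency matrix of a small undirected simple graph and a cost matrix for a weighted version of the same graph from an edge list.

import numpy as np

# Hypothetical example: an undirected simple graph on 4 vertices
# with edges {v1,v2}, {v2,v3}, {v3,v4}, {v1,v4} (a 4-cycle).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]   # 0-based vertex indices

A = np.zeros((n, n), dtype=int)
for i, j in edges:
    A[i, j] = 1
    A[j, i] = 1   # undirected, so the matrix is symmetric

# Cost matrix for a weighted version of the same graph.
# The weights are arbitrary; a 0 entry means "no edge".
weights = {(0, 1): 2.5, (1, 2): 1.0, (2, 3): 4.0, (0, 3): 0.5}
C = np.zeros((n, n))
for (i, j), w in weights.items():
    C[i, j] = w
    C[j, i] = w

print(A)
print(C)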

Let's now consider some examples.

Attached Image


The Adjacency Matrix has application in counting the number of walks of a given length between two vertices. A walk is a sequence alternating between vertices and edges, starting and ending with a vertex, in which each edge is incident to the vertices immediately before and after it. Consider some examples based on the graph K5:

Attached Image

Some examples of walks in K5 include:
  • v1 - v2 - v5, which is a walk of length 2
  • v2 - v4 - v2, which is also a walk of length 2


Now let's explore how the adjacency matrix can be used to count the number of walks of length n in a graph. This is done through matrix exponentiation. That is, let G be a graph and let A be the corresponding adjacency matrix. Then (A^n)ij counts the number of walks of length n between vertices vi and vj in G.

Consider again the example of K5, counting the 2-walks. Let A be the adjacency matrix of K5, with A^2 shown below.

Attached Image

From A^2, it can be seen that between any two distinct vertices of K5 there are three walks of length 2, while each vertex of K5 has four closed walks of length 2.
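These counts are easy to check numerically. The following sketch in Python with NumPy builds the adjacency matrix of K5 (every pair of distinct vertices adjacent, so the labeling does not matter) and reads the 2-walk counts off of A^2.

import numpy as np

# Adjacency matrix of K5: all ones except on the diagonal.
A = np.ones((5, 5), dtype=int) - np.eye(5, dtype=int)

A2 = np.linalg.matrix_power(A, 2)
print(A2)
# Off-diagonal entries are 3: three 2-walks between any two distinct vertices.
# Diagonal entries are 4: four closed 2-walks at each vertex, one per neighbor.
print(A2[0, 1], A2[0, 0])   # expected: 3 4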

Let's examine what exactly is happening. Consider the case when the 2-walk is not closed. For the purpose of illustrating the concept, consider the 2-walks between vertices v1 and v2. Since K5 is a simple graph, there are no loops or multiple edges between two vertices. Thus, there are only three vertices that can serve as the middle vertex of a 2-walk from v1 to v2.

Proof that A^n describes the number of n-walks in a graph:
The proof is by induction on the natural number n. Consider the base case n = 1, for which A^1 = A. By definition, if vertices vi and vj are adjacent, then Aij = 1; otherwise, Aij = 0. Thus, A counts the number of walks of length 1 in the graph.

The theorem will be assumed true up to an arbitrary integer k, and proven true for the k+1 case. By definition of matrix multiplication, A^{k+1} = A^k * A. By the inductive hypothesis, A^k counts the number of k-walks in the graph, and A counts the number of walks of length 1 in the graph. Consider (A^{k+1})ij = \sum_{x=1}^{n} (A^k)ix * Axj, where n is the number of vertices. For each x, the term (A^k)ix counts the walks of length k from vi to vx, and Axj = 1 exactly when there is an edge connecting vx and vj. Thus, each x with Axj = 1 contributes (A^k)ix walks of length k+1 from vi to vj whose second-to-last vertex is vx, and summing over all x counts every walk of length k+1 from vi to vj exactly once. Thus, the k+1 case has been shown.

Thus, by the principle of mathematical induction, A^n describes the number of n-walks between two vertices in a graph.
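The theorem can also be sanity-checked by brute force. The sketch below (Python with NumPy; the small test graph and the helper name are arbitrary choices for illustration) enumerates all walks of a given length recursively and compares the counts against the matrix power.

import numpy as np

def count_walks_brute_force(A, i, j, n):
    """Count walks of length n from vertex i to vertex j by enumeration."""
    if n == 0:
        return 1 if i == j else 0
    total = 0
    for x in range(A.shape[0]):
        if A[i, x]:   # step along an edge from i to x,
            total += count_walks_brute_force(A, x, j, n - 1)   # then finish the walk
    return total

# Arbitrary small simple graph for the check: the path 0-1-2-3 plus the chord 0-2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]])

n = 3
An = np.linalg.matrix_power(A, n)
for i in range(4):
    for j in range(4):
        assert An[i, j] == count_walks_brute_force(A, i, j, n)
print("Matrix power agrees with brute-force walk counts for n =", n)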


Spectral Graph Theory
This section will briefly introduce the topic of Spectral Graph Theory. At this point, it is assumed that the reader is familiar with Eigentheory and Matrix Diagonalization. The spectrum of a graph provides numerous insights into graph properties that are otherwise difficult to ascertain and prove. This section will discuss some of those properties.


Trace of a Graph
The first property to explore deals with the trace of the adjacency matrix. The trace of a matrix, denoted tr(M), is defined as the sum of its diagonal entries. Consider the adjacency matrices shown above, all of which describe simple graphs. It is easy to see that their traces are 0, as their diagonal entries are all 0. In particular, tr(A(K5)) = 0.

The trace of a matrix can also be written as the sum of its eigenvalues. Consider a diagonalizable matrix A, written as A = QDQ^{-1} (through diagonalization). The trace has the property that cyclic permutations of a product of matrices do not change its value, so tr(A) = tr(QDQ^{-1}) = tr(Q^{-1}QD) = tr(DQ^{-1}Q). It follows, since QQ^{-1} = Q^{-1}Q = I, that tr(A) = tr(D), which means that tr(A) is equal to the sum of the eigenvalues of A (the diagonal entries of D).
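This identity is easy to verify numerically. A minimal sketch (Python with NumPy; the symmetric matrix below is arbitrary, chosen only because real symmetric matrices are always diagonalizable) compares the trace against the sum of the eigenvalues.

import numpy as np

# Arbitrary real symmetric matrix for the check.
M = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 4.0],
              [0.0, 4.0, 1.0]])

eigenvalues = np.linalg.eigvalsh(M)
print(np.trace(M), eigenvalues.sum())   # the two values agree (up to rounding)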

This property, that tr(A) = \sum_{i=1}^{n} \lambda_i where the \lambda_i are the eigenvalues of A, is useful in counting edges and triangles in a graph. Recall the Handshake Lemma, which states that the sum of the vertex degrees is equal to twice the number of edges (\sum_{i} deg(v_i) = 2|E|). Combining it with the trace-eigenvalue property gives a spectral formula for the number of edges. Consider A^2, the square of the adjacency matrix of a simple graph. The diagonal entries of A^2 count the closed 2-walks of the graph. The only way for a 2-walk on a simple graph to be closed is, starting at vi, to go to an adjacent vertex vj and then return to vi. So the diagonal entry (A^2)ii counts the edges incident to vi, that is, the degree of vi. Each edge {vi, vj} is therefore counted twice: once in the diagonal entry for vi and once in the diagonal entry for vj. Thus, tr(A^2) = tr(D^2) = \sum_{i=1}^{n} \lambda_i^2 = \sum_{i} deg(v_i) = 2|E|. In other words, if \lambda_i is an eigenvalue of A, then \lambda_i^2 is an eigenvalue of A^2, and the sum of the \lambda_i^2 counts each edge of the graph twice.
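For a concrete check, the sketch below (Python with NumPy) applies this to K5, which has 10 edges; the squared eigenvalues of its adjacency matrix sum to 20 = 2|E|.

import numpy as np

A = np.ones((5, 5), dtype=int) - np.eye(5, dtype=int)   # adjacency matrix of K5

eigenvalues = np.linalg.eigvalsh(A)              # K5 has spectrum {4, -1, -1, -1, -1}
print((eigenvalues ** 2).sum())                  # 20.0, which is 2 * |E| = 2 * 10
print(np.trace(np.linalg.matrix_power(A, 2)))    # 20, the same count via tr(A^2)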

Through a similar argument, it is possible to count the number of triangles in a graph. Consider A^3, the cube of the adjacency matrix. Its diagonal entries count the closed 3-walks of the graph. On a simple graph, a closed 3-walk can only trace a triangle (a copy of C3). Each triangle produces six closed 3-walks: three choices of starting vertex, each traversed in two directions. Thus, tr(A^3) = \sum_{i=1}^{n} \lambda_i^3 = 6t, where t is the number of triangles in the graph.
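Similarly, the triangle count follows from the cubed eigenvalues. K5 has C(5,3) = 10 triangles, and the sketch below (same conventions as the previous snippet) recovers that number.

import numpy as np

A = np.ones((5, 5), dtype=int) - np.eye(5, dtype=int)   # adjacency matrix of K5

eigenvalues = np.linalg.eigvalsh(A)
closed_3_walks = (eigenvalues ** 3).sum()   # equals tr(A^3)
print(closed_3_walks / 6)                   # 10.0 triangles, matching C(5,3)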

It should be noted that the trace-eigenvalue property does scale, but with a nuance. Certainly tr(A^n) for n > 3 counts the number of closed n-walks in the graph. However, a closed n-walk for n > 3 can take several different shapes, whereas a closed walk of length 2 or 3 on a simple graph can only trace an edge or a triangle, respectively. For example, a closed 4-walk can traverse a 4-cycle, but it can also retrace one or two edges back and forth, so tr(A^4) does not directly count the 4-cycles.
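The sketch below (Python with NumPy) makes this nuance concrete using the cycle C4, which contains exactly one 4-cycle but many more closed 4-walks.

import numpy as np

# Adjacency matrix of C4 (the 4-cycle v1-v2-v3-v4-v1).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

closed_4_walks = np.trace(np.linalg.matrix_power(A, 4))
print(closed_4_walks)   # 32 closed 4-walks in total
# Only 8 of them traverse the unique 4-cycle (4 starting vertices, 2 directions);
# the remaining 24 retrace one or two edges, so tr(A^4) is not the 4-cycle count.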


Characteristic Polynomial
Just as the trace of the adjacency matrix provides tools for examining properties of the graph, the characteristic polynomial provides the same sort of information in and of itself. Consider the characteristic polynomial of a graph, f(x) = det(xI - A) = x^n + \sum_{i=1}^{n} c_i x^{n-i}. From the calculation of the determinant, the coefficients c_i are determined by sums of principal minors of A. More exactingly, c_i = (-1)^i * \sum_{j} p_j, where the p_j range over the principal minors of A whose corresponding principal submatrices have dimension i x i. A principal minor is the determinant of a principal submatrix, which is formed by deleting the same set of rows and columns from the parent matrix, so that every diagonal entry of the submatrix lies on the diagonal of the parent matrix.
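This relationship can be checked directly. The sketch below (Python with NumPy; the helper name principal_minor_sum and the choice of K5 as the test graph are just for illustration) sums the i x i principal minors of an adjacency matrix and compares (-1)^i times that sum against the coefficients returned by numpy.poly, which computes the characteristic polynomial det(xI - A).

import numpy as np
from itertools import combinations

def principal_minor_sum(A, i):
    """Sum of all i x i principal minors of A (same rows and columns kept)."""
    n = A.shape[0]
    return sum(np.linalg.det(A[np.ix_(S, S)])
               for S in combinations(range(n), i))

# Adjacency matrix of K5 as a test graph.
A = np.ones((5, 5)) - np.eye(5)

coefficients = np.poly(A)   # [1, c1, c2, ..., cn] for det(xI - A)
for i in range(1, 6):
    c_i = (-1) ** i * principal_minor_sum(A, i)
    print(i, round(coefficients[i], 6), round(c_i, 6))   # the two columns agree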

Let's explore exactly how the principal minors behave with respect to adjacency matrices (of simple graphs). On the surface, it looks like a lot of linear algebra without a lot of graph theory. While the linear algebra is definitely there, the graph theory is just beneath the surface. Consider an example with the graph C4. Below is C4, along with its adjacency matrix and its distinct 2x2 principal submatrices.

Attached Image

It is easy to see that the 1x1 principal minors are all 0, as the diagonal entries are 0. So c1 = -tr(A) = 0 for simple graphs. Now consider the 2x2 principal minors of C4. There are six 2x2 principal submatrices, with the distinct ones listed above. The first of these indicates an adjacency. Let's examine why. It first appears as the principal submatrix on rows and columns 1 and 2, and from the image it is easy to see that there is an edge between vertices v1 and v2. This principal submatrix is also the adjacency matrix of P2, the path graph on two vertices, that is, a single edge with its two endpoints; its determinant is -1. Any 2x2 principal submatrix corresponding to a non-adjacent pair of vertices has no nonzero entries, so it cannot indicate an edge and its determinant is 0. Thus, each edge contributes -1 to the sum of the 2x2 principal minors and each non-adjacent pair contributes 0, so c2 = -|E(G)|, or equivalently -c2 = |E(G)|. Using C4 as an example, there are four edges, corresponding to the principal submatrices on the vertex pairs {v1, v2}, {v1, v3}, {v2, v4}, and {v3, v4}.

The idea is the same when looking for triangles in a graph. Simply look for the 3x3 principal submatrices that equal the adjacency matrix of C3, which is shown below. The determinant of the adjacency matrix of C3 is 2, and every other 3x3 principal submatrix of a simple graph's adjacency matrix has determinant 0. Thus, adding up all the 3x3 principal minors produces twice the number of triangles in the graph, and applying the sign from the coefficient formula gives c3 = -2t, where t is the number of triangles. There are no triangles in C4, so c3 = 0 in the characteristic polynomial of C4.

Attached Image
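To close the example, here is a short check (a sketch in Python with NumPy; any labeling of the 4-cycle yields the same characteristic polynomial) confirming that the coefficients of C4's characteristic polynomial encode its edge and triangle counts.

import numpy as np

# Adjacency matrix of C4, using the edges {v1,v2}, {v1,v3}, {v2,v4}, {v3,v4}.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]])

coefficients = np.poly(A)        # det(xI - A) = x^4 + c1 x^3 + c2 x^2 + c3 x + c4
c2, c3 = coefficients[2], coefficients[3]
print(round(c2), round(c3))      # -4 and 0
# -c2 = 4 is the number of edges of C4, and c3 = -2 * (number of triangles) = 0.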


Conclusion
This tutorial served as a brief introduction to the broad topic of spectral graph theory. Hopefully it has provided a better appreciation and understanding of the various graph properties that can be explored through the use of the graph's spectrum.
