How do we get Eigenvalues?

Say we have an elementary first-order scalar ODE, such as:

du/dt = λu

where λ is a constant and u(t) is the unknown scalar function. The solution will follow an exponential form. The general solution is u(t) = c e^{λt}, where c is the integration constant, derived via separation of variables. The initial condition u(0) = u_0 defines the unique solution for the ODE. So our final solution is then given by:

u(t) = u_0 e^{λt}
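The separation-of-variables step can be written out as:

```latex
% Separation of variables for du/dt = \lambda u
\frac{du}{u} = \lambda \, dt
\quad\Longrightarrow\quad
\ln|u| = \lambda t + C
\quad\Longrightarrow\quad
u(t) = c\, e^{\lambda t}, \qquad c = \pm e^{C}.
% Imposing the initial condition u(0) = u_0 gives c = u_0,
% hence u(t) = u_0 e^{\lambda t}.
```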

A first-order dynamical system consists of a system of n coupled first-order ODEs like the one explained above.

So, time for vectors! We write the system in vector form like this:

du/dt = f(u)

where u(t) = (u_1(t), …, u_n(t))ᵀ is the vector-valued solution. This representation parameterizes a curve in ℝⁿ.

The matrix formulation for these linear systems is:

du/dt = Au

where A is a constant n × n matrix holding the coefficients of the n equations. So now, we seek a solution of the form u(t) = e^{λt} v, where λ is a scalar and v is a constant vector. Taking the derivative, we end up with du/dt = λ e^{λt} v. Substituting into the system, we end up with λ e^{λt} v = A e^{λt} v. Dividing out the nonzero scalar e^{λt}, this finally becomes:

Av = λv

This is the eigenvalue equation.

This eigenvalue equation can then be rewritten as (A − λI)v = 0, with v ≠ 0. So, A − λI must be singular and its rank must be less than n.

Finally, to get eigenvalues and eigenvectors:

  1. Solve det(A − λI) = 0 for λ
  2. For each eigenvalue λ, solve (A − λI)v = 0 for v
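A minimal sketch of these two steps with NumPy, which `np.linalg.eig` performs in one call (the 2 × 2 matrix is a hypothetical example):

```python
import numpy as np

# Hypothetical example matrix (any square matrix works here).
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Steps 1 and 2 together: np.linalg.eig returns the eigenvalues
# and a matrix whose columns are the corresponding eigenvectors.
eigvals, eigvecs = np.linalg.eig(A)

# Verify the eigenvalue equation A v = lambda v for each pair.
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)

print(eigvals)  # roots of det(A - lambda*I) = lambda^2 - 7*lambda + 10
```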

What is an Eigenspace and how do they relate to Eigenvalues?

Given a real eigenvalue λ, the corresponding eigenspace V_λ is the subspace spanned by all eigenvectors associated with λ. Equivalently, the eigenspace is equal to the kernel (null space) of A − λI, i.e. V_λ = ker(A − λI). The properties of an eigenspace are as follows:

  • λ is an eigenvalue of A if and only if V_λ ≠ {0}
  • Every nonzero element of V_λ is an eigenvector corresponding to λ
  • A basis of an eigenspace is the easiest way to represent it

From that first property, if λ = 0 is an eigenvalue then V_0 = ker A ≠ {0}, and vice versa. As such, the matrix A is singular. Therefore, a matrix is singular if and only if it has a zero eigenvalue.
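A quick numerical check of this equivalence, using a hypothetical singular matrix whose second row is twice the first:

```python
import numpy as np

# Hypothetical singular matrix: row 2 = 2 * row 1, so det(A) = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

eigvals = np.linalg.eigvals(A)
print(np.isclose(np.linalg.det(A), 0.0))  # True: A is singular
print(np.any(np.isclose(eigvals, 0.0)))   # True: 0 is an eigenvalue
```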

To compute an eigenspace, you take the matrix A and solve det(A − λI) = 0. Expanding the determinant gives the characteristic polynomial; solve for its roots. Those roots are the eigenvalues. Then plug each eigenvalue λ into (A − λI)v = 0 and solve for v. The solutions v for each λ are the corresponding eigenvectors, and the eigenvectors for a given λ span its eigenspace.
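A worked 2 × 2 example (the matrix is hypothetical, chosen so the arithmetic stays clean), written in LaTeX:

```latex
% Worked example for A = [2 1; 1 2]
A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix},\qquad
\det(A - \lambda I) = (2-\lambda)^2 - 1
  = \lambda^2 - 4\lambda + 3
  = (\lambda - 1)(\lambda - 3).
% Eigenvalues: \lambda_1 = 1, \lambda_2 = 3.
% For \lambda_1 = 1: (A - I)v = 0 gives v_1 = (1, -1)^T,
%   so V_1 = \operatorname{span}\{(1, -1)^T\}.
% For \lambda_2 = 3: (A - 3I)v = 0 gives v_2 = (1, 1)^T,
%   so V_3 = \operatorname{span}\{(1, 1)^T\}.
```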

If A is a real matrix with a complex eigenvalue λ = α + iβ and corresponding complex eigenvector v, then the complex conjugate λ̄ = α − iβ is also an eigenvalue of A, with corresponding eigenvector the complex conjugate v̄.
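For instance, the standard 90° rotation matrix (a textbook example, not from the notes above) has the conjugate pair ±i:

```python
import numpy as np

# Rotation by 90 degrees: no real eigenvectors, so eigenvalues are complex.
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # a conjugate pair: +i and -i

# The two eigenvalues are complex conjugates of each other.
assert np.isclose(eigvals[0], np.conj(eigvals[1]))
```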

If A is symmetric then all eigenvalues are real and the eigenvectors corresponding to distinct eigenvalues are orthogonal.
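A quick numerical sanity check of both claims, using a hypothetical symmetric matrix and NumPy's symmetric-specialized `np.linalg.eigh`:

```python
import numpy as np

# Hypothetical symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh is specialized for symmetric/Hermitian matrices:
# it returns real eigenvalues and orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(A)

print(np.all(np.isreal(eigvals)))                   # True: all eigenvalues real
# Columns of eigvecs are orthonormal, so V^T V = I.
print(np.allclose(eigvecs.T @ eigvecs, np.eye(3)))  # True: orthogonal eigenvectors
```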

The constant term of the characteristic polynomial det(A − λI) is the determinant of A (set λ = 0). Every n × n matrix possesses at least one and at most n distinct complex eigenvalues. An eigenvalue λᵢ's multiplicity is the power of the factor (λ − λᵢ) in the factorization of the characteristic polynomial.

A square matrix A and its transpose Aᵀ have the same characteristic equation, and hence the same eigenvalues with the same multiplicities. The sum of the eigenvalues of A equals its trace. The product of the eigenvalues equals its determinant.
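These three facts are easy to verify numerically; the matrix below is a hypothetical example:

```python
import numpy as np

# Hypothetical example matrix.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 2.0]])

eigvals = np.linalg.eigvals(A)

# A and A^T share the same eigenvalues (compare as sorted complex numbers).
eigvals_T = np.linalg.eigvals(A.T)
assert np.allclose(np.sort_complex(eigvals), np.sort_complex(eigvals_T))

# Sum of eigenvalues = trace; product of eigenvalues = determinant.
print(np.isclose(np.sum(eigvals), np.trace(A)))        # True
print(np.isclose(np.prod(eigvals), np.linalg.det(A)))  # True
```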

What is the key lemma relating eigenvalues of A and their corresponding eigenvectors?

If λ_1, …, λ_k are distinct eigenvalues of A, then the corresponding eigenvectors v_1, …, v_k are linearly independent. The proof for this is inductive. First, any single eigenvector is nonzero and hence trivially independent. The inductive step works as follows: assume v_1, …, v_{k−1} are independent and suppose c_1 v_1 + … + c_k v_k = 0. Applying A gives c_1 λ_1 v_1 + … + c_k λ_k v_k = 0, while multiplying the original relation by λ_k gives c_1 λ_k v_1 + … + c_k λ_k v_k = 0. Subtracting these two, we get: c_1 (λ_1 − λ_k) v_1 + … + c_{k−1} (λ_{k−1} − λ_k) v_{k−1} = 0. We know that any two distinct eigenvalues have different values, so each difference λ_i − λ_k is non-zero. By the inductive hypothesis, all coefficients c_1, …, c_{k−1} must be zero for this equation to hold, and then c_k v_k = 0 forces c_k = 0 as well. Therefore, each eigenvector is linearly independent of all the others.

What is the definition of completeness?

An eigenvalue of a matrix A is called complete if the corresponding eigenspace has the same dimension as its multiplicity.

The matrix A is said to be complete if all its eigenvalues are complete.

A simple eigenvalue is automatically complete. Only multiple (repeated) eigenvalues can cause incompleteness. Completeness requires that the geometric multiplicity (the dimension of the eigenspace) equal the algebraic multiplicity (the multiplicity as a root of the characteristic polynomial). A matrix is complete if and only if its eigenvectors span ℂⁿ.
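The classic example of an incomplete matrix (a standard illustration, not taken from the notes above) is a 2 × 2 Jordan block, sketched here with NumPy:

```python
import numpy as np

# Jordan block: characteristic polynomial (lambda - 1)^2, so the
# eigenvalue 1 has algebraic multiplicity 2 ...
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)  # both eigenvalues equal 1

# ... but A - I has rank 1, so the eigenspace ker(A - I) is only
# 1-dimensional: geometric multiplicity 1 < algebraic multiplicity 2.
rank = np.linalg.matrix_rank(A - np.eye(2))
geometric_multiplicity = 2 - rank
print(geometric_multiplicity)  # 1 -> the matrix is incomplete
```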

So, complex eigenvectors of a real matrix appear in conjugate pairs: v and v̄. For complete real matrices, the real eigenvectors together with the real and imaginary parts of the complex eigenvectors form a basis of ℝⁿ.

How does PageRank work?

Take all outgoing links from a webpage and treat them as a vector, as in an adjacency matrix. Then, normalize this vector by dividing by the total number of outgoing links on the page. Combining these columns creates an adjacency matrix M whose columns are normalized (each sums to 1). The rank r_j of a page j then satisfies r_j = Σ_{i→j} r_i / d_i (where d_i is the number of outgoing links on page i), or in matrix form r = Mr.

We begin by assuming r is a vector of all ones divided by the total number of pages in the graph, i.e. r = (1/n, …, 1/n)ᵀ. We iteratively apply r ← Mr until r stabilizes.
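The iteration above can be sketched as power iteration (the 3-page link graph is a hypothetical example; no damping factor is used, since one isn't introduced above):

```python
import numpy as np

# Hypothetical 3-page web. Column j lists where page j links to,
# normalized by its number of outgoing links:
# page 0 links to 1 and 2; page 1 links to 2; page 2 links to 0.
M = np.array([[0.0, 0.0, 1.0],
              [0.5, 0.0, 0.0],
              [0.5, 1.0, 0.0]])

n = M.shape[1]
r = np.ones(n) / n  # start with uniform rank 1/n per page

# Power iteration: apply r <- M r until r stabilizes.
for _ in range(100):
    r_next = M @ r
    if np.allclose(r_next, r, atol=1e-10):
        break
    r = r_next

print(r)  # stationary rank vector satisfying r = M r, ~[0.4, 0.2, 0.4]
```

The fixed point r is exactly an eigenvector of M with eigenvalue 1, tying PageRank back to the eigenvalue equation Av = λv.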