What is the key lemma relating eigenvalues of A and their corresponding eigenvectors?

If A has distinct eigenvalues λ_1, …, λ_k, then the corresponding eigenvectors v_1, …, v_k are linearly independent. The proof is by induction. Base case: a single eigenvector is nonzero by definition, so it is trivially independent. Inductive step: suppose v_1, …, v_k are independent and that c_1 v_1 + … + c_{k+1} v_{k+1} = 0. Applying A gives c_1 λ_1 v_1 + … + c_{k+1} λ_{k+1} v_{k+1} = 0, while multiplying the original relation by λ_{k+1} gives c_1 λ_{k+1} v_1 + … + c_{k+1} λ_{k+1} v_{k+1} = 0. Subtracting these two, we get c_1 (λ_1 − λ_{k+1}) v_1 + … + c_k (λ_k − λ_{k+1}) v_k = 0. Since distinct eigenvalues have nonzero differences λ_i − λ_{k+1}, the inductive hypothesis forces c_1 = … = c_k = 0, and then c_{k+1} v_{k+1} = 0 forces c_{k+1} = 0. Therefore v_{k+1} is linearly independent of the other eigenvectors.
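The lemma can be checked numerically. A minimal sketch using NumPy (the particular matrix is an illustrative choice with three distinct eigenvalues): if the eigenvalues are distinct, the matrix whose columns are the eigenvectors has full rank.

```python
import numpy as np

# An illustrative matrix with three distinct eigenvalues (2, 3, and 5).
A = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [0.0, 0.0, 5.0]])

# Columns of `eigenvectors` are the eigenvectors of A.
eigenvalues, eigenvectors = np.linalg.eig(A)

# Distinct eigenvalues -> full-rank eigenvector matrix,
# i.e. the eigenvectors are linearly independent.
assert len(set(np.round(eigenvalues, 8))) == 3
assert np.linalg.matrix_rank(eigenvectors) == 3
```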

What is the definition of completeness?

An eigenvalue of a matrix A is called complete if the corresponding eigenspace has the same dimension as its multiplicity.

The matrix A is said to be complete if all its eigenvalues are complete.

A simple eigenvalue is automatically complete; only a repeated eigenvalue can cause incompleteness. Completeness requires the geometric multiplicity (the dimension of the eigenspace) to equal the algebraic multiplicity (the multiplicity of the eigenvalue as a root of the characteristic polynomial). A matrix is complete if and only if its eigenvectors span ℂⁿ.
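The distinction can be made concrete with a small sketch in NumPy. The helper `geometric_multiplicity` is an illustrative function (not a library routine); the two matrices below both have the eigenvalue 2 with algebraic multiplicity 2, but only one of them is complete.

```python
import numpy as np

def geometric_multiplicity(A, lam, tol=1e-9):
    """Dimension of the eigenspace ker(A - lam*I), via rank-nullity."""
    n = A.shape[0]
    return n - np.linalg.matrix_rank(A - lam * np.eye(n), tol=tol)

# lam = 2 has algebraic multiplicity 2 in both matrices below.
complete   = np.array([[2.0, 0.0], [0.0, 2.0]])  # eigenspace is all of R^2
incomplete = np.array([[2.0, 1.0], [0.0, 2.0]])  # Jordan block: eigenspace is 1-dimensional

assert geometric_multiplicity(complete, 2.0) == 2    # matches the multiplicity -> complete
assert geometric_multiplicity(incomplete, 2.0) == 1  # falls short -> incomplete
```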

Complex eigenvalues and eigenvectors of a real matrix appear in conjugate pairs: if Av = λv, then Av̄ = λ̄v̄. For complete real matrices, the real eigenvectors together with the real and imaginary parts of the complex eigenvectors form a basis of ℝⁿ.
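A rotation matrix is a standard illustration: it is real but has no real eigenvectors. A minimal NumPy sketch showing the conjugate pair, and that the real and imaginary parts of one complex eigenvector already form a basis of ℝ²:

```python
import numpy as np

# A 2x2 rotation matrix: real entries, complex eigenvalues cos(t) ± i sin(t).
t = 0.5
A = np.array([[np.cos(t), -np.sin(t)],
              [np.sin(t),  np.cos(t)]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# The two eigenvalues are complex conjugates of each other.
assert np.isclose(eigenvalues[0], np.conj(eigenvalues[1]))

# Real and imaginary parts of one complex eigenvector span R^2.
v = eigenvectors[:, 0]
basis = np.column_stack([v.real, v.imag])
assert np.linalg.matrix_rank(basis) == 2
```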

How does PageRank work?

For each webpage, record its outgoing links as a column vector, as in an adjacency matrix, then normalize that column by dividing by the page's total number of outgoing links. Assembling these normalized columns gives a column-stochastic matrix A. The rank vector r then satisfies r = Ar (it is an eigenvector of A with eigenvalue 1), or iteratively r^(k+1) = A r^(k).

We initialize r as the uniform vector whose entries are all 1/N, where N is the total number of pages in the graph, and iterate until r stabilizes.
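The iteration above can be sketched in a few lines of NumPy. This is a bare power iteration under simplifying assumptions (every page has at least one outgoing link, and no damping factor is used); the function name and the tiny three-page web are illustrative.

```python
import numpy as np

def pagerank(adjacency, tol=1e-10, max_iter=1000):
    """Power iteration r <- A r on a column-normalized link matrix.

    adjacency[i, j] = 1 means page j links to page i.
    Assumes every page has at least one outgoing link (no dangling
    nodes) and uses no damping factor.
    """
    A = adjacency / adjacency.sum(axis=0)   # normalize each column
    n = A.shape[0]
    r = np.full(n, 1.0 / n)                 # start from the uniform vector
    for _ in range(max_iter):
        r_next = A @ r
        if np.linalg.norm(r_next - r, 1) < tol:  # stop once r stabilizes
            return r_next
        r = r_next
    return r

# Tiny 3-page web: page 0 links to 1 and 2, page 1 to 2, page 2 to 0.
links = np.array([[0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0],
                  [1.0, 1.0, 0.0]])
r = pagerank(links)
```

Because the columns of A sum to 1 and r starts as a probability vector, the entries of r keep summing to 1 throughout the iteration, so the result can be read directly as a ranking distribution over pages.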