Class Matrix.EVD
- All Implemented Interfaces:
Serializable
- Enclosing class:
Matrix
The eigendecomposition factors a square matrix as A = V*D*V<sup>-1</sup>, where the columns of V are the eigenvectors and D holds the corresponding eigenvalues.
If A is symmetric, then A = V*D*V' where the eigenvalue matrix D is
diagonal and the eigenvector matrix V is orthogonal.
Given a linear transformation A, a non-zero vector x is defined to be an eigenvector of the transformation if it satisfies the eigenvalue equation
A x = λ x
for some scalar λ. In this situation, the scalar λ is called
an eigenvalue of A corresponding to the eigenvector x.
The word eigenvector formally refers to the right eigenvector, which is defined by the above eigenvalue equation A x = λ x, and is the most commonly used eigenvector. However, the left eigenvector exists as well, and is defined by x A = λ x.
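To make the two definitions concrete, here is a minimal Java sketch (the helper names matVec and vecMat are our own, not part of this class) that checks both identities for a small non-symmetric matrix, where the right and left eigenvectors of the same eigenvalue differ:

```java
public class EigenCheck {
    // y = A x (matrix times column vector)
    static double[] matVec(double[][] a, double[] x) {
        double[] y = new double[a.length];
        for (int i = 0; i < a.length; i++)
            for (int j = 0; j < x.length; j++)
                y[i] += a[i][j] * x[j];
        return y;
    }

    // y = x A (row vector times matrix)
    static double[] vecMat(double[] x, double[][] a) {
        double[] y = new double[a[0].length];
        for (int j = 0; j < y.length; j++)
            for (int i = 0; i < x.length; i++)
                y[j] += x[i] * a[i][j];
        return y;
    }

    public static void main(String[] args) {
        double[][] a = {{2, 1}, {0, 3}};
        double lambda = 3.0;
        double[] right = {1, 1};   // A x = 3 x
        double[] left  = {0, 1};   // x A = 3 x, a different vector
        double[] ax = matVec(a, right);
        double[] xa = vecMat(left, a);
        for (int i = 0; i < 2; i++) {
            if (Math.abs(ax[i] - lambda * right[i]) > 1e-12) throw new AssertionError();
            if (Math.abs(xa[i] - lambda * left[i]) > 1e-12) throw new AssertionError();
        }
        System.out.println("both eigenvector identities hold");
    }
}
```

For a symmetric matrix the left and right eigenvectors coincide, which is why the distinction only matters in the general case.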
Let A be a real n-by-n matrix with strictly positive entries a<sub>ij</sub> > 0. Then the following statements hold.
- There is a positive real number r, called the Perron-Frobenius eigenvalue, such that r is an eigenvalue of A and any other eigenvalue λ (possibly complex) is strictly smaller than r in absolute value, |λ| < r.
- The Perron-Frobenius eigenvalue is simple: r is a simple root of the characteristic polynomial of A. Consequently, both the right and the left eigenspace associated with r are one-dimensional.
- There exists a left eigenvector v of A associated with r (row vector) having strictly positive components. Likewise, there exists a right eigenvector w associated with r (column vector) having strictly positive components.
- The left eigenvector v (respectively the right eigenvector w) associated with r is the only eigenvector with strictly positive components, i.e. every other eigenvector of A has at least one component which is not positive.
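As an illustration of the theorem, a simple power iteration (a generic numerical sketch, not a method of this class) recovers the Perron-Frobenius eigenvalue of a strictly positive matrix:

```java
public class PerronFrobenius {
    // Power iteration: for a matrix with strictly positive entries the
    // iterates converge to the Perron-Frobenius eigenpair (r, w).
    static double perronEigenvalue(double[][] a, int iters) {
        int n = a.length;
        double[] x = new double[n];
        java.util.Arrays.fill(x, 1.0);  // any positive starting vector works
        double r = 0;
        for (int it = 0; it < iters; it++) {
            double[] y = new double[n];
            for (int i = 0; i < n; i++)
                for (int j = 0; j < n; j++)
                    y[i] += a[i][j] * x[j];
            // Estimate r via the max norm, then renormalize the iterate.
            r = 0;
            for (double v : y) r = Math.max(r, Math.abs(v));
            for (int i = 0; i < n; i++) x[i] = y[i] / r;
        }
        return r;
    }

    public static void main(String[] args) {
        double[][] a = {{2, 1}, {1, 2}};  // eigenvalues 3 and 1, so r = 3
        System.out.println(perronEigenvalue(a, 50));  // prints 3.0
    }
}
```

Convergence is geometric at rate |λ₂|/r, which the theorem guarantees is strictly less than 1.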
A stochastic matrix (also called a probability matrix or transition matrix) is used to describe the transitions of a Markov chain. A right stochastic matrix is a square matrix each of whose rows consists of non-negative real numbers summing to 1. A left stochastic matrix is a square matrix whose columns consist of non-negative real numbers summing to 1. A doubly stochastic matrix is a square matrix whose entries are all non-negative and whose rows and columns all sum to 1.

A stationary probability vector π is defined as a vector that does not change under application of the transition matrix; that is, it is a left eigenvector of the probability matrix, associated with eigenvalue 1: πP = π. The Perron-Frobenius theorem ensures that such a vector exists, and that the largest eigenvalue associated with a stochastic matrix is always 1. For a matrix with strictly positive entries, this vector is unique. In general, however, there may be several such vectors.
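The stationary vector can be found by repeatedly applying π ↦ πP; the following standalone sketch (not part of this class) does so for a small strictly positive transition matrix:

```java
public class StationaryVector {
    // Left power iteration pi <- pi P on a row-stochastic matrix P. For a
    // strictly positive P the Perron-Frobenius theorem guarantees a unique
    // stationary distribution, so the iteration converges to it.
    static double[] stationary(double[][] p, int iters) {
        int n = p.length;
        double[] pi = new double[n];
        java.util.Arrays.fill(pi, 1.0 / n);  // start from the uniform distribution
        for (int it = 0; it < iters; it++) {
            double[] next = new double[n];
            for (int j = 0; j < n; j++)
                for (int i = 0; i < n; i++)
                    next[j] += pi[i] * p[i][j];
            pi = next;  // each step preserves sum(pi) == 1
        }
        return pi;
    }

    public static void main(String[] args) {
        double[][] p = {{0.9, 0.1}, {0.5, 0.5}};  // exact fixed point: [5/6, 1/6]
        double[] pi = stationary(p, 500);
        System.out.printf("pi = [%.4f, %.4f]%n", pi[0], pi[1]);  // ~[0.8333, 0.1667]
    }
}
```

No normalization step is needed because each multiplication by a row-stochastic matrix preserves the total probability mass.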
-
Field Summary
-
Constructor Summary
-
Method Summary
diag() - Returns the block diagonal eigenvalue matrix whose diagonal are the real parts of eigenvalues, lower subdiagonal are positive imaginary parts, and upper subdiagonal are negative imaginary parts.
sort() - Sorts the eigenvalues in descending order and reorders the corresponding eigenvectors.
-
Field Details
-
wr
public final double[] wr
The real part of eigenvalues. By default, the eigenvalues and eigenvectors are not always in sorted order. The sort function puts the eigenvalues in descending order and reorders the corresponding eigenvectors.
-
wi
public final double[] wi
The imaginary part of eigenvalues.
-
Vl
The left eigenvectors.
-
Vr
The right eigenvectors.
-
-
Constructor Details
-
EVD
Constructor.
- Parameters:
w - eigenvalues.
V - eigenvectors.
-
EVD
Constructor.
- Parameters:
wr - the real part of eigenvalues.
wi - the imaginary part of eigenvalues.
Vl - the left eigenvectors.
Vr - the right eigenvectors.
-
-
Method Details
-
diag
Returns the block diagonal eigenvalue matrix whose diagonal are the real parts of eigenvalues, lower subdiagonal are positive imaginary parts, and upper subdiagonal are negative imaginary parts.
- Returns:
the diagonal eigenvalue matrix.
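The block structure described above can be illustrated with a small standalone sketch that builds D from the wr and wi arrays. It assumes (an assumption of this sketch, not documented here) that complex conjugate pairs are stored in adjacent positions with the positive imaginary part first:

```java
public class BlockDiag {
    // Build the block-diagonal eigenvalue matrix described above:
    // diagonal = real parts, lower subdiagonal = positive imaginary parts,
    // upper subdiagonal = negative imaginary parts.
    static double[][] diag(double[] wr, double[] wi) {
        int n = wr.length;
        double[][] d = new double[n][n];
        for (int i = 0; i < n; i++) {
            d[i][i] = wr[i];                          // diagonal: real part
            if (wi[i] > 0) d[i + 1][i] = wi[i];       // lower subdiagonal
            else if (wi[i] < 0) d[i - 1][i] = wi[i];  // upper subdiagonal
        }
        return d;
    }

    public static void main(String[] args) {
        // Eigenvalues 1 +/- 2i produce the 2x2 block [[1, -2], [2, 1]].
        double[][] d = diag(new double[]{1, 1}, new double[]{2, -2});
        System.out.println(java.util.Arrays.deepToString(d));
    }
}
```

This representation keeps D real even when eigenvalues are complex, so A = V*D*V<sup>-1</sup> can be evaluated entirely in real arithmetic.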
-
sort
Sorts the eigenvalues in descending order and reorders the corresponding eigenvectors.
- Returns:
the sorted eigen decomposition.
-
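The reordering such a method performs can be sketched as an index sort (a standalone illustration, not the actual implementation): sort indices by descending eigenvalue, then permute the eigenvector columns with the same permutation so eigenvalue i stays paired with column i of V.

```java
public class EigenSort {
    // Sort eigenvalues w descending in place and permute the columns of the
    // eigenvector matrix v with the same permutation.
    static void sortDescending(double[] w, double[][] v) {
        int n = w.length;
        Integer[] idx = new Integer[n];
        for (int i = 0; i < n; i++) idx[i] = i;
        java.util.Arrays.sort(idx, (a, b) -> Double.compare(w[b], w[a]));
        double[] ws = new double[n];
        double[][] vs = new double[n][n];
        for (int k = 0; k < n; k++) {
            ws[k] = w[idx[k]];
            for (int r = 0; r < n; r++) vs[r][k] = v[r][idx[k]];  // move column idx[k] to k
        }
        System.arraycopy(ws, 0, w, 0, n);
        for (int r = 0; r < n; r++) System.arraycopy(vs[r], 0, v[r], 0, n);
    }

    public static void main(String[] args) {
        double[] w = {1.0, 3.0, 2.0};
        double[][] v = {{1, 0, 0}, {0, 1, 0}, {0, 0, 1}};
        sortDescending(w, v);
        System.out.println(java.util.Arrays.toString(w));  // [3.0, 2.0, 1.0]
    }
}
```

Sorting the pairs together is essential: shuffling w without v would break the correspondence that A*v[:,i] = w[i]*v[:,i].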