Exercise "matrix diagonalization"
Objective
Implement the Jacobi eigenvalue algorithm.
Hints:
The CPU time consumed by a program, say my_prog, can be determined with the stand-alone GNU time utility. The command
\time --format "%U" --append --output time.txt ./my_prog > /dev/null
or, in short,
\time -f "%U" -ao time.txt ./my_prog > /dev/null
runs the program (discarding its standard output) and appends the number of consumed CPU-seconds to the file time.txt. Without the --append option (or -a, in short) the file time.txt gets overwritten. The backslash is needed to run the actual utility rather than the built-in bash command (which, incidentally, offers similar possibilities).
An example of the usage of time can be found here (see the makefile).
Tasks
Implement the Jacobi eigenvalue algorithm for real symmetric matrices using cyclic sweeps: eliminate the off-diagonal elements in a predefined sequence that spans all off-diagonal elements (for example, row after row) and repeat the sweeps until convergence. A possible convergence criterion is that none of the eigenvalues changed during a whole sweep.
Prove that your implementation works as intended: generate a random symmetric matrix A, apply your routine to perform the eigenvalue decomposition A = VDV^T (where V is the orthogonal matrix of eigenvectors and D is the diagonal matrix with the corresponding eigenvalues), and check that V^T AV == D.
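The cyclic-sweep algorithm and the V^T AV == D check can be sketched as follows (an illustrative pure-Python sketch; the function name jacobi_cyclic and the list-of-lists matrix representation are choices made here, not prescribed by the exercise — in practice one would use one's own matrix library):

```python
import math, random

def jacobi_cyclic(A):
    """Cyclic-sweep Jacobi: diagonalize symmetric A (list of lists) in place.
    Returns (e, V): eigenvalues e and the orthogonal matrix V of eigenvectors
    (in columns), so that A_original = V diag(e) V^T."""
    n = len(A)
    V = [[float(i == j) for j in range(n)] for i in range(n)]
    changed = True
    while changed:                       # repeat sweeps until converged
        changed = False
        for p in range(n):               # one sweep: row after row
            for q in range(p + 1, n):
                app, aqq, apq = A[p][p], A[q][q], A[p][q]
                theta = 0.5 * math.atan2(2 * apq, aqq - app)
                c, s = math.cos(theta), math.sin(theta)
                app_new = c * c * app - 2 * s * c * apq + s * s * aqq
                aqq_new = s * s * app + 2 * s * c * apq + c * c * aqq
                if app_new != app or aqq_new != aqq:
                    changed = True       # diagonal still moving: not converged
                A[p][p], A[q][q] = app_new, aqq_new
                A[p][q] = A[q][p] = 0.0
                for i in range(n):       # rotate the remaining rows/columns
                    if i != p and i != q:
                        aip, aiq = A[i][p], A[i][q]
                        A[i][p] = A[p][i] = c * aip - s * aiq
                        A[i][q] = A[q][i] = s * aip + c * aiq
                for i in range(n):       # accumulate V <- V*J
                    vip, viq = V[i][p], V[i][q]
                    V[i][p] = c * vip - s * viq
                    V[i][q] = s * vip + c * viq
    return [A[i][i] for i in range(n)], V

# check V^T A V == D on a random symmetric matrix
n = 5
A = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i, n):
        A[i][j] = A[j][i] = random.uniform(-1, 1)
A0 = [row[:] for row in A]               # keep a copy of the original matrix
e, V = jacobi_cyclic(A)
for i in range(n):
    for j in range(n):
        vav = sum(V[k][i] * A0[k][l] * V[l][j]
                  for k in range(n) for l in range(n))
        assert abs(vav - (e[i] if i == j else 0.0)) < 1e-8
print("V^T A V == D check passed")
```

The convergence test here compares the diagonal elements before and after each rotation, which corresponds to the "eigenvalues did not change" criterion above.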
Check that the number of operations for matrix diagonalization scales as O(n³) (by measuring the time it takes to diagonalize a random matrix of size n).
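One way to extract the scaling exponent from such measurements is a least-squares fit of log t versus log n: for t = c·n^k the slope of the log-log line is k. A minimal sketch (the (n, t) pairs below are made-up placeholders standing in for the contents of time.txt, not real measurements):

```python
import math

# hypothetical measurements: (matrix size n, CPU seconds), e.g. from time.txt
data = [(100, 0.08), (200, 0.65), (400, 5.3)]   # illustrative numbers only

# least-squares slope of log t versus log n gives the scaling exponent
xs = [math.log(n) for n, t in data]
ys = [math.log(t) for n, t in data]
xm, ym = sum(xs) / len(xs), sum(ys) / len(ys)
slope = (sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
         / sum((x - xm) ** 2 for x in xs))
print(f"estimated exponent: {slope:.2f}")        # close to 3 for O(n^3)
```

Equivalently, doubling n should increase the run-time by a factor of about 2^3 = 8 if the scaling is indeed O(n^3).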
(3 points) Jacobi diagonalization eigenvalue-by-eigenvalue
Eliminate the off-diagonal elements of the first row only (by running the sweeps not over the whole matrix but only over the first row, until the off-diagonal elements of the first row are all zeroed). Argue that the corresponding diagonal element is the lowest eigenvalue.
If needed, eliminate the off-diagonal elements of the second row. Argue that the already eliminated elements of the first row will not be affected and can therefore be omitted from the elimination loop. Argue that the corresponding diagonal element is the second lowest eigenvalue.
If needed, continue eliminating the subsequent rows (omitting operations on the already eliminated rows!) until the given number of lowest eigenvalues is calculated.
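This row-wise scheme can be sketched as follows (an illustrative pure-Python sketch; note that with the angle chosen as theta = atan2(2*Apq, Aqq - App)/2 the rotation replaces App by the lower of the two eigenvalues of the corresponding 2x2 submatrix, which is why the first diagonal element is driven down towards the lowest eigenvalue):

```python
import math

def jacobi_lowest(A, k):
    """Return the k lowest eigenvalues of symmetric A (list of lists,
    modified in place), eliminating row p only after rows 0..p-1 are done."""
    n = len(A)
    eigs = []
    for p in range(k):
        changed = True
        while changed:                   # sweep over row p until converged
            changed = False
            for q in range(p + 1, n):
                app, aqq, apq = A[p][p], A[q][q], A[p][q]
                theta = 0.5 * math.atan2(2 * apq, aqq - app)
                c, s = math.cos(theta), math.sin(theta)
                app_new = c * c * app - 2 * s * c * apq + s * s * aqq
                aqq_new = s * s * app + 2 * s * c * apq + c * c * aqq
                if app_new != app or aqq_new != aqq:
                    changed = True
                A[p][p], A[q][q] = app_new, aqq_new
                A[p][q] = A[q][p] = 0.0
                # rows above p are already zeroed in these columns and stay
                # zero under the rotation, so they are omitted from the loop
                for i in range(p + 1, n):
                    if i != q:
                        aip, aiq = A[i][p], A[i][q]
                        A[i][p] = A[p][i] = c * aip - s * aiq
                        A[i][q] = A[q][i] = s * aip + c * aiq
        eigs.append(A[p][p])             # converged: the next lowest eigenvalue
    return eigs
```

Zeroing one element of row p generally reintroduces non-zeros in the previously zeroed elements of the same row, hence the repeated sweeps over that row.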
(1 point) Classic Jacobi eigenvalue algorithm
In the classic Jacobi eigenvalue algorithm one searches for the largest off-diagonal element and eliminates it. Unfortunately, the search for the largest element is an O(n²) operation, which makes a direct implementation ineffective. Instead, in the beginning one creates an array with the indices of the largest element in each row and then, after each rotation, only updates the indices of the affected rows.
A possible algorithm might look like this:
create the array of indices of the largest elements in each row;
do:
    for each row:
        eliminate the largest element of the row by a Jacobi transformation;
        update the array of indices for the affected rows;
until converged.
Find out whether this gives any improvements in i) the number of rotations, ii) the run-time.
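The index bookkeeping can be sketched as follows (an illustrative pure-Python sketch; the convergence threshold and the returned rotation count, useful for the comparison asked above, are choices made here). A rotation of the (p, q) pair changes rows p and q entirely and changes columns p and q of every other row, so a cached index must be recomputed only when its row, or the element it points at, was overwritten:

```python
import math

def jacobi_classic(A):
    """Classic Jacobi for symmetric A (list of lists, modified in place):
    repeatedly zero the globally largest off-diagonal element.
    idx[p] caches the column of the largest |A[p][q]|, q > p, so the
    global search is O(n) instead of O(n^2).
    Returns (eigenvalues, number of rotations performed)."""
    n = len(A)
    def argmax_row(p):                   # column of the largest |A[p][q]|, q > p
        return max(range(p + 1, n), key=lambda q: abs(A[p][q]))
    idx = [argmax_row(p) for p in range(n - 1)]
    rotations = 0
    while True:
        p = max(range(n - 1), key=lambda r: abs(A[r][idx[r]]))
        q = idx[p]
        if abs(A[p][q]) < 1e-14:
            break                        # converged: off-diagonals negligible
        app, aqq, apq = A[p][p], A[q][q], A[p][q]
        theta = 0.5 * math.atan2(2 * apq, aqq - app)
        c, s = math.cos(theta), math.sin(theta)
        A[p][p] = c * c * app - 2 * s * c * apq + s * s * aqq
        A[q][q] = s * s * app + 2 * s * c * apq + c * c * aqq
        A[p][q] = A[q][p] = 0.0
        for i in range(n):
            if i != p and i != q:
                aip, aiq = A[i][p], A[i][q]
                A[i][p] = A[p][i] = c * aip - s * aiq
                A[i][q] = A[q][i] = s * aip + c * aiq
        rotations += 1
        for r in range(n - 1):           # update only the affected indices
            if r == p or r == q or idx[r] in (p, q):
                idx[r] = argmax_row(r)   # this row's cached maximum is stale
            else:                        # only entries in columns p, q changed
                for col in (p, q):
                    if col > r and abs(A[r][col]) > abs(A[r][idx[r]]):
                        idx[r] = col
    return [A[i][i] for i in range(n)], rotations
```

Counting the rotations here, and timing the routine as in the hints above, gives the two numbers to compare against the cyclic-sweep implementation.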