Course Materials and Schedule
Introduction & Linear algebra review:
- A practical example
- Vectors and matrices
- System of linear equations and matrix-vector forms
- Definition of vector norms
- ||x||_p for p = 1, 2, and infinity
- Cauchy-Schwarz inequality and applications
- Comparing norms: which is larger/smaller?
- Equivalence of norms in finite-dimensional spaces
- Subordinate/associated/natural matrix norms
- ||A||_p for p = 1, 2, and infinity
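As a concrete illustration of the topics above, here is a minimal pure-Python sketch of the vector p-norms and the subordinate matrix norms for p = 1 and infinity (||A||_1 is the maximum absolute column sum, ||A||_inf the maximum absolute row sum; ||A||_2 needs the largest singular value and is omitted). The function names are illustrative, not from the course notes.

```python
def vec_norm(x, p):
    # ||x||_p for p = 1, 2, or infinity
    if p == 1:
        return sum(abs(v) for v in x)
    if p == 2:
        return sum(v * v for v in x) ** 0.5
    if p == float("inf"):
        return max(abs(v) for v in x)
    raise ValueError("p must be 1, 2, or inf")

def mat_norm_1(A):
    # subordinate 1-norm: maximum absolute column sum
    return max(sum(abs(A[i][j]) for i in range(len(A)))
               for j in range(len(A[0])))

def mat_norm_inf(A):
    # subordinate infinity-norm: maximum absolute row sum
    return max(sum(abs(v) for v in row) for row in A)

x = [3.0, -4.0]
print(vec_norm(x, 1), vec_norm(x, 2), vec_norm(x, float("inf")))  # 7.0 5.0 4.0

A = [[1.0, -2.0], [3.0, 4.0]]
print(mat_norm_1(A), mat_norm_inf(A))  # 6.0 7.0
```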
Direct methods for solving linear system of equations
Backward substitution for a triangular system of equations
Gaussian elimination to reduce a general system of equations to a triangular system
Elementary row transform matrices, their properties, and their relation to Gaussian elimination
Augmented matrix and Gaussian elimination
Gaussian elimination and LU decomposition; forward and backward substitution
Finding the determinant of a matrix using Gaussian elimination
Operation counts: Gaussian elimination O(n^3/3), substitution O(n^2/2)
Partial row pivoting: Why and how?
How to find the inverse matrix of A? What is the computational cost?
How to evaluate y = A^{-1}b or A^{-k}B? What is the operation count?
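The LU-based solution process listed above can be sketched in a few lines of pure Python. This is a hedged illustration (no pivoting, so it assumes nonzero pivots throughout); the function names are mine, not the course's.

```python
def lu_decompose(A):
    # Gaussian elimination recorded as A = L U (no pivoting)
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    U = [row[:] for row in A]
    for k in range(n):
        L[k][k] = 1.0
        for i in range(k + 1, n):
            m = U[i][k] / U[k][k]      # multiplier; assumes U[k][k] != 0
            L[i][k] = m
            for j in range(k, n):
                U[i][j] -= m * U[k][j]
    return L, U

def forward_sub(L, b):
    # solve L y = b, L unit lower triangular
    n = len(b)
    y = [0.0] * n
    for i in range(n):
        y[i] = b[i] - sum(L[i][j] * y[j] for j in range(i))
    return y

def backward_sub(U, y):
    # solve U x = y, U upper triangular
    n = len(y)
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (y[i] - sum(U[i][j] * x[j] for j in range(i + 1, n))) / U[i][i]
    return x

A = [[2.0, 1.0], [4.0, 3.0]]
b = [3.0, 7.0]
L, U = lu_decompose(A)
x = backward_sub(U, forward_sub(L, b))
print(x)  # [1.0, 1.0]
```

Once L and U are known, each additional right-hand side costs only the O(n^2) substitutions, which is why y = A^{-1}b should be computed this way rather than by forming A^{-1}.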
Error analysis
Condition number of A
Growth factor and upper bounds
Banach's Lemma
Perturbation analysis of (A + E) x = b + e
Residual vector and relation to the relative error
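The relation between the residual and the relative error can be checked numerically. The sketch below (pure Python, illustrative names, infinity norm throughout) verifies the standard bound ||x - xhat|| / ||x|| <= cond(A) ||r|| / ||b|| with r = b - A xhat on a small 2x2 example.

```python
def norm_inf(v):
    return max(abs(t) for t in v)

def mat_norm_inf(A):
    return max(sum(abs(t) for t in row) for row in A)

def inv_2x2(A):
    # explicit inverse, fine for a 2x2 demonstration
    (a, b), (c, d) = A
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

A = [[4.0, 1.0], [1.0, 3.0]]
x_true = [1.0, 2.0]
b = matvec(A, x_true)                  # exact right-hand side
xhat = [1.01, 1.98]                    # a perturbed "computed" solution
Ax = matvec(A, xhat)
r = [b[i] - Ax[i] for i in range(2)]   # residual vector

cond = mat_norm_inf(A) * mat_norm_inf(inv_2x2(A))   # condition number of A
rel_err = norm_inf([x_true[i] - xhat[i] for i in range(2)]) / norm_inf(x_true)
bound = cond * norm_inf(r) / norm_inf(b)
print(rel_err <= bound)  # True: a small residual bounds the relative error
```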
Gaussian elimination for special matrices
Symmetric positive/negative definite matrices: how to tell; Cholesky and LDL^T decompositions; storage and operation counts
Banded matrices: tri-diagonal, penta-diagonal
Strictly column diagonally dominant matrices
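A short sketch of the Cholesky decomposition A = G G^T mentioned above (pure Python, illustrative code, not the course's implementation). A practical way to "tell" whether a symmetric matrix is positive definite is that this factorization succeeds exactly when it is; otherwise a non-positive pivot appears.

```python
import math

def cholesky(A):
    # A = G G^T for symmetric positive definite A; G lower triangular
    n = len(A)
    G = [[0.0] * n for _ in range(n)]
    for j in range(n):
        d = A[j][j] - sum(G[j][k] ** 2 for k in range(j))
        if d <= 0.0:
            # a non-positive pivot means A is not positive definite
            raise ValueError("matrix is not positive definite")
        G[j][j] = math.sqrt(d)
        for i in range(j + 1, n):
            G[i][j] = (A[i][j] - sum(G[i][k] * G[j][k] for k in range(j))) / G[j][j]
    return G

A = [[4.0, 2.0], [2.0, 3.0]]
G = cholesky(A)
print(G)  # [[2.0, 0.0], [1.0, 1.414...]], since G G^T reproduces A
```

Only the lower triangle of A is touched, which is the source of the halved storage and the roughly n^3/6 operation count relative to a general LU.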
Basic iterative method for solving linear system of equations
What is an iterative method? Why and when do we want to use iterative methods?
What is a basic iterative method?
Jacobi method (solve for the diagonals) and the iteration matrix D^{-1}(L+U)
Gauss-Seidel method (solve for the diagonal and use whatever is available), the iteration matrix (D+L)^{-1}U
SOR(w) iterative method: x^{(k+1)} = (1-w) x^{(k)} + w x_GS^{(k+1)}
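The Jacobi and Gauss-Seidel sweeps described above fit in a few lines each. This pure-Python sketch (illustrative names) runs both on a small strictly diagonally dominant system, where convergence is guaranteed.

```python
def jacobi_step(A, b, x):
    # solve each equation for its diagonal unknown, using only old values
    n = len(b)
    return [(b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
            for i in range(n)]

def gauss_seidel_step(A, b, x):
    # same, but use whatever updated values are already available
    n = len(b)
    x = x[:]
    for i in range(n):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(n) if j != i)) / A[i][i]
    return x

A = [[4.0, 1.0], [2.0, 5.0]]   # strictly row diagonally dominant
b = [6.0, 12.0]                # exact solution x = [1, 2]

x_j = [0.0, 0.0]
for _ in range(60):
    x_j = jacobi_step(A, b, x_j)

x_g = [0.0, 0.0]
for _ in range(30):
    x_g = gauss_seidel_step(A, b, x_g)

print(x_j, x_g)  # both approach [1.0, 2.0]
```

On this example Gauss-Seidel needs roughly half as many sweeps as Jacobi, consistent with its smaller spectral radius.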
Convergence of iterative method
Sufficient conditions on the iteration matrix R:
There exists an associated matrix norm such that ||R|| < 1
Sufficient conditions for the original matrix A
Strictly row diagonally dominant
Weakly diagonally dominant (strictly in at least one row) and irreducible
Symmetric positive definite matrices
Sufficient AND necessary condition for convergence: the spectral radius ρ(R) < 1
- Convergence speed and the minimum number of iterations needed
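The spectral-radius criterion ρ(R) < 1 can be checked by hand for a 2x2 iteration matrix, since its eigenvalues come from the characteristic polynomial λ^2 - tr(R) λ + det(R) = 0. A hedged sketch (illustrative only; real codes would call an eigenvalue routine):

```python
import cmath

def spectral_radius_2x2(R):
    # eigenvalues of a 2x2 matrix from trace and determinant
    (a, b), (c, d) = R
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4 * det)
    lam1 = (tr + disc) / 2
    lam2 = (tr - disc) / 2
    return max(abs(lam1), abs(lam2))

# Jacobi iteration matrix D^{-1}(L+U) for A = [[4, 1], [2, 5]]
R = [[0.0, -0.25], [-0.4, 0.0]]
rho = spectral_radius_2x2(R)
print(rho, rho < 1.0)  # about 0.316, True -> Jacobi converges for this A
```

The error shrinks roughly by a factor ρ(R) per sweep, which is what determines the minimum number of iterations needed for a given tolerance.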
- Krylov Methods in General
Algebraic eigenvalue problems
- Review of eigenvalues and eigenvectors, Jordan canonical form
- The power method for the dominant eigenvalue and its variations: convergence order, conditions for convergence, proof
- Gerschgorin circle theorem for eigenvalues
- QR algorithm and shifted QR algorithm
- Generalized eigenvalue problems
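The power method listed above is short enough to sketch in full: repeatedly apply A and normalize, and the estimate converges linearly at rate |λ₂/λ₁| when the dominant eigenvalue is simple. Pure Python, illustrative names.

```python
def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def power_method(A, x, iters=100):
    # iterate x <- A x / ||A x||, estimating the dominant eigenvalue
    lam = 0.0
    for _ in range(iters):
        y = matvec(A, x)
        lam = max(y, key=abs)          # eigenvalue estimate: largest component
        x = [v / lam for v in y]       # normalize to avoid overflow
    return lam, x

A = [[2.0, 1.0], [1.0, 2.0]]           # eigenvalues 3 and 1
lam, v = power_method(A, [1.0, 0.0])
print(lam)  # approximately 3.0, with eigenvector near [1, 1]
```

Variations such as inverse iteration and shifting change which eigenvalue dominates but keep the same loop structure.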
Least squares and SVD solutions
- Direct formula for small problems
- Least squares via QR decomposition
- SVD and the least squares solution
- Pseudo-inverse and SVD
- Nonlinear Least Squares Problems
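The "direct formula for small problems" above is the normal equations A^T A x = A^T b, sketched here in pure Python for a tiny line-fitting problem (illustrative names; for ill-conditioned or rank-deficient A the QR or SVD routes in the list are preferred).

```python
def solve_2x2(M, r):
    # Cramer's rule, adequate for a 2x2 normal-equations system
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return [(M[1][1] * r[0] - M[0][1] * r[1]) / det,
            (M[0][0] * r[1] - M[1][0] * r[0]) / det]

# fit y = c0 + c1*t through the points (1,1), (2,2), (3,2)
A = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0]]   # overdetermined: 3 rows, 2 columns
b = [1.0, 2.0, 2.0]

AtA = [[sum(A[k][i] * A[k][j] for k in range(3)) for j in range(2)]
       for i in range(2)]                   # A^T A
Atb = [sum(A[k][i] * b[k] for k in range(3)) for i in range(2)]   # A^T b

x = solve_2x2(AtA, Atb)
print(x)  # [0.666..., 0.5]: best-fit line y = 2/3 + t/2
```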