Linear Algebra Review
CS479/679 Pattern Recognition
Dr. George Bebis
n-dimensional Vector
• An n-dimensional vector v is denoted as follows:
  a column vector with components x1, x2, . . . , xn
• The transpose vT is denoted as follows:
  the row vector vT = (x1, x2, . . . , xn)
Inner (or dot) product
• Given vT = (x1, x2, . . . , xn) and wT = (y1, y2, . . . , yn),
their dot product is defined as follows:
  v · w = vTw = x1y1 + x2y2 + . . . + xnyn   (a scalar)
or, equivalently, v · w = Σi xiyi
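For instance, a minimal numpy sketch (the vectors are arbitrary examples):

import numpy as np

v = np.array([1.0, 2.0, 3.0])   # arbitrary example vectors
w = np.array([4.0, 5.0, 6.0])
# dot product: x1*y1 + x2*y2 + ... + xn*yn (a scalar)
print(np.dot(v, w))             # 32.0
print(v @ w)                    # equivalent matrix-product form vT w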
Orthogonal / Orthonormal vectors
• A set of vectors x1, x2, . . . , xn is orthogonal if
  xiTxj = 0 for every pair i ≠ j
• A set of vectors x1, x2, . . . , xn is orthonormal if
  xiTxj = 0 for i ≠ j and xiTxi = 1 (orthogonal vectors of unit length)
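A small numpy sketch of the two conditions (the vectors are arbitrary examples):

import numpy as np

x1 = np.array([1.0, 0.0])
x2 = np.array([0.0, 1.0])
# orthogonal: pairwise dot products are zero
print(np.dot(x1, x2) == 0)                          # True
# orthonormal: additionally, each vector has unit length
print(np.dot(x1, x1) == 1, np.dot(x2, x2) == 1)     # True True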
Linear combinations
• A vector v is a linear combination of the vectors
v1, ..., vk if:
  v = c1v1 + c2v2 + . . . + ckvk
where c1, ..., ck are constants.
• Example: every vector in R3 can be expressed as a
linear combination of the unit vectors i = (1, 0, 0),
j = (0, 1, 0), and k = (0, 0, 1)
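A numpy sketch of this example (the coefficients c1, c2, c3 are arbitrary):

import numpy as np

i = np.array([1.0, 0.0, 0.0])
j = np.array([0.0, 1.0, 0.0])
k = np.array([0.0, 0.0, 1.0])
# v = c1*i + c2*j + c3*k is a linear combination of the unit vectors
c1, c2, c3 = 2.0, -1.0, 5.0
v = c1 * i + c2 * j + c3 * k
print(v)    # [ 2. -1.  5.]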
Space spanning
• A set of vectors S = (v1, v2, . . . , vk) spans some
space W if every vector w in W can be written
as a linear combination of the vectors in S:
  w = c1v1 + c2v2 + . . . + ckvk
– The unit vectors i, j, and k span R3
Linear dependence
• A set of vectors v1, ..., vk is linearly
dependent if at least one of them is a linear
combination of the others:
  vj = c1v1 + . . . + cj-1vj-1 + cj+1vj+1 + . . . + ckvk
  (i.e., vj does not appear on the right side)
Linear independence
• A set of vectors v1, ..., vk is linearly independent
if no vector can be represented as a linear
combination of the remaining vectors, i.e.:
  c1v1 + c2v2 + . . . + ckvk = 0 only if c1 = c2 = . . . = ck = 0
• Example: for two linearly independent vectors v1 and v2,
  c1v1 + c2v2 = 0 has only the solution c1 = c2 = 0
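Numerically, independence can be checked via the rank of the matrix whose columns are the vectors; a numpy sketch with arbitrary example vectors:

import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = v1 + 2 * v2                      # deliberately a combination of v1, v2
A = np.column_stack([v1, v2, v3])
# full column rank (3) would mean the set is linearly independent
print(np.linalg.matrix_rank(A))       # 2 -> the set is linearly dependent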
Vector basis
• A set of vectors (v1, ..., vk) forms a basis in
some vector space W if:
(1) (v1, ..., vk) are linearly independent
(2) (v1, ..., vk) span W
• Standard bases:
  R2: e1 = (1, 0), e2 = (0, 1)
  R3: e1 = (1, 0, 0), e2 = (0, 1, 0), e3 = (0, 0, 1)
  Rn: e1 = (1, 0, . . . , 0), e2 = (0, 1, . . . , 0), . . . , en = (0, 0, . . . , 1)
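Expressing a vector in a given basis amounts to solving for the coefficients c1, ..., ck; a numpy sketch (the basis vectors and v below are arbitrary examples):

import numpy as np

# columns of B form a (non-standard) basis of R3
B = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0]])
v = np.array([2.0, 3.0, 4.0])
c = np.linalg.solve(B, v)      # coefficients of v in this basis
print(c)
print(B @ c)                   # reconstructs v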
Matrix Operations
• Matrix addition/subtraction
– Add/Subtract corresponding elements.
– Matrices must be of same size.
• Matrix multiplication
– The product of an m x n matrix A and a q x p matrix B
  is an m x p matrix C = AB.
– Condition: n = q (the columns of A must match the rows of B)
– Each entry is cij = Σk aikbkj (row i of A dotted with column j of B)
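A numpy sketch of the size rule (the matrices are arbitrary examples):

import numpy as np

A = np.ones((2, 3))        # m x n = 2 x 3
B = np.ones((3, 4))        # q x p = 3 x 4, and n = q = 3
C = A @ B                  # result is m x p = 2 x 4
print(C.shape)             # (2, 4)
# A @ A would raise an error: the inner dimensions (3 and 2) differ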
Identity Matrix
• The identity matrix I has 1s on the main diagonal and
0s elsewhere; AI = IA = A for any A of matching size.
Matrix Transpose
• The transpose AT is obtained by interchanging the rows
and columns of A: (AT)ij = Aji
Symmetric Matrices
• A matrix A is symmetric if A = AT
• Example: a 2 x 2 matrix with a12 = a21 is symmetric.
Determinants
• 2 x 2: det(A) = a11a22 - a12a21
• 3 x 3: det(A) = a11(a22a33 - a23a32) - a21(a12a33 - a13a32)
  + a31(a12a23 - a13a22)   (expanded along 1st column)
• n x n: det(A) = Σi (-1)^(i+k) aik Mik, where Mik is the determinant
  of the submatrix obtained by deleting row i and column k
  (expanded along kth column)
• Properties: det(AB) = det(A)det(B), det(AT) = det(A),
  det(A-1) = 1/det(A), and det(A) = 0 iff A is singular
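A numpy sketch of the 2 x 2 case (the matrix is an arbitrary example):

import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
# 2 x 2 formula: a11*a22 - a12*a21
print(A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0])   # -2.0
print(np.linalg.det(A))                        # -2.0 (up to rounding)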
Matrix Inverse
• The inverse of a matrix A, denoted as A-1, has the
property:
AA-1=A-1A=I
• A-1 exists only if det(A) ≠ 0
• Terminology
– Singular matrix: A-1 does not exist
– Ill-conditioned matrix: A is “close” to being singular
Matrix Inverse (cont’d)
• Properties of the inverse:
  (A-1)-1 = A, (AB)-1 = B-1A-1, (AT)-1 = (A-1)T
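A numpy sketch (the matrix is an arbitrary nonsingular example):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
print(np.linalg.det(A))          # non-zero, so A is invertible
A_inv = np.linalg.inv(A)
print(A @ A_inv)                 # identity (up to rounding)
# a singular matrix such as [[1, 2], [2, 4]] has det 0 and no inverse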
Matrix trace
• The trace of A is the sum of its diagonal elements:
  tr(A) = a11 + a22 + . . . + ann
• Properties: tr(A + B) = tr(A) + tr(B), tr(AB) = tr(BA),
  tr(AT) = tr(A)
Rank of matrix
• Equal to the dimension of the largest square submatrix of A that has a non-zero determinant.
Example: a matrix whose largest square submatrix with a
non-zero determinant is 3 x 3 has rank 3
Rank of matrix (cont’d)
• Alternative definition: the maximum number of
linearly independent columns (or rows) of A.
Example: in a 4 x 4 matrix where one column is a linear
combination of the others, at most 3 columns are linearly
independent, i.e., the rank is not 4!
Rank of matrix (cont’d)
• If A is n x n, then rank(A) = n iff det(A) ≠ 0, i.e., iff A
is nonsingular (invertible).
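A numpy sketch (the matrix is an arbitrary example with one dependent row):

import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],    # 2 x the first row -> dependent
              [0.0, 1.0, 1.0]])
print(np.linalg.matrix_rank(A))   # 2, not 3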
Eigenvalues and Eigenvectors
• The vector v is an eigenvector of matrix A and
λ is an eigenvalue of A if:
  Av = λv   (assume non-zero v)
Geometric interpretation: the linear transformation
implied by A cannot change the direction of an
eigenvector v, only its magnitude (it scales v by λ).
Computing λ and v
• To find the eigenvalues λ of a matrix A, find
the roots of the characteristic polynomial:
  det(A - λI) = 0
• For each eigenvalue λ, the corresponding eigenvectors v
are the non-zero solutions of (A - λI)v = 0
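As an example, a numpy sketch of both routes, the characteristic polynomial and np.linalg.eig (the matrix is an arbitrary example):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
# roots of the characteristic polynomial det(A - lambda*I) = 0
print(np.roots(np.poly(A)))        # eigenvalues, e.g. [3. 1.] (order may vary)
# eigenvalues and eigenvectors computed directly
vals, vecs = np.linalg.eig(A)
print(vals)                        # eigenvalues
print(vecs)                        # columns are the corresponding eigenvectors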
Properties of λ and v
• Eigenvalues and eigenvectors are only
defined for square matrices.
• Eigenvectors are not unique (e.g., if v is an
eigenvector, so is kv).
• Suppose λ1, λ2, ..., λn are the eigenvalues of
A; then:
  det(A) = λ1λ2 . . . λn   (product of the eigenvalues)
  tr(A) = λ1 + λ2 + . . . + λn   (sum of the eigenvalues)
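A numpy sketch verifying these two relations (the matrix is an arbitrary example):

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals = np.linalg.eigvals(A)
print(np.prod(vals), np.linalg.det(A))   # both 10.0 (up to rounding)
print(np.sum(vals), np.trace(A))         # both 7.0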
Matrix diagonalization
• Given an n x n matrix A, find P such that:
P-1AP=Λ where Λ is diagonal
• Solution: Set P = [v1 v2 . . . vn], where v1, v2, . . . , vn
are the eigenvectors of A; then Λ = diag(λ1, λ2, . . . , λn) is
the diagonal matrix of the corresponding eigenvalues.
Matrix diagonalization (cont’d)
Example:
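A minimal numpy sketch of the procedure, using an arbitrary diagonalizable matrix:

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, vecs = np.linalg.eig(A)
P = vecs                               # columns are the eigenvectors of A
Lam = np.linalg.inv(P) @ A @ P         # P^-1 A P
print(np.round(Lam, 6))                # diagonal matrix of the eigenvalues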
Matrix diagonalization (cont’d)
• If A is diagonalizable, then the corresponding
eigenvectors v1, v2, . . . , vn form a basis in Rn
Are all n x n matrices
diagonalizable (P-1AP = Λ)?
• An n x n matrix A is diagonalizable iff it has n
linearly independent eigenvectors.
– i.e., if P-1 exists, that is, rank(P)=n
• Theorem: If the eigenvalues of A are all
distinct, their corresponding eigenvectors are
linearly independent (i.e., A is diagonalizable).
Are all n x n matrices
diagonalizable (P-1AP = Λ)? (cont’d)
• Example: λ1=λ2=1 and λ3=2; fewer than three linearly
independent eigenvectors exist, so the matrix is non-diagonalizable
• Example: λ1=λ2=0 and λ3=-2; despite the repeated eigenvalue,
three linearly independent eigenvectors exist, so the matrix
is diagonalizable
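A numpy sketch of the test, counting independent eigenvectors via the rank of P (the matrix is an arbitrary non-diagonalizable example):

import numpy as np

A = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # repeated eigenvalue 1, defective matrix
vals, vecs = np.linalg.eig(A)
P = vecs                         # columns are the computed eigenvectors
# diagonalizable iff P has n linearly independent columns, i.e. rank(P) = n
print(np.linalg.matrix_rank(P, tol=1e-9) == A.shape[0])   # False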
Matrix decomposition
• If A is diagonalizable, then A can be
decomposed as follows:
  A = PΛP-1
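A numpy sketch reconstructing A from its eigen-decomposition (the matrix is an arbitrary diagonalizable example):

import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])
vals, P = np.linalg.eig(A)
Lam = np.diag(vals)
print(P @ Lam @ np.linalg.inv(P))   # equals A (up to rounding)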
Special case: symmetric matrices
• The eigenvalues of a symmetric matrix are real
and its eigenvectors are orthogonal.
• In this case the eigenvectors can be chosen orthonormal, so P is
an orthogonal matrix: P-1 = PT
• A = PDPT = λ1v1v1T + λ2v2v2T + . . . + λnvnvnT
(where D is the diagonal matrix of eigenvalues)
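A numpy sketch using np.linalg.eigh, which is intended for symmetric matrices (the matrix is an arbitrary symmetric example):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])            # symmetric
vals, P = np.linalg.eigh(A)           # real eigenvalues, orthonormal eigenvectors
print(vals)                           # real eigenvalues: [1. 3.]
print(np.round(P.T @ P, 6))           # identity, so P^-1 = P^T
print(P @ np.diag(vals) @ P.T)        # reconstructs A (up to rounding)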