Final Exam Definitions
Ma322-003 Fall 2013
Definitions of terms in Question 1 of the Exam Review
1. Complete each of the following to provide proper definitions or complete, general descriptions. Operational definitions (i.e., descriptions of how the object is calculated) will receive at most half credit.
Note that there are many equivalent ways to express the definitions of these terms. Mathematically equivalent statements of any of these definitions are perfectly acceptable.
(a) Precisely, ℝ² and ℝ³ are defined to be
ℝ² = { (x, y)ᵀ | x, y ∈ ℝ } and ℝ³ = { (x, y, z)ᵀ | x, y, z ∈ ℝ },
the sets of column vectors with two and three real entries, respectively.
(b) If S = {v1, v2, v3, v4} is a set of vectors in ℝⁿ then
i. S is linearly dependent if there exist numbers α1, α2, α3, α4 which are not all zero such that α1 v1 + α2 v2 + α3 v3 + α4 v4 = O, the zero vector.
ii. S is linearly independent if whenever α1 v1 + α2 v2 + α3 v3 + α4 v4 = O, the zero vector, it must be true that α1 = α2 = α3 = α4 = 0.
iii. the linear span of S is the set of all linear combinations of elements of S. That is, the linear span of S is {α1 v1 + α2 v2 + α3 v3 + α4 v4 | αi ∈ ℝ}.
iv. S is a spanning set for the vector space V if V is the linear span of S. Equivalently, every element of V is a linear combination of the elements of S.
(c) If A is a matrix then the rank of A is the dimension of col(A), the linear span of the columns of A.
(d) V ⊂ ℝⁿ is a subspace of ℝⁿ if whenever v1, v2 ∈ V and α ∈ ℝ:
• v1 + v2 ∈ V
• αv1 ∈ V
Equivalently, V is a subspace if it is closed under addition and scalar multiplication.
(e) S = {s1, · · · , sm} is a spanning set for the vector space V ⊂ ℝⁿ if every element of V is a linear combination of the elements of S. Equivalently, for any w ∈ V there are scalars α1, · · · , αm such that w = α1 s1 + · · · + αm sm.
(f) B = {b1, · · · , bm} is a basis for the vector space V ⊂ ℝⁿ if B is a linearly independent spanning set for V.
(g) If V is a vector space then the dimension of V is the number of elements in any basis for
V.
(h) If A is an m by n matrix then the column space of A is the linear span of the columns of A. Equivalently, the column space of A is {AX | X ∈ ℝⁿ}.
(i) If A is an m by n matrix then the null space of A is {X ∈ ℝⁿ | AX = O} (see the first sketch after these definitions).
(j) If A is an m by n matrix and B ∈ ℝᵐ then the linear system AX = B is consistent if it has at least one solution. Equivalently, if there is a vector X such that AX = B.
(k) If A is an n by n matrix then A is invertible if there is an n by n matrix B such that
AB = I, the identity matrix.
A and B have to be square in this definition. It is true that if A and B are square matrices then AB = identity implies that AB = BA = identity. If A is not square then AB can be the identity matrix without BA being the identity. For instance, if

    A = [ 1  0  0 ]        B = [ 1  0 ]
        [ 0  1  0 ]            [ 0  1 ]
                               [ 0  0 ]

then AB is the 2 × 2 identity matrix, but BA ≠ AB.
(l) If A is an n by n matrix then α is an eigenvalue of A if Av = αv for some non-zero vector v ∈ ℝⁿ.
(m) If A is an n by n matrix and v ∈ ℝⁿ then v is an eigenvector of A if v is not a zero vector and Av = λv for some λ ∈ ℝ.
(n) If A is an n by n matrix and α is an eigenvalue of A then the eigenspace of A for the eigenvalue α is {X ∈ ℝⁿ | AX = αX}.
(o) If A is an n by n matrix then A is diagonalizable if there is an invertible matrix Q such that Q⁻¹AQ is a diagonal matrix.
(p) If A is an n by n matrix then the characteristic polynomial of A is det(A − xI) where x is a variable and I is the n by n identity matrix.
(q) If V ⊂ ℝⁿ is a vector space then the orthogonal complement of V is V⊥ = {x ∈ ℝⁿ | < x, v > = 0 for every v ∈ V}. Here < x, y > is the dot or scalar product. V⊥ can also be defined to be the set of all x ∈ ℝⁿ such that x is orthogonal to each v ∈ V.
(r) If A is an n by n matrix then the Cayley-Hamilton Theorem says that if f(x) is the characteristic polynomial of A then f(A) = the n by n zero matrix (see the second sketch after these definitions).
(s) If v ∈ ℝⁿ is a non-zero vector then the unit vector having the same direction as v is (1/||v||) v, where ||v||, the length of v, is √< v, v >.
(t) If X = (x1, x2, . . . , xn)ᵀ and Y = (y1, y2, . . . , yn)ᵀ are vectors in ℝⁿ then < X, Y > = x1 y1 + x2 y2 + · · · + xn yn and ||X||, the length of X, is √< X, X > = √(x1² + x2² + · · · + xn²) (see the third sketch after these definitions).
(u) If T is a mapping from ℝⁿ to ℝᵐ then T is a linear transformation if:
a. for every u, v ∈ ℝⁿ, T(u + v) = T(u) + T(v)
b. for every α ∈ ℝ and v ∈ ℝⁿ, T(αv) = αT(v).
(v) If T is a linear transformation from ℝⁿ to ℝᵐ then
i. T is onto (surjective) if for every y ∈ ℝᵐ there is x ∈ ℝⁿ such that T(x) = y.
ii. T is one to one (injective) if whenever T(x1) = T(x2) then x1 = x2.
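The three short sketches below are added numerical illustrations, not part of the original review; they assume NumPy is available, and every matrix and vector in them is a made-up example. The first sketch illustrates parts (b), (c), and (i): a matrix whose columns are linearly dependent, its rank, and a null-space vector.

    import numpy as np

    # Columns of A are v1, v2, v3 in R^3 with v3 = v1 + v2, so
    # 1*v1 + 1*v2 + (-1)*v3 = O with coefficients not all zero:
    # the set {v1, v2, v3} is linearly dependent.
    A = np.array([[1.0, 0.0, 1.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 1.0, 2.0]])

    # rank(A) = dimension of col(A), the linear span of the columns of A.
    print(np.linalg.matrix_rank(A))      # 2, so the columns do not span R^3

    # A vector in the null space {X in R^3 | AX = O}:
    x = np.array([1.0, 1.0, -1.0])
    print(np.allclose(A @ x, 0))         # True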
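The second sketch checks the eigenvalue-related definitions in parts (l) through (r) on one 2 by 2 example: Av = αv for an eigenpair, Q⁻¹AQ is diagonal, and the characteristic polynomial evaluated at A gives the zero matrix (Cayley-Hamilton).

    import numpy as np

    # An arbitrary symmetric 2 x 2 example; its eigenvalues are 3 and 1.
    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # Eigenvalues alpha and eigenvectors v satisfy Av = alpha*v with v nonzero.
    vals, vecs = np.linalg.eig(A)
    v = vecs[:, 0]
    print(np.allclose(A @ v, vals[0] * v))                        # True

    # A is diagonalizable: Q^{-1} A Q is diagonal when the columns of Q
    # are eigenvectors of A.
    Q = vecs
    print(np.allclose(np.linalg.inv(Q) @ A @ Q, np.diag(vals)))   # True

    # Characteristic polynomial: det(A - xI) = x^2 - 4x + 3 for this A.
    # Cayley-Hamilton: A^2 - 4A + 3I is the zero matrix.
    print(np.allclose(A @ A - 4 * A + 3 * np.eye(2), 0))          # True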
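The third sketch illustrates the inner product, length, and unit-vector definitions in parts (s) and (t).

    import numpy as np

    X = np.array([3.0, 4.0])
    Y = np.array([1.0, 2.0])

    # <X, Y> = x1*y1 + x2*y2 and ||X|| = sqrt(<X, X>).
    print(np.dot(X, Y))              # 11.0
    print(np.sqrt(np.dot(X, X)))     # 5.0, the length of X

    # Unit vector having the same direction as X: (1/||X||) X.
    u = X / np.linalg.norm(X)
    print(u, np.linalg.norm(u))      # [0.6 0.8] 1.0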
2. If T is a linear transformation from ℝⁿ to ℝᵐ and A is a matrix such that T(X) = AX then A must have m rows and n columns.
In terms of the rank of the matrix A, m, and n:
i. T is onto (surjective) if rank(A) = m.
ii. T is one to one (injective) if rank(A) = n.
(These criteria are checked on a small example in the sketch below.)
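The sketch below is an added illustration of these rank criteria, not part of the original review; it assumes NumPy, and the 2 by 3 matrix is a made-up example.

    import numpy as np

    # A is 2 x 3, so T(X) = AX maps R^3 to R^2 (m = 2, n = 3).
    A = np.array([[1.0, 0.0, 2.0],
                  [0.0, 1.0, 3.0]])
    m, n = A.shape
    r = np.linalg.matrix_rank(A)
    print(r == m)    # True:  rank(A) = m, so T is onto
    print(r == n)    # False: rank(A) < n, so T is not one to one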
3. If V ⊂ ℝⁿ is a subspace of ℝⁿ and y ∈ ℝⁿ then the orthogonal projection of y into V is the unique vector yV ∈ V such that y − yV is perpendicular to V (see the sketch below).
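As an added illustration (assuming NumPy, with made-up data): when V is given as the column space of a matrix A whose columns are linearly independent, the projection can be computed as yV = A(AᵀA)⁻¹Aᵀy, and the sketch below verifies that y − yV is perpendicular to V.

    import numpy as np

    # V = column space of A (columns linearly independent); y is in R^3.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [0.0, 0.0]])
    y = np.array([1.0, 2.0, 5.0])

    # y_V = A (A^T A)^{-1} A^T y lies in V, and y - y_V is perpendicular
    # to V (orthogonal to every column of A).
    yV = A @ np.linalg.solve(A.T @ A, A.T @ y)
    print(yV)                              # [1. 2. 0.]
    print(np.allclose(A.T @ (y - yV), 0))  # True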
4. The vectors {v1, v2, · · · , vs} are mutually orthogonal if < vi, vj > = 0 whenever i ≠ j (checked numerically in the sketch below).
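A final added sketch checks mutual orthogonality for a small made-up set of vectors in ℝ³, again with NumPy.

    import numpy as np

    vs = [np.array([1.0, 1.0, 0.0]),
          np.array([1.0, -1.0, 0.0]),
          np.array([0.0, 0.0, 2.0])]

    # <v_i, v_j> = 0 whenever i != j, so the set is mutually orthogonal.
    print(all(np.isclose(np.dot(vs[i], vs[j]), 0)
              for i in range(3) for j in range(3) if i != j))   # True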