20F Discussion Section 6
Josh Tobin: http://www.math.ucsd.edu/~rjtobin/
Section 4.2: Null Spaces, Column Spaces and Linear Transformations
• The null space of a matrix A, Nul A, is the set of all vectors x such that Ax = 0. It is a subspace (this is Theorem 2). To find an explicit description of the null space, just solve the system Ax = 0 and write the solution in parametric vector form.
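As a concrete illustration (not part of the original notes, and the matrix is invented for the example), this procedure can be carried out in sympy, whose nullspace() method solves Ax = 0 and returns one basis vector per free variable:

from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])            # rank 1, so we expect two free variables

# nullspace() solves Ax = 0; each basis vector corresponds to one
# free variable in the parametric vector form of the solution.
for v in A.nullspace():
    print(v.T)                     # Matrix([[-2, 1, 0]]) and Matrix([[-3, 0, 1]])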
• The column space of a matrix A, Col A, is the set of all linear combinations of the columns of A (or we could say, the span of the columns of A). It is also a subspace (Theorem 3). If we want to check whether a vector b is in the column space, we just have to check whether the system Ax = b is consistent.
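Here is a minimal sympy sketch of this membership test, with an invented A and b: b lies in Col A exactly when appending b to A does not increase the rank, which is equivalent to Ax = b being consistent:

from sympy import Matrix

A = Matrix([[1, 0],
            [0, 1],
            [1, 1]])
b = Matrix([2, 3, 5])              # 2*(column 1) + 3*(column 2) by construction

# b is in Col A iff the augmented matrix [A | b] has the same rank as A.
print(A.row_join(b).rank() == A.rank())   # True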
• Note that the column space of A is all of R^m if Ax = b is consistent for every b, or equivalently if there is a pivot in every row.
• Note: The column space of an m × n matrix A consists of vectors with m entries. The null space consists of vectors with n entries. So they are very different things (though there are various connections we will explore).
Section 4.3: Linearly Independent Sets; Bases
• Linear independence in an abstract vector space is defined in the same way as for vectors in R^n: the set {v1, v2, ..., vp} is linearly independent if the vector equation
c1 v1 + c2 v2 + ... + cp vp = 0
has only the trivial solution c1 = c2 = ... = cp = 0. Equivalently, a set is linearly dependent if some vector in the set is a linear combination of the others (again, the same as in R^n).
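For instance (a sympy sketch with invented vectors), independence can be tested mechanically: stack the vectors as columns of a matrix, so that the solutions of c1 v1 + c2 v2 + c3 v3 = 0 are exactly the null space of that matrix:

from sympy import Matrix

v1, v2, v3 = Matrix([1, 0, 0]), Matrix([1, 1, 0]), Matrix([2, 1, 0])

M = v1.row_join(v2).row_join(v3)
# An empty null space would mean only the trivial solution, i.e. independence.
print(M.nullspace())               # nonempty here: v3 = v1 + v2, so the set is dependent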
• Definition: Let H be a subspace of a vector space V. Then the set of vectors B = {b1, b2, ..., bp} is a basis of H if
(i) B is a linearly independent set, and
(ii) the span of B is H.
• The Spanning Set Theorem says two things. Firstly, if we have a set of vectors, and one of the vectors is a linear combination of the others, we can remove it and the span of the set does not change (the importance of this is that it guarantees we can always remove 'redundant' vectors from spanning sets); see the sketch below. Secondly, if a set spans a nonzero subspace H, then some subset of that set is a basis for H.
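A small sympy sketch of the first part (the vectors are made up for the example): dropping a vector that is a combination of the others leaves the span, and hence the rank, unchanged:

from sympy import Matrix

v1, v2 = Matrix([1, 0]), Matrix([0, 1])
v3 = 2*v1 + v2                     # redundant by construction

# The span does not shrink when the redundant vector is removed:
print(v1.row_join(v2).row_join(v3).rank() == v1.row_join(v2).rank())   # True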
• In the last section we talked about how to find an explicit description of the null space of A. To give an explicit description of the column space, we just need to specify a basis (then the column space is all linear combinations of this basis, which is a very explicit description). To find a basis of the column space, first find which columns are pivot columns; then a basis of the column space is given by the columns of A which are pivot columns. Note: This is not the same thing as taking the columns from the (Reduced) Echelon Form. You first row reduce to find where the pivots are, then go back to the original matrix A and take the columns that you found had pivots in the Echelon Form.
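That procedure looks like this in sympy (the matrix is invented for the example): rref() reports the pivot column indices, and the basis is taken from the original matrix A, not from its echelon form:

from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])            # column 2 is twice column 1

_, pivots = A.rref()               # pivots is a tuple of column indices
basis = [A.col(j) for j in pivots]
print(pivots)                      # (0, 2): take columns 1 and 3 of A itself
print(basis == A.columnspace())    # True: sympy's columnspace() follows the same rule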
Section 4.5: The Dimension of a Vector Space
• Theorem 10: If a vector space has a basis of n vectors, then every basis has n vectors. (That is, any two bases for the same vector space have the same size.)
• If there is any finite set that spans a vector space, then that vector space is called finite dimensional, and the dimension is the number of vectors in a basis. If a vector space isn't spanned by any finite set of vectors, then it is infinite dimensional. For example, the vector space of all polynomials is infinite dimensional, because there is no finite list of polynomials whose linear combinations give all other polynomials (the problem is, if you give a finite list of polynomials, say the biggest power appearing in them is x^k, then there is no way to get a polynomial involving x^{k+1}).
• Theorem 11: If H is a subspace of a finite dimensional vector space V, then any basis for H can be expanded to a basis for V by adding in some vectors. Also dim(H) ≤ dim(V).
• Theorem 12: If V is a p-dimensional vector space, then any p linearly independent vectors in V are automatically a basis, and likewise any set of p vectors that spans V is a basis.
• The dimension of Nul A is the number of free variables in Ax = 0 (i.e. the number of columns without pivots in A), and the dimension of Col A is the number of pivot columns.
Section 4.6: Rank
• Definition: The rank of A is the dimension of the column space of A (i.e. the number of pivot columns in A).
• The row space of a matrix A is the space spanned by the rows of the matrix. To find a basis for the row space, bring the matrix to echelon form; then the nonzero rows of the echelon form are a basis. This means that the dimension of the row space is the same as the dimension of the column space: they are both given by the number of pivots, which is the same thing as the rank.
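A quick sympy sketch of the row-space procedure, reusing the invented matrix from the column-space sketch above:

from sympy import Matrix

A = Matrix([[1, 2, 0],
            [2, 4, 1],
            [3, 6, 1]])

R, _ = A.rref()
print([R.row(i) for i in range(A.rank())])   # the nonzero rows of the rref are a basis
print(A.rowspace())                          # sympy's built-in spans the same space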
• Rank Theorem: rank A + dim Nul A = n, where n is the number of columns of A.
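A one-line sanity check of the Rank Theorem in sympy, on an invented 3 × 4 matrix:

from sympy import Matrix

A = Matrix([[1, 0, 2, 0],
            [0, 1, 3, 0],
            [0, 0, 0, 1]])         # rank 3, one free variable

# rank A + dim Nul A should equal n = 4, the number of columns.
print(A.rank() + len(A.nullspace()) == A.cols)   # True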
• We can now add a bunch of new equivalent conditions to the Invertible Matrix Theorem. For an n × n matrix A, being invertible is equivalent to each of the following (as well as all of the conditions proved in Chapter 2):
(m) The columns of A form a basis of R^n.
(n) Col A = R^n
(o) dim Col A = n
(p) rank A = n
(q) Nul A = {0}
(r) dim Nul A = 0
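As a spot-check (sympy, with an arbitrary invertible 2 × 2 matrix), a few of these conditions can be verified directly:

from sympy import Matrix

A = Matrix([[2, 1],
            [1, 1]])               # det = 1, so A is invertible

print(A.rank() == 2)               # (p) rank A = n
print(A.nullspace() == [])         # (q) Nul A = {0}
print(len(A.columnspace()) == 2)   # (o) dim Col A = n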