Last lecture summary
• Fundamental problem in linear algebra: a
system of linear equations Ax = b. Nice
case – n equations, n unknowns
– matrix notation
– row picture
– column picture
• linear combinations
 1 5   x  0 
 2 1  y    3

   
• For our matrix
$$A = \begin{bmatrix} 1 & 5 \\ 2 & 1 \end{bmatrix}$$
can I solve Ax = b for every b?
• Yes
• And what does it mean geometrically?
• The RHS fills the whole 2D space.
• But when can this go wrong?
• If the two columns are on the same line, then their
combination is also on this line.
• singular matrix, not invertible
• invertible matrix, columns are independent
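A quick NumPy sketch (my own check, not from the lecture) of this 2x2 example: the columns are independent, so A is invertible and Ax = b can be solved for every b, here for b = [0, 3].

```python
import numpy as np

A = np.array([[1.0, 5.0],
              [2.0, 1.0]])
b = np.array([0.0, 3.0])

print(np.linalg.matrix_rank(A))   # 2 -> independent columns, A is invertible
x = np.linalg.solve(A, b)         # the unique solution of Ax = b
print(x, A @ x)                   # A @ x reproduces b
```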
Matrix by matrix multiplication
• row times column
– shape of matrices?
– (m × n) · (n × p) = (m × p)
• column picture
– AB = C
– Columns of C are linear combinations of columns of A
• row picture
– Rows of C are linear combinations of rows of B
• column times row
– each (column of A) times (row of B) is a full-size matrix; AB is the sum of these
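A minimal NumPy sketch (my own example) of the column and row pictures of AB = C: column j of C is A times column j of B, and row i of C is row i of A times B.

```python
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
C = A @ B                                          # (2x2)(2x2) = (2x2)

for j in range(B.shape[1]):
    assert np.array_equal(C[:, j], A @ B[:, j])    # column picture
for i in range(A.shape[0]):
    assert np.array_equal(C[i, :], A[i, :] @ B)    # row picture
print("both pictures verified")
```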
Inverse
• I = AA^-1 = A^-1A
• When is a square matrix invertible (i.e.
nonsingular)?
– If it does not have dependent columns
– If you can’t find non-zero x such that Ax = 0
Rules
• C^T = (AB)^T = B^T A^T
• (A^-1)^T = (A^T)^-1
Transpose
• What is it?
• What is a symmetric matrix?
• R^T R is symmetric, and R R^T is also symmetric.
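A small NumPy check (random matrices of my own choosing) of the rules above: (AB)^T = B^T A^T, (A^-1)^T = (A^T)^-1, and the symmetry of R^T R and R R^T.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((3, 3))
R = rng.standard_normal((4, 3))

print(np.allclose((A @ B).T, B.T @ A.T))                    # (AB)^T = B^T A^T
print(np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T)))  # (A^-1)^T = (A^T)^-1
print(np.allclose(R.T @ R, (R.T @ R).T))                    # R^T R is symmetric
print(np.allclose(R @ R.T, (R @ R.T).T))                    # R R^T is symmetric
```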
Vector space
• contains vectors – objects I can
1. add together
2. multiply by number
linear combination
• the zero vector must belong to the space, otherwise
it can’t be called a space!
Subspace
• Something smaller within the space.
• This smaller bit is also space.
• Subspaces in R^2:
1. all of R^2
2. all lines through zero
– Is this line the same as R^1? (No – it is one-dimensional, but its vectors still have two components, so it is not R^1.)
3. the zero vector alone
Column space
• subspaces come out of matrices
• column space C(A)
– take columns of A
– all their linear combinations
– They all together form a space.
• Original columns as well as zero vector are all
covered by the term “all linear combinations”.
New stuff
Column space
1
2
A
3

4
1
1
1
1
• We’ll be interested in the size of the
column space now.
• What do you think, is that space the whole
four dimensional space? Just use your
feeling, if we start with three vectors and
take their combinations, can we get the
whole four dimensional space? NO
• So somehow we get a smaller space, but
how much smaller? That’s not immediate.
• Let’s make me first the critical connection
with linear equations.
5
6
7

8
• Column space is a
subspace of R what?
1
2
A
3

4
1
1
1
1
5
6
7

8
– R4
• What’s in the column space
of A?
– The columns (vectors) and all
their linear combinations.
Two questions
1. Does Ax = b always have a solution for
every b? I guess that's going to be a yes
or no question.
2. And then I'm going to ask which right-hand sides are okay?
• OK, let’s write Ax = b for our A.
1
2

3

4
•
•
1 5
 b1 
x


1
1 6   b2 
x2 
1 7   b3 
  x3   
1 8   b4 
What is the answer to the 1st question?
Apparently the answer for 1. is No. Why?
–
the combinations of the columns don't fill the
whole four dimensional space
1
2
A
3

4
1
1
1
1
5
6
7

8
• However, for some RHSs I can solve
Ax = b.
• Which RHSs allow me to solve this? Which vectors b
allow this system to be solved? This is the critical
question.
• Tell me one RHS I can solve this system for?
• All zeros.
1 1 5 
• Tell me another RHS I can solve for?

  x1 
– [1 2 3 4]
• And another?
 b1 
b 
2
1
6



 x2   2 
3 1 7   b3 

  x3   
4 1 8 
b4 
– [6 7 8 9]
• So apparently I can solve Ax = b when b is a linear
combination of the columns.
• In other words?
• I can solve Ax=b exactly when b is in the
column space.
• If b is not a combination of the columns,
then there is no x. There's no way to solve
Ax = b.
• Now the question is are all the columns
independent? Do they all contribute
something new, or can I drop one of them
and still have the same column space?
1
2
A
3

4
1
1
1
1
5
6
7

8
• So, could you throw some column
away without changing the column
space?
• Col3 = Col1 + 4·Col2, so column 3 adds nothing new.
• The column space of this matrix is
a two-dimensional subspace of R^4.
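A NumPy sketch (my own check) confirming that column 3 is column 1 plus 4 times column 2, so the rank is 2 and the column space is two-dimensional:

```python
import numpy as np

A = np.array([[1, 1, 5],
              [2, 1, 6],
              [3, 1, 7],
              [4, 1, 8]])

print(np.array_equal(A[:, 2], A[:, 0] + 4 * A[:, 1]))  # True: col3 = col1 + 4*col2
print(np.linalg.matrix_rank(A))                        # 2
```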
Exercise
• Describe the column spaces for
1
I 
0
1
A 
2
1
B
0
0
1
2
4
2 3
0 4
I … the whole space R^2
A … a line; Ax = b is solvable only if b is on that line
B … the whole space R^2
Null space
1
2

3

4
1
1
1
1
5
0 
x


1
6   0
x2  


0 
7
  x3   
8    0 
• That’s a completely different space.
• What's in it? It does not contain right-hand sides
b; it contains x’s.
• All solutions x to the equation Ax = 0.
• So where is the null space for this example?
• All x’s form a subspace of what?
– of R^3
• The column space is in R^4, while the null space is in
a different space, R^3.
• OK, we’ll try to find the null space N(A) of our
matrix A. Help me, find at least three x’s.
1
2

3

4
1
1
1
1
5
0 
 x1   

6    0 
x2  

0 
7
  x3   
8
0 
• [0 0 0] Now try to find another?
• [1 4 -1] Now try to find yet another?
• [2 8 -2]
• In other words, c-multiples of [1 4 -1]
• So geometrically, how would you describe
the null space? It is all vectors
$$c \begin{bmatrix} 1 \\ 4 \\ -1 \end{bmatrix}$$
• It’s a line. A line in R^3, through the origin.
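A SymPy sketch (my own check) computing N(A) exactly; it returns a single basis vector proportional to [1 4 -1]^T, i.e. the line described above.

```python
from sympy import Matrix

A = Matrix([[1, 1, 5],
            [2, 1, 6],
            [3, 1, 7],
            [4, 1, 8]])

print(A.nullspace())               # one basis vector, a multiple of [1, 4, -1]^T
print((A * Matrix([1, 4, -1])).T)  # the zero vector: [1, 4, -1]^T is in N(A)
```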
• We have to show that this line is a
subspace.
– Show that if I have two solutions v and w,
their sum v + w is also a solution. So if Av = 0
and Aw = 0, then A(v + w) = 0.
– This is actually one of the matrix laws
(the distributive law): I can split A(v + w) into
two pieces, Av + Aw = 0 + 0 = 0.
• And similarly I have to show that if Av = 0,
then A times any multiple of v is zero.
• A(12v) = 0 because A(12v) = 12(Av) = 12 · 0 = 0.
• Now, to fully understand the vector
space, let’s change the RHS
• What’s the solution?
• [1 0 0]
• Are there any other solutions?
• [0 -4 1], etc.
• Do the solutions form a subspace?
• They do NOT, why?
– zero vector is not a solution
• What are the solutions geometrically?
• They form a line (the null space shifted by a
particular solution), not going through the origin.
1
2

3

4
1
1
1
1
5
1 
 x1   

6     2
x2  

 3
7
  x3   
8
 4
Independence, basis,
dimension
based on the excellent video lectures by Gilbert Strang, MIT
http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/detail/lecture09.htm
Lecture 9
Independence
• When are x1, x2, …, xn independent?
• When no linear combination gives the zero vector,
except the zero combination:
$$c_1 x_1 + c_2 x_2 + \cdots + c_n x_n \neq 0 \text{ except for the zero combination, all } c_i = 0$$
• I have three nonzero vectors in 2D space
(plane).
• I can arrange such vectors in a 2 x 3
matrix A. Then the vectors are dependent
if Ax = 0 for some x ≠ 0, i.e. if there is
something other than the zero vector in the null space of A.
• In other words, the columns are independent if
there is only 0 in the null space.
• The number of independent columns in
matrix A is called the rank of
matrix A.
Span
• When we had columns in a matrix, we
took all their combinations and that gave
us the column space.
• Those vectors that we started with span
that column space.
• So now I can say in shorthand the
columns of a matrix span the column
space.
• Columns of matrix A span the column
space.
• Are the columns independent?
– It depends on those particular columns.
• But obviously we’re highly interested in a
set of vectors that spans a space and is
independent.
– If we didn’t have them all, we wouldn’t have
our whole space
– If we had more, they wouldn’t be independent.
• Such a bunch of vectors is called a basis
for a vector space.
• A basis is a set of vectors with two
properties:
– I’ve got enough vectors.
– And not too many.
• Well, mathematician way of saying the same:
– they span the space
– they are independent
• From now, whenever I look at a subspace, if you
give me a basis for that subspace, you've told
me everything I need to know about that
subspace.
• Basis is
– minimum spanning set
– maximum independent set
• Examples – the space R^3
• What would be a basis for this 3D space:
– [1,0,0]^T, [0,1,0]^T, [0,0,1]^T (the identity matrix, whose null
space is 0)
• n vectors give a basis if the n x n matrix
with those columns is invertible
– why?
– Matrix is invertible, if Ax = 0 only for x = 0
$$\begin{bmatrix} 1 & 5 & 7 \\ 1 & 3 & 1 \\ 2 & 5 & 4 \end{bmatrix} \qquad \begin{bmatrix} 1 & 5 \\ 1 & 3 \\ 2 & 5 \end{bmatrix}$$
• are these (three) vectors independent?
• yes
• which space do they span?
• R^3 (3D)
• do they form a basis?
• yes, for R^3
• Are these two vectors a
basis for any space?
• yes, and for what space?
• the one they span, i.e. all their combinations
• what do their combinations form?
• a plane inside 3D space
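A NumPy sketch (my own check) for the two examples above: the 3x3 matrix has a nonzero determinant (rank 3), so its columns are a basis for R^3; the 3x2 matrix has rank 2, so its columns are a basis only for the plane they span inside R^3.

```python
import numpy as np

M3 = np.array([[1, 5, 7],
               [1, 3, 1],
               [2, 5, 4]])
M2 = np.array([[1, 5],
               [1, 3],
               [2, 5]])

print(np.linalg.det(M3))            # nonzero -> invertible -> basis for R^3
print(np.linalg.matrix_rank(M3))    # 3
print(np.linalg.matrix_rank(M2))    # 2 -> basis for a plane inside R^3
```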
• a basis is not unique
– there are zillions of bases
– the columns of any invertible 3x3 matrix form a basis for
R^3 (3D)
• however, all bases for a given space have the
same number of vectors (3 for R^3, n for R^n)
• If we're talking about some other space, the
column space of some matrix, or the null space
of some matrix, or some other space that we
haven't even thought of, then that still is true that
there're lots of bases but every basis has the
same number of vectors.
• this number is called dimension (how big is
the space?)
• Let me repeat the four terms we’ve got now
defined
– Independence - looks at combinations not
being zero
– Spanning - looks at all the combinations
– Basis - combines independence and spanning
– Dimension - the number of vectors in any
basis, because all bases have the same
number.
Example
• do the columns span the column space of this matrix?
• yes, by definition of what the column space is
• do they form a basis for the column space?
• no, they are not independent, there’s something
in the null space
1 2 3 1
1 1 2 1


1 2 3 1
• Look at the null space N(A)
• Tell me some vector in the nullspace (solution of Ax = 0)
• [-1 -1 1 0]^T
• Tell me the basis for that column space. There are many
answers, give me the most natural answer.
• first two columns
• And the rank of the matrix is?
• two
• Great theorem comes !!!!
• rank is the number of independent columns
• rank is the number of vectors in the basis
• rank is WHAT?
• dimension!
• The rank of A is the dimension of the
column space.
dim C(A) = r
• About words
– I am talking about the rank of the matrix
– and I am talking about the dimension of the
vector space/subspace
– I am not talking about the dimension of a
matrix, I am not talking about a rank of space
– And there is a link between the rank of matrix
and the dimension of its column space.
• Null space
– we already have one vector there: [-1 -1 1 0]^T
– Are there other vectors in the null space?
• Yes. So our only vector is not a basis,
because it does not span.
– Tell me one vector more
• [-1 0 0 1]^T
1 2 3 1
1 1 2 1


1 2 3 1
– The vectors in the null space are telling me in
what way the columns are dependent. That's
what the null space is doing.
– Now, what is the nullspace?
– These are two vectors in the null space.
They're independent. Are they a basis for the
null space? What's the dimension of the null
space?
dim N(A) = n – r
n is the number of columns
• they are independent, they form a basis, the
null space is two-dimensional
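A SymPy sketch (my own check) verifying the counts for this example: r = 2 and dim N(A) = n - r = 4 - 2 = 2.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3, 1],
            [1, 1, 2, 1],
            [1, 2, 3, 1]])

print(A.rank())            # 2 = dim C(A)
ns = A.nullspace()         # a basis for N(A)
print(len(ns))             # 2 = n - r
for v in ns:
    print((A * v).T)       # each basis vector satisfies Av = 0
```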
Four fundamental subspaces
based on the excellent video lectures by Gilbert Strang, MIT
http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/detail/lecture10.htm
Lecture 10
1. column space C(A)
2. null space N(A)
3. row space
– all combinations of rows of A
– rows span the row space
– they are/they are not a basis for the row space
– But I don’t like to work with row vectors, I’d like to
stay with column vectors. How to get column
vectors out of the rows?
• transpose
– So the row space is all combinations of the columns of A^T ... C(A^T)
4. null space of A^T ... N(A^T) (called the left null
space of A)
• Where are these spaces? A is m × n
• N(A) – vectors with n components (it is in
R^n), solutions to Ax = 0
• C(A) – columns of A, each column has m
components, it is in R^m
• C(A^T) – the row space is in R^n
• N(A^T) – the left null space is in R^m
• And now we want to understand these
spaces, i.e. we’d like to know a basis for
those spaces. And what’s their dimension?
Dimensions
• A = m × n
• dim C(A) = r
• dim C(A^T) = r
• dim N(A) = n – r
• dim N(A^T) = m – r
• dim C(A) + dim N(A) = n
• dim C(AT) + dim N(AT) = m
• The row space and the null space are in Rn.
Their dimensions add to n.
• The column space and the left null space are in
Rm, and their dimensions add to m.
• Please pay attention to the fact that the
dimension (i.e. the rank) of the column space
and of the row space is the same.
• E.g., is this matrix singular?
$$\begin{bmatrix} 1 & 2 & 3 \\ 1 & 2 & 3 \\ 2 & 5 & 5 \end{bmatrix}$$
– Yes, it is, because two rows are the same.
Thus the dimension of both the column space and the row space is
two.
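A SymPy check (my own) of the dimension count for this singular example: r = 2, so dim N(A) = n - r = 1 and dim N(A^T) = m - r = 1.

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [1, 2, 3],
            [2, 5, 5]])

print(A.rank())              # 2 -> dim C(A) = dim C(A^T) = 2
print(len(A.nullspace()))    # 1 = n - r
print(len(A.T.nullspace()))  # 1 = m - r (left null space)
```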
A different type of vector space
• All our vector spaces have been subspaces of
some real n dimensional space.
• A new vector space – all 3 x 3 matrices.
• i.e. my matrices are vectors.
• They don’t look like vectors, they’re matrices, but
they are vectors in my vector space because
they obey the rules.
• How about subspace of this matrix space M?
– upper triangular matrices U
$$U = \begin{bmatrix} u_{1,1} & u & u & u & u \\ 0 & u_{2,2} & u & u & u \\ 0 & 0 & u_{3,3} & u & u \\ 0 & 0 & 0 & u_{4,4} & u \\ 0 & 0 & 0 & 0 & u_{5,5} \end{bmatrix}$$
– symmetric matrices
– intersection of two subspaces is also a subspace
– what is the intersection of U and symmetric?
• diagonal matrices – this is a smaller subspace
• Now we will intuitively investigate the bases
(and dimensions) of these matrix
spaces: all 3x3 matrices, 3x3 upper
triangular, 3x3 symmetric and 3x3
diagonal matrices
• Dimension = number of members in the
basis
• 3x3 matrices
– basis?
• matrices, each with a 1 in one position and zeros
everywhere else
– dimension?
• nine
• upper triangular matrices
– dimension?
• six
• symmetric
– dimension is again six
• diagonal
– dimension is three
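A tiny Python sketch (my own) counting the members of the natural bases; it reproduces the dimensions 9, 6, 6 and 3 above.

```python
n = 3
entries = [(i, j) for i in range(n) for j in range(n)]

dim_all       = len(entries)                                 # one E_ij per entry
dim_upper     = len([(i, j) for i, j in entries if i <= j])  # on or above the diagonal
dim_symmetric = len([(i, j) for i, j in entries if i <= j])  # one basis matrix per pair {i, j}
dim_diagonal  = len([(i, j) for i, j in entries if i == j])  # diagonal entries only

print(dim_all, dim_upper, dim_symmetric, dim_diagonal)       # 9 6 6 3
```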
Orthogonality
based on the excellent video lectures by Gilbert Strang, MIT
http://ocw.mit.edu/OcwWeb/Mathematics/18-06Spring-2005/VideoLectures/detail/lecture14.htm
Lecture 14
• Welcome to the world of orthogonality. This
is a ninety-degree lecture.
• What does it mean for vectors to be
orthogonal?
• What does it mean for subspaces to be
orthogonal?
Two vectors
• orthogonal = perpendicular, the angle
between the two vectors is 90°
• How to find whether two vectors are
orthogonal? Use Pythagoras.
• What is this vector in terms of a and b? It is a + b.
• Length of a vector:
$$\|a\|^2 = a_1^2 + a_2^2 = a_1 a_1 + a_2 a_2 = a^T a$$
• So Pythagoras says:
$$\|a\|^2 + \|b\|^2 = \|a + b\|^2$$
• This leads to the inner product rule about
orthogonality. Here is why:
$$a^T a + b^T b = (a+b)^T (a+b) = a^T a + b^T b + a^T b + b^T a$$
$$\Rightarrow\; a^T b + b^T a = 0 \;\Rightarrow\; 2\,a^T b = 0 \;\Rightarrow\; a^T b = 0$$
• Two vectors are orthogonal if a^T b = b^T a = 0.
• Zero vector is orthogonal to any vector.
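A short NumPy sketch (my own example) of the test: a^T b = 0, and Pythagoras holds exactly for such a pair.

```python
import numpy as np

a = np.array([1.0, 2.0, 2.0])
b = np.array([2.0, -1.0, 0.0])

print(a @ b)                             # 0.0 -> a and b are orthogonal
lhs = a @ a + b @ b                      # ||a||^2 + ||b||^2
rhs = (a + b) @ (a + b)                  # ||a + b||^2
print(lhs, rhs, np.isclose(lhs, rhs))    # 14.0 14.0 True
```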
Orthogonality of subspaces
• Subspace S is orthogonal to subspace T. What does that
mean?
– It means that every vector in S is orthogonal to every
vector in T.
• Wall is one subspace in 3D, floor is another
subspace in 3D.
• Are they orthogonal?
– No
• And why not?
– if two subspaces meet at some vector, well then for
sure they're not orthogonal, because that vector is in
one and it's in the other, and it's not orthogonal to
itself unless it's zero.
• row space is orthogonal to the null
space (in R^n)
• Why?
– if x is in the null space, then Ax = 0
$$\begin{bmatrix} \text{row 1} \\ \text{row 2} \\ \text{row 3} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix}$$
– So I'm saying that a vector in the row space is
perpendicular to the x in the null space.
– row 1 is orthogonal to x, their inner product is
zero
• OK, so far we have shown that rows of A
are orthogonal to x, but what else is in the
row space?
– all their linear combinations
• To show that the linear combinations of the
rows are also orthogonal to x is pretty easy. Help me
– row1 · x = 0, so (c · row1) · x = c (row1 · x) = 0
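A NumPy check (my own) that every row of A, and hence every combination of the rows, is orthogonal to a vector x from the null space.

```python
import numpy as np

A = np.array([[1, 1, 5],
              [2, 1, 6],
              [3, 1, 7],
              [4, 1, 8]])
x = np.array([1, 4, -1])       # Ax = 0, so x is in N(A)

print(A @ x)                   # zero vector: every row . x = 0
c = np.array([2, -1, 3, 0])    # an arbitrary combination of the rows
row_comb = c @ A               # a vector in the row space C(A^T)
print(row_comb @ x)            # 0 -> orthogonal to x
```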
• Similarly, the column space is orthogonal
to the left null space (in R^m).
• Next comes another definition, without
proof.
• Row space and null space are
orthogonal complements (in R^n).
• The orthogonal complement of a row
space contains not just some vectors that
are orthogonal to it, but all.
• That means that the null space contains
all, not just some but all, vectors that are
perpendicular to the row space.
G. Strang, Introduction to Linear Algebra