


Transcript
CBrayMath216-2-4-f.mp4
SPEAKER: We're quickly approaching being done with our first segment of the course. And this first segment of the
course is "Introduction to Linear Algebra," after which we're going to start talking about differential equations, and
we're going to find some nice applications of the linear algebra that we've learned to solving a large category of
differential equations, an enormously important application.
After which we will, in fact, go back and do some more linear algebra. And after that, we will go back and do some more
differential equations. [INAUDIBLE] So we will use that subsequent linear algebra for this other tremendous application
of solving a large category of differential equations.
So now that we are, you might say, near the end of our first segment of the course, I want to do a quick summary of the related key ideas that we've talked about. The most fundamental of these ideas is that of the linear combination. It's a very simple idea: it's just a sum of scalar multiples of the objects that you've been given. So this expression right here is the basic idea.
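In symbols (a generic rendering; the specific expression is on screen in the video, and the names v_1, ..., v_k and c_1, ..., c_k are placeholders), a linear combination of vectors v_1, ..., v_k with scalars c_1, ..., c_k is

    c_1 v_1 + c_2 v_2 + \cdots + c_k v_k.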
OK, so we saw vector spaces. Now, vector spaces are a sophisticated idea. There's a lot going on in there. But there is a point of view on what a vector space is that is arguably a little oversimplified, but nicely simple: a vector space is a set where you can always do linear combinations.
And if you go back and look at the rules for what a vector space is, there has to be a notion of vector addition and a notion of scalar-vector multiplication, and each of those has to always result in a new vector in your vector space, of course; the set is closed under those operations. And then there are certain algebraic requirements: we want the algebra, the addition and scalar multiplication of vectors, to behave in a natural way.
But, oversimplifying a little bit, a nice way to think about a vector space is as a set where you can always do linear combinations. So really, the idea of a vector space rests on this idea of linear combination in an important way.
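As a hedged restatement in symbols (not wording from the lecture): for any vectors u and v in a vector space V and any scalars a and b, closure says the linear combination stays inside V,

    a u + b v \in V.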
OK. We talked about the idea of a list of vectors being linearly independent. Well, this too is a statement about linear combinations.
Specifically, linear combinations are never unexpectedly zero. That's an alternative phrasing; we didn't say it exactly this way at the time. But by unexpectedly zero, I'm pointing out the fact that if we're looking at the trivial linear combination, well, then of course you're going to get 0. That's expected, not what you'd call unexpectedly zero.
So linear independence, again, could be viewed as saying that linear combinations are never unexpectedly zero. More to the point, linear independence is a statement about linear combinations. So again it comes back to linear combinations.
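In symbols (a restatement with generic names): a list v_1, ..., v_k is linearly independent when the only linear combination equal to the zero vector is the trivial one,

    c_1 v_1 + \cdots + c_k v_k = 0 \implies c_1 = c_2 = \cdots = c_k = 0.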
OK, span very obviously connects to linear combinations. It is in fact just the set of all linear combinations. All right.
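In symbols, again with generic names,

    \operatorname{span}\{v_1, \ldots, v_k\} = \{\, c_1 v_1 + \cdots + c_k v_k : c_1, \ldots, c_k \text{ scalars} \,\}.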
And we talked about the idea of a basis for a vector space. And it's a nice set. It's a set that has what I call just the right number of vectors. Specifically, enough vectors that it can actually span the vector space in question. It's got to have enough. If you don't have enough vectors, you can't span the whole vector space.
You do have to have at least that many, but not too many. If you have too many vectors, then it wouldn't be possible to maintain independence. So you have to have a collection that has enough vectors, but not too many. And that's just the right number.
But notice that a basis hinges critically on the ideas of span and independence. And span and independence go back to linear combinations. So ultimately, when we're talking about a basis, we're still just elaborating and building more and more sophisticated ideas on this one construction, the linear combination.
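Putting those two requirements into one symbolic statement (a summary, not a quotation from the lecture): a set B = \{v_1, \ldots, v_k\} is a basis for a vector space V exactly when

    \operatorname{span}(B) = V \quad\text{and}\quad B \text{ is linearly independent}.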
Dimension, of course, is another important idea that we've been talking about for a while now. And that is just that right number; it's defined from a basis.
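As a quick concrete example (not one worked in this clip): the standard vectors e_1, e_2, e_3 span \mathbb{R}^3 and are independent, so they form a basis and

    \dim \mathbb{R}^3 = 3.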
OK, so we've done a lot of linear algebra, a lot of different things. But ultimately it all traces back to linear combinations.
One other observation I want to make is about non-singular matrices. We've talked about non-singular matrices, and
we've done various different things with them. This discussion, as a reminder, is strictly about square matrices, only square matrices.
And we've proved a long list of facts about non-singular matrices. And here I'm collecting a lot of them, possibly not
even all of them. Gosh, I have to zoom out just to get them all on the page here.
A long list of facts about non-singular matrices. Recall the definition: it's that the reduced row echelon form is the identity. Another way to say that is there's a pivot in every row and a pivot in every column. So the rank is equal to n, n being the size of the n-by-n matrix.
Of course, if there are n pivots, that means there's a pivot in every row, and that means it has the existence property. And there's also a pivot in every column, and that means it has the uniqueness property, and this goes on and on and on. You guys should go through and make sure you recognize all of these properties.
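Collecting the properties mentioned here in one place (a summary of what is said above, possibly not the full on-screen list), for an n-by-n matrix A the following are equivalent:
- the reduced row echelon form of A is the identity;
- A has a pivot in every row and in every column, so rank(A) = n;
- A has the existence property: Ax = b has a solution for every b;
- A has the uniqueness property: solutions of Ax = b are unique;
- the columns of A are linearly independent;
- the row space of A is n-dimensional;
- A is invertible.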
The wonderful thing is that all of these properties are equivalent for square matrices. So any square matrix that has one
of these properties has every other property. So if you find yourself with a matrix that somehow you know has the
existence property, then you know that the columns of that matrix are independent. And if you find a matrix for which
the row space is n-dimensional, well, then you know that matrix is invertible, etc.
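As a small illustrative sketch (not from the lecture; it uses NumPy and a made-up 3-by-3 matrix), you can check several of these equivalent properties numerically and watch them agree:

    import numpy as np

    # A hypothetical non-singular 3x3 matrix, made up for illustration.
    A = np.array([[2.0, 1.0, 0.0],
                  [0.0, 1.0, 3.0],
                  [1.0, 0.0, 1.0]])
    n = A.shape[0]

    # Rank equal to n: a pivot in every row and every column, which is
    # also what "columns independent" and "row space n-dimensional" amount to.
    print(np.linalg.matrix_rank(A) == n)        # True

    # Invertible: determinant nonzero (up to floating-point tolerance).
    print(not np.isclose(np.linalg.det(A), 0))  # True

    # Existence and uniqueness: Ax = b has exactly one solution for any b.
    b = np.array([1.0, 2.0, 3.0])
    x = np.linalg.solve(A, b)
    print(np.allclose(A @ x, b))                # True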
There's a powerful fact here: all of these being equivalent means you can pair them up, in some sense, in whatever way you like. Now importantly, outside of the context of Math 216, you can cite these facts at your leisure. If you know, say, that a matrix is invertible, you may conclude that its reduced row echelon form is the identity, et cetera. Inside of Math 216, of course, we're all just learning these ideas, so why this property connects to that one is something that we're in the process of learning.
So it's important that you understand what these connections are and how to prove all of these connections between
these various properties. And that's a great exercise to go through and ask yourself, how would I prove that all of these
are the same? There are a lot of different ways to do it. There isn't just a single way to do any particular one of these, but
they're all good practice.