Transcript
CBrayMath216-2-3-d.mp4
C. BRAY SPRING: So we return now to our previously discussed ideas of linear independence and span.
Previously, we discussed linear independence and span only in the context of Rn. We were thinking about
vectors in Rn and what it means to take linear combinations of those vectors in Rn, and we would consider the collection of all those linear combinations. And then there's this idea of linear independence that also
depends on the ability to take linear combinations of vectors. Both of these ideas depend on the construction of
linear combinations.
We now have, though, this new context of vector spaces. Vector spaces are abstractions. There are lots of different examples of vector spaces that are not Rn, but they are all, in some sense, modeled on Rn, and
they behave in certain ways like Rn. But we're going to be discussing vector spaces in the abstract.
Can we take these ideas of linear independence and span and can we make sense out of them in the context
of vector spaces? Well, as previously discussed, these ideas are all about linear combinations. And vector
spaces, by design, are spaces where you can add and do scalar multiplication. In other words, vector spaces
are contexts in which you can do linear combinations. So exactly what we need in order to be able to make
sense out of these ideas in the context of vector spaces is exactly what we have. So we'll see that in these
definitions here.
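In symbols, a linear combination in an abstract vector space V has the same shape as it does in Rn; a sketch, assuming real scalars as in the earlier Rn discussion:

```latex
% A linear combination of vectors v_1, ..., v_k in a vector space V,
% with scalars c_1, ..., c_k (assumed real here):
c_1 v_1 + c_2 v_2 + \cdots + c_k v_k \;\in\; V
```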
The definition of the span of a collection of vectors in a vector space is just the set of all linear combinations of those vectors. That's morally equivalent to the definition that we wrote down of the span of a collection of
vectors in Rn. We're just taking that idea and applying it to vectors in an abstract vector space V as opposed to
specifically Rn.
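As a symbolic sketch of that definition, again assuming real scalars:

```latex
\operatorname{span}\{v_1, \dots, v_k\}
  \;=\; \{\, c_1 v_1 + \cdots + c_k v_k \;:\; c_1, \dots, c_k \in \mathbb{R} \,\}
  \;\subseteq\; V
```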
Likewise, a list of vectors in a vector space V is linearly dependent if they have a significant relation. By significant we mean a list of coefficients that are not all zero, and by relation we mean a
linear combination that adds up to zero. So this too is a statement about linear combinations. And in vector
spaces that makes sense.
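In symbols, a sketch of the dependence condition, with 0 denoting the zero vector of V:

```latex
% Linear dependence: a significant (nontrivial) relation exists.
c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0
\quad \text{for some scalars } c_1, \dots, c_k \text{ not all zero}
```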
So this idea of linear dependence that we developed in the context of vectors in Rn now actually makes sense
in a vector space, an abstract vector space V. Likewise, independence. Just like with Rn, a collection of
vectors is linearly independent if it's not dependent. Said differently, the only relation is the trivial relation. Of
course, let's not lose sight of the fact that relations are linear combinations that add up to zero.
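And a sketch of the independence condition, which says the only relation is the trivial one:

```latex
% Linear independence: a relation forces all coefficients to be zero.
c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0
\;\Longrightarrow\;
c_1 = c_2 = \cdots = c_k = 0
```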
So let's see an example. Let's consider the vector space of polynomials. We saw recently that this is a
subspace of capital F. And so it's a vector space on its own.
And let's consider the following two vectors in that vector space, namely, P1 is this polynomial and P2 is that
polynomial. And we ask the question, is that list linearly independent? Now at a glance, this seems nonsensical
because, well, we developed this idea of linear independence for vectors in Rn. And these are not vectors in
Rn, these polynomials. It just doesn't seem to fit; it seems like a mismatched question.
But in fact, this makes perfect sense.
You can think of linear independence in terms of geometric intuition. And certainly, that doesn't make any sense
here. But we have a definition of linear independence here. Linear independence is a statement about relations
and whether or not the trivial relation is the only relation. That makes perfect sense here.
We can take these polynomials and ask: if I have a relation, a linear combination that adds up to zero, is it necessarily the case that it is the trivial relation? Can I draw conclusions about these numbers C1 and C2? This has got nothing to do with geometry. This is just an algebraic question.
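Written out, with p1 and p2 standing for the two polynomials shown on screen, the question being asked is:

```latex
% Is the only relation between p_1 and p_2 the trivial one?
c_1\, p_1 + c_2\, p_2 = 0
\quad\overset{?}{\Longrightarrow}\quad
c_1 = c_2 = 0
```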
So we can take these polynomials, plug in what they are, and the question can be rewritten. Take these terms
and distribute and reorder the terms, et cetera. And what we find is, in order for a relation to hold I would need
this to be true. And importantly, what we have here is a statement about functions.
And then let me go back and note, this zero here, this is not the number zero. This is the vector zero. This is
the zero vector. So what this algebra means is not: is there a value of x that makes this equation true, that makes the left side equal to the number zero? That is not what this is saying. This is saying that this polynomial is equal, for all values of x, to zero. In other words, this polynomial is the zero polynomial.
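In symbols, the equation is an identity of functions rather than an equation to solve for a particular x:

```latex
% "Equals the zero vector" here means equal as functions, for every input x.
c_1\, p_1(x) + c_2\, p_2(x) = 0 \quad \text{for all } x \in \mathbb{R}
```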
And there's only one way that that could possibly be true, only if all three of these coefficients were zero.
Those coefficients all have to be zero if the polynomial is going to be identically the zero polynomial. And from
three equations, can I conclude that these coefficients here, C1 and C2, must necessarily both be zero? Well, yeah, absolutely I can.
See, these two equations right here directly say that C1 and C2 have to be zero. And therefore, the only
relation is the trivial relation. And that's the definition of what it means for those two vectors in this vector space
to be linearly independent.
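The same check can be run symbolically. The specific polynomials P1 and P2 appear only on screen in the video, so the polynomials below are hypothetical stand-ins chosen to mirror the argument described above (three coefficient equations, two of which directly force C1 and C2 to be zero); a minimal sketch using sympy:

```python
# Minimal sketch: test linear independence of two polynomials by coefficient matching.
# p1 and p2 are hypothetical stand-ins; the actual P1, P2 appear only on screen.
import sympy as sp

x, c1, c2 = sp.symbols('x c1 c2')
p1 = 1 + x      # hypothetical stand-in for P1
p2 = x**2       # hypothetical stand-in for P2

# Form the relation c1*p1 + c2*p2 and expand it as a polynomial in x.
relation = sp.expand(c1 * p1 + c2 * p2)

# The relation equals the zero polynomial only if every coefficient of x vanishes.
coeff_equations = [sp.Eq(coeff, 0) for coeff in sp.Poly(relation, x).all_coeffs()]

# Solve for c1, c2: if the only solution is c1 = c2 = 0, the list is linearly independent.
print(sp.solve(coeff_equations, [c1, c2]))   # expected: {c1: 0, c2: 0}
```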
So we've drawn a conclusion about a list of functions that sounds like we're talking about a geometric feature
of geometric vectors. And in that sense, we're visualizing these functions as having a kind of geometric relationship in this vector space.