MATH 205 HOMEWORK #5 OFFICIAL SOLUTION
Problem 1: An inner product on a vector space V over F is a bilinear map ⟨·, ·⟩ : V × V → F satisfying the extra conditions
• ⟨v, w⟩ = ⟨w, v⟩, and
• ⟨v, v⟩ ≥ 0, with equality if and only if v = 0.
(a) Show that the standard dot product on R^n is an inner product.
(b) Show that (f, g) ↦ ∫ f(x)g(x) dx is an inner product on C^∞([0, 1], R).
(c) Suppose that F is ordered. Prove that for any v, w ∈ V,
⟨v, w⟩² ≤ ⟨v, v⟩⟨w, w⟩.
When does equality hold? What standard inequality in trigonometry does this reflect when V = R^n?
(d) We say that two vectors v, w in V are orthogonal if ⟨v, w⟩ = 0. Suppose that T : V → V is a linear transformation satisfying
⟨Tv, w⟩ = ⟨v, Tw⟩
for all v, w. Show that eigenvectors of T with different eigenvalues are orthogonal.
Solution: Note that if we can show that a bilinear form is symmetric, then it is linear in the first
variable if and only if it is linear in the second. Thus showing linearity in one variable is enough.
(a) Let v = (a1, . . . , an) and w = (b1, . . . , bn). The standard dot product is symmetric, as
⟨v, w⟩ = a1 b1 + · · · + an bn = b1 a1 + · · · + bn an = ⟨w, v⟩.
Let λ ∈ R and u = (c1, . . . , cn). Then
⟨v + λu, w⟩ = (a1 + λc1)b1 + · · · + (an + λcn)bn = a1 b1 + · · · + an bn + λ(c1 b1 + · · · + cn bn) = ⟨v, w⟩ + λ⟨u, w⟩.
Thus ⟨·, ·⟩ is bilinear. In addition,
⟨v, v⟩ = a1² + · · · + an² ≥ 0,
with equality if and only if ai = 0 for all i. Thus the standard dot product is an inner product, as desired.
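As an informal numerical illustration of these three checks, here is a minimal sketch (assuming NumPy) that spot-checks symmetry, linearity in the first slot, and positivity on randomly chosen vectors:

    import numpy as np

    rng = np.random.default_rng(0)
    v, w, u = rng.standard_normal(5), rng.standard_normal(5), rng.standard_normal(5)
    lam = 2.7

    assert np.isclose(v @ w, w @ v)                              # symmetry
    assert np.isclose((v + lam * u) @ w, v @ w + lam * (u @ w))  # linearity in the first slot
    assert v @ v >= 0                                            # positivity
    assert np.isclose(np.zeros(5) @ np.zeros(5), 0.0)            # <0, 0> = 0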
(b) Note that
⟨f, g⟩ = ∫ f(x)g(x) dx = ∫ g(x)f(x) dx = ⟨g, f⟩,
so that ⟨·, ·⟩ is symmetric. Let λ ∈ R, h ∈ C^∞([0, 1], R). Then
∫ (f(x) + λh(x))g(x) dx = ∫ f(x)g(x) dx + λ ∫ h(x)g(x) dx,
so that ⟨·, ·⟩ is bilinear. Lastly, suppose f ≠ 0, so that there exists x0 such that f(x0) ≠ 0. By continuity there exists some δ > 0 such that |f(x)| > |f(x0)/2| for |x − x0| < δ; shrinking δ if necessary, we may assume [x0, x0 + δ] ⊆ [0, 1] (or use [x0 − δ, x0] instead). Then
∫_0^1 f(x)² dx ≥ ∫_{x0}^{x0+δ} |f(x0)/2|² dx = |f(x0)/2|² δ > 0.
Thus if f ≠ 0 then ⟨f, f⟩ > 0; clearly if f = 0 then ⟨f, f⟩ = 0. Thus the second property of an inner product holds as well, and we are done.
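As a quick numerical illustration of the positivity step (a sketch assuming NumPy), a Riemann-sum approximation of ⟨f, f⟩ = ∫_0^1 f(x)² dx comes out strictly positive for a nonzero smooth f:

    import numpy as np

    # Approximate <f, f> = ∫_0^1 f(x)^2 dx by averaging f^2 on a fine uniform grid
    # (the interval has length 1, so the mean approximates the integral).
    x = np.linspace(0.0, 1.0, 10_001)
    f = np.sin(3 * np.pi * x) * np.exp(-x)   # a nonzero smooth function on [0, 1]
    inner_ff = np.mean(f * f)

    assert inner_ff > 0
    print(f"<f, f> is approximately {inner_ff:.6f}")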
(c) First note that if either v or u is 0 then the inequality is an equality and holds trivially; thus we now consider the case when both are nonzero. Let w = u − (⟨u, v⟩/⟨v, v⟩)v. Then we have
⟨w, v⟩ = ⟨u, v⟩ − ⟨u, v⟩ = 0.
Now consider
0 ≤ ⟨w, w⟩ = ⟨u, u⟩ − 2⟨u, v⟩²/⟨v, v⟩ + ⟨u, v⟩²/⟨v, v⟩ = ⟨u, u⟩ − ⟨u, v⟩²/⟨v, v⟩.
Rearranging this, we get
⟨u, u⟩⟨v, v⟩ ≥ ⟨u, v⟩²,
as desired. Equality holds exactly when w = 0, which means that u is a scalar multiple of v.
Note that in the classical case V = R^n this reduces to the inequality cos² θ ≤ 1, where θ is the angle between u and v.
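A small numerical spot-check of the inequality and of the equality case (a sketch assuming NumPy):

    import numpy as np

    rng = np.random.default_rng(0)
    u, v = rng.standard_normal(4), rng.standard_normal(4)

    assert (u @ v) ** 2 <= (u @ u) * (v @ v)        # <u, v>^2 <= <u, u><v, v>

    cos_theta = (u @ v) / np.sqrt((u @ u) * (v @ v))
    assert cos_theta ** 2 <= 1 + 1e-12              # equivalently, cos^2(theta) <= 1

    u2 = -3.0 * v                                   # equality when u is a multiple of v
    assert np.isclose((u2 @ v) ** 2, (u2 @ u2) * (v @ v))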
(d) Suppose that u is an eigenvector of T with eigenvalue λ and v is an eigenvector of T with eigenvalue ρ. Then
λ⟨u, v⟩ = ⟨Tu, v⟩ = ⟨u, Tv⟩ = ρ⟨u, v⟩.
Thus if λ ≠ ρ we must have ⟨u, v⟩ = 0. In other words, u and v are orthogonal.
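For a concrete instance of this, real symmetric matrices satisfy ⟨Tu, v⟩ = ⟨u, Tv⟩ for the standard dot product, and their eigenvectors for distinct eigenvalues come out orthogonal. A minimal sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((4, 4))
    T = A + A.T                            # symmetric, so <Tu, v> = <u, Tv>

    eigvals, eigvecs = np.linalg.eigh(T)   # columns of eigvecs are eigenvectors of T

    # Eigenvectors belonging to distinct eigenvalues are orthogonal.
    for i in range(4):
        for j in range(i + 1, 4):
            if not np.isclose(eigvals[i], eigvals[j]):
                assert abs(eigvecs[:, i] @ eigvecs[:, j]) < 1e-10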
Problem 2: Show that (F^⊕∞)^* ≅ F^×∞. Conclude that it is not the case that V and V^* are always isomorphic.
Solution: Suppose that f ∈ (F^⊕∞)^*. Then f is uniquely determined by its values on a basis of F^⊕∞. Taking the standard basis {e_i}_{i=1}^∞, where e_i has a 1 in the i-th position and 0s elsewhere, we see that the data of f consists exactly of a scalar f(e_i) ∈ F for each i. These add and scale pointwise. Thus f corresponds exactly to a vector in F^×∞.
Since F^×∞ is never isomorphic to F^⊕∞, we see that V^* is not always isomorphic to V.
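To make the correspondence concrete, here is a small Python sketch (illustrative only; the names are hypothetical): an element of F^⊕∞ is a sparsely stored vector with finitely many nonzero entries, a functional is an arbitrary coefficient sequence (an element of F^×∞), and evaluating the functional is always a finite sum.

    from fractions import Fraction

    # An element of F^⊕∞ over F = Q: finitely many nonzero coordinates, stored sparsely.
    v = {0: Fraction(2), 3: Fraction(-1), 10: Fraction(5)}

    # A functional in (F^⊕∞)^* is determined by its values phi(0), phi(1), ... on the e_i,
    # i.e. by an arbitrary element of F^×∞ (here nonzero in every coordinate).
    def phi(i: int) -> Fraction:
        return Fraction(1, i + 1)

    def evaluate(phi, v):
        # Only the finitely many nonzero coordinates of v contribute.
        return sum(phi(i) * c for i, c in v.items())

    print(evaluate(phi, v))   # 2 - 1/4 + 5/11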
Problem 3: Suppose that F′ is a field containing F and V is an F-vector space. If we consider F′ to be an F-vector space, we can form the tensor product F′ ⊗ V, which is naturally an F-vector space. Show that it is also an F′-vector space. This is called the change of base of V.
Solution: In order to show that F′ ⊗ V is an F′-vector space we need to define scalar multiplication. For a pure tensor α ⊗ v we define scalar multiplication by λ ∈ F′ by
λ(α ⊗ v) = (λα) ⊗ v.
If this is well-defined then it is clearly associative, since multiplication in F′ is associative. Similarly, it is unital: if λ = 1 then it clearly returns the same vector. We also have
λ(α ⊗ v + β ⊗ w) = (λα) ⊗ v + (λβ) ⊗ w
and
(λ + ρ)(α ⊗ v) = ((λ + ρ)α) ⊗ v = (λα + ρα) ⊗ v = (λα) ⊗ v + (ρα) ⊗ v.
Thus if scalar multiplication is well-defined then it satisfies the axioms of a vector space.
We have defined scalar multiplication by its action on the generators of F′ ⊗ V. To show that this is well-defined we need to check that it preserves the relations; in other words, that the scalar multiple of any element of A_0 (the subspace of relations defining the tensor product) stays in A_0. We therefore check (here a ∈ F):
λ((α + β) ⊗ v − α ⊗ v − β ⊗ v) = (λ(α + β)) ⊗ v − (λα) ⊗ v − (λβ) ⊗ v
                               = ((λα) + (λβ)) ⊗ v − (λα) ⊗ v − (λβ) ⊗ v ∈ A_0,
λ(α ⊗ (v + v′) − α ⊗ v − α ⊗ v′) = (λα) ⊗ (v + v′) − (λα) ⊗ v − (λα) ⊗ v′ ∈ A_0,
λ((aα) ⊗ v − a(α ⊗ v)) = (λaα) ⊗ v − (λa)(α ⊗ v)
                       = (aλα) ⊗ v − a(λ(α ⊗ v))
                       = (aλα) ⊗ v − a((λα) ⊗ v) ∈ A_0,
λ(α ⊗ (av) − a(α ⊗ v)) = (λα) ⊗ (av) − a(λ(α ⊗ v))
                       = (λα) ⊗ (av) − a((λα) ⊗ v) ∈ A_0.
Thus multiplication by λ preserves A_0, and is thus well-defined on F′ ⊗ V. Thus F′ ⊗ V is an F′-vector space.
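One can see this change of base concretely when F = R, F′ = C, and V = R^n: under the standard identification C ⊗_R R^n ≅ C^n, a pure tensor α ⊗ v corresponds to the complex vector αv, and the F′-scalar multiplication defined above becomes ordinary complex scaling. A minimal sketch of that identification, assuming NumPy:

    import numpy as np

    # Identify C ⊗_R R^n with C^n: the pure tensor alpha ⊗ v corresponds to alpha * v.
    def pure_tensor(alpha: complex, v: np.ndarray) -> np.ndarray:
        return alpha * v.astype(complex)

    v = np.array([1.0, 2.0, 3.0])
    alpha, lam = 2 - 1j, 0.5 + 3j

    # lambda * (alpha ⊗ v) should equal (lambda * alpha) ⊗ v.
    assert np.allclose(lam * pure_tensor(alpha, v), pure_tensor(lam * alpha, v))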
Problem 4: Prove the universal property for tensor products. In other words, show that for any vector spaces U, V, and W there is a bijection
{bilinear maps U × V → W} ←→ {linear maps U ⊗ V → W}.
Solution: Note that there exists a bilinear map
ϕ : U × V → U ⊗ V
which maps (u, v) to u ⊗ v. Thus we have a function
R : {linear maps U ⊗ V → W} −→ {bilinear maps U × V → W}
defined by mapping a linear map T : U ⊗ V → W to the bilinear map T ◦ ϕ. We need to show that this map is a bijection.
First we check that R is injective. Recall that S = {u ⊗ v | u ∈ U, v ∈ V} is a spanning set for U ⊗ V; thus any linear map T : U ⊗ V → W is uniquely determined by its values on S. If T, T′ : U ⊗ V → W satisfy T ◦ ϕ = T′ ◦ ϕ, then by definition T|_S = T′|_S; thus T = T′. Thus R is injective.
Now we need to check that R is surjective. Let T : U × V → W be a bilinear map. We need to show that there exists a linear map T′ : U ⊗ V → W such that T′ ◦ ϕ = T. We define
T′(u ⊗ v) = T(u, v).
If this is well-defined then it satisfies the desired condition; thus we need to check that it is zero on A_0. We check this on the generators of A_0:
T′((u + u′) ⊗ v − u ⊗ v − u′ ⊗ v) = T(u + u′, v) − T(u, v) − T(u′, v) = T(u + u′, v) − T(u + u′, v) = 0,
T′(u ⊗ (v + v′) − u ⊗ v − u ⊗ v′) = T(u, v + v′) − T(u, v) − T(u, v′) = T(u, v + v′) − T(u, v + v′) = 0,
T′((λu) ⊗ v − λ(u ⊗ v)) = T(λu, v) − λT(u, v) = λT(u, v) − λT(u, v) = 0,
T′(u ⊗ (λv) − λ(u ⊗ v)) = T(u, λv) − λT(u, v) = λT(u, v) − λT(u, v) = 0.
Thus it is well-defined and R is surjective.
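In coordinates this correspondence is easy to see for U = R^m, V = R^n, W = R: a bilinear map is T(u, v) = uᵀBv for some m × n matrix B, the pure tensor u ⊗ v can be identified with the Kronecker product of the coordinate vectors, and the corresponding linear map T′ is then given by the flattened matrix. A minimal sketch of this identification, assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(0)
    m, n = 3, 4
    B = rng.standard_normal((m, n))   # encodes the bilinear map T(u, v) = u^T B v
    u, v = rng.standard_normal(m), rng.standard_normal(n)

    T_of_u_v = u @ B @ v                                 # T(u, v)
    T_prime_of_tensor = B.reshape(-1) @ np.kron(u, v)    # T'(u ⊗ v), with u ⊗ v ~ np.kron(u, v)

    assert np.isclose(T_of_u_v, T_prime_of_tensor)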
Problem 5: Let {ui}_{i=1}^{m} and {vj}_{j=1}^{n} be bases of U and V, respectively. Show that a general element w = Σ_{i=1}^{m} Σ_{j=1}^{n} wij ui ⊗ vj is the sum of r pure tensors if and only if the m × n matrix (wij) has rank at most r.
Solution: Note that if we replace the basis {ui}_{i=1}^{m} with the basis
(u1, . . . , u_{i−1}, λui + uj, u_{i+1}, . . . , um)
(where λ ≠ 0 and j ≠ i), then the matrix for w written in terms of the new basis is the same as (wij), except that row i is scaled by λ^{−1} and row j turns into Rj − λ^{−1}Ri. These are invertible row operations, so we can do row reductions on the matrix (wij) by replacing the basis {ui}_{i=1}^{m}. Similarly, we can do column reductions by replacing the basis {vj}_{j=1}^{n}. A matrix has rank r if and only if it can be reduced, using row and column reductions, to a matrix with r entries equal to 1, no two in the same row or column, and all other entries 0. If the matrix for w has only r nonzero entries then we can write w as a sum of r pure tensors (one for each nonzero entry, since wij ui ⊗ vj = (wij ui) ⊗ vj is pure). Thus if the matrix has rank r then, by doing the changes of basis corresponding to row and column operations, we can write w as a sum of r pure tensors.
Note that if w = (a1 u1 + · · · + am um) ⊗ (b1 v1 + · · · + bn vn) then wij = ai bj, so that every row of (wij) is a scalar multiple of the row vector (b1, . . . , bn) and (wij) has rank at most 1. Thus if w = x1 ⊗ y1 + · · · + xr ⊗ yr, each row of (wij) lies in the span of the r coordinate row vectors of y1, . . . , yr, so the row space of (wij) has dimension at most r. Thus the rank of the matrix (wij) is at most r, as desired.
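Numerically, this is the familiar statement that a matrix has rank at most r exactly when it is a sum of r outer products (the coordinate matrices of pure tensors); the SVD produces such a decomposition. A small sketch assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(1)
    m, n, r = 5, 4, 2

    # A sum of r pure tensors x_k ⊗ y_k has coefficient matrix sum_k outer(x_k, y_k).
    W = sum(np.outer(rng.standard_normal(m), rng.standard_normal(n)) for _ in range(r))
    assert np.linalg.matrix_rank(W) <= r

    # Conversely, the SVD rewrites W as a sum of rank(W) outer products.
    U, s, Vt = np.linalg.svd(W)
    W_rebuilt = sum(s[k] * np.outer(U[:, k], Vt[k]) for k in range(r))
    assert np.allclose(W, W_rebuilt)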
Problem 6: This next problem will investigate tensor products of modules. Note that the definitions of “linear” and “bilinear” still work for rings instead of fields. Suppose that R is a commutative ring with unit, and M and N are R-modules. We define the tensor product M ⊗_R N to be the R-module such that there is a bijection
{bilinear maps M × N → S} ←→ {linear maps M ⊗_R N → S}
for any R-module S.
(a) Suppose that M = R^m and N = R^n. Show that M ⊗_R N = R^mn.
(b) Suppose that R → S is a homomorphism of rings. Show that S is an R-module. Show that S ⊗_R M is an S-module.
(c) Now suppose that R = Z. As we discussed before, Z-modules are just abelian groups. What is Z ⊗_Z Z/nZ? What is Z/pZ ⊗_Z Z/qZ for not necessarily distinct primes p and q? Find a general description of Z/mZ ⊗_Z Z/nZ.
Solution: Note that the construction of the tensor product that we did for vector spaces works equally well in the case of R-modules, since we did not use any properties of fields in the construction. Before we begin the problem we will show that for any R-modules M, M′ and N,
(M ⊕ M′) ⊗ N ≅ (M ⊗ N) ⊕ (M′ ⊗ N).
We define
ϕ : (M ⊕ M′) ⊗ N → (M ⊗ N) ⊕ (M′ ⊗ N)
by mapping (m, m′) ⊗ n to (m ⊗ n, m′ ⊗ n). To check that this is a well-defined morphism of R-modules we need to check that it is 0 on the generators of A_0. We check:
ϕ(((m1, m1′) + (m2, m2′)) ⊗ n − (m1, m1′) ⊗ n − (m2, m2′) ⊗ n)
    = ((m1 + m2) ⊗ n, (m1′ + m2′) ⊗ n) − (m1 ⊗ n, m1′ ⊗ n) − (m2 ⊗ n, m2′ ⊗ n)
    = ((m1 + m2) ⊗ n − m1 ⊗ n − m2 ⊗ n, (m1′ + m2′) ⊗ n − m1′ ⊗ n − m2′ ⊗ n)
    = (0, 0) ∈ (M ⊗ N) ⊕ (M′ ⊗ N),
ϕ((m, m′) ⊗ (n + n′) − (m, m′) ⊗ n − (m, m′) ⊗ n′)
    = (m ⊗ (n + n′), m′ ⊗ (n + n′)) − (m ⊗ n, m′ ⊗ n) − (m ⊗ n′, m′ ⊗ n′)
    = (m ⊗ (n + n′) − m ⊗ n − m ⊗ n′, m′ ⊗ (n + n′) − m′ ⊗ n − m′ ⊗ n′)
    = (0, 0) ∈ (M ⊗ N) ⊕ (M′ ⊗ N).
The other two relations follow analogously.
We define φ : (M ⊗ N) ⊕ (M′ ⊗ N) → (M ⊕ M′) ⊗ N by mapping (m ⊗ n, m′ ⊗ n′) to (m, 0) ⊗ n + (0, m′) ⊗ n′. We check that this is well-defined:
φ((m1 + m2) ⊗ n − m1 ⊗ n − m2 ⊗ n, (m1′ + m2′) ⊗ n′ − m1′ ⊗ n′ − m2′ ⊗ n′)
    = (m1 + m2, 0) ⊗ n − (m1, 0) ⊗ n − (m2, 0) ⊗ n + (0, m1′ + m2′) ⊗ n′ − (0, m1′) ⊗ n′ − (0, m2′) ⊗ n′
    = 0.
The other relations follow analogously.
Thus these are well-defined maps. We have
φ(ϕ((m, m′) ⊗ n)) = φ(m ⊗ n, m′ ⊗ n) = (m, 0) ⊗ n + (0, m′) ⊗ n = (m, m′) ⊗ n
and
ϕ(φ(m ⊗ n, m′ ⊗ n′)) = ϕ((m, 0) ⊗ n + (0, m′) ⊗ n′) = (m ⊗ n, 0 ⊗ n) + (0 ⊗ n′, m′ ⊗ n′)
= (m ⊗ n + 0 ⊗ n′, 0 ⊗ n + m′ ⊗ n′) = (m ⊗ n, m′ ⊗ n′),
where the last step uses that 0 ⊗ n = 0, verified in the observation below.
Thus these are mutually inverse isomorphisms.
We make one extra observation before starting. For any m ∈ M , n ∈ N ,
0 ⊗ n = m ⊗ 0 = 0 ∈ M ⊗_R N.
Indeed,
0 ⊗ n = (0 + 0) ⊗ n = 0 ⊗ n + 0 ⊗ n,
and subtracting 0⊗n from each side gives us the desired equality. The other one follows analogously.
We are now ready to solve the problem.
(a) Note that R^n is the n-fold product of R with itself. Thus applying the above we know that
R^n ⊗_R R^m ≅ (R ⊗_R R^m)^n ≅ R^mn.
The last step follows because R ⊗_R R^m ≅ R^m. Indeed, we note that for any pure tensor r ⊗ v ∈ R ⊗_R R^m we have
r ⊗ v = (r · 1) ⊗ v = r(1 ⊗ v) = 1 ⊗ rv.
So we have a map R ⊗_R R^m → R^m which takes r ⊗ v to rv. We also have an inverse map which takes v ∈ R^m to 1 ⊗ v.
(b) The map R → S makes S into an R-module via r · s = f(r)s, where f : R → S is the given homomorphism; the module axioms follow from the ring axioms in S together with the fact that f is a ring homomorphism. For the S-module structure on S ⊗_R M, the proof we wrote in Problem 3 works verbatim here.
(c) We define mutually inverse isomorphisms Z ⊗_Z Z/n → Z/n and Z/n → Z ⊗_Z Z/n. The first takes a ⊗ [b] to [ab], and the second takes [b] to 1 ⊗ [b]. Thus Z ⊗_Z Z/n ≅ Z/n.
Now suppose that m and n are relatively prime. Let a ⊗ b ∈ Z/m ⊗_Z Z/n. Since m and n are relatively prime, n has a multiplicative inverse n′ in Z/m. Then we have
a ⊗ b = (1 · a) ⊗ b = (nn′a) ⊗ b = n(n′a ⊗ b) = (n′a) ⊗ (nb).
But n = 0 in Z/n, so nb = 0. Thus a ⊗ b = (n′a) ⊗ 0 = 0, so all pure tensors are equal to 0. Therefore Z/m ⊗_Z Z/n ≅ 0 if m and n are relatively prime.
On the other hand, if m | n then Z/m ⊗_Z Z/n ≅ Z/m. Indeed, note that
a ⊗ b = (a · 1) ⊗ b = a(1 ⊗ b) = a(1 ⊗ (b · 1)) = ab(1 ⊗ 1).
Thus Z/m ⊗_Z Z/n is generated by 1 ⊗ 1, and is thus isomorphic to Z/k for some k. Note that
m(1 ⊗ 1) = (m · 1) ⊗ 1 = 0 ⊗ 1 = 0,
so k | m. On the other hand, we have a surjective homomorphism Z/m ⊗_Z Z/n → Z/m given by a ⊗ b ↦ ab (well-defined since m | n). Thus m | k, and we see that k = m, as desired.
Write m = p1^k1 · · · pℓ^kℓ and n = p1^m1 · · · pℓ^mℓ, where some of the ki or mi can be 0. We can then write
Z/mZ ≅ Z/p1^k1 Z × · · · × Z/pℓ^kℓ Z
and
Z/nZ ≅ Z/p1^m1 Z × · · · × Z/pℓ^mℓ Z.
Then
Z/mZ ⊗_Z Z/nZ ≅ ∏_{i=1}^{ℓ} ∏_{j=1}^{ℓ} Z/pi^ki Z ⊗ Z/pj^mj Z.
If i ≠ j then that factor is the trivial group, since pi^ki and pj^mj are relatively prime. On the other hand, if i = j then by the divisibility case above the tensor product is Z modulo the smaller power of pi. Thus we have
Z/mZ ⊗_Z Z/nZ ≅ ∏_{i=1}^{ℓ} Z/pi^min(ki, mi) Z ≅ Z/gcd(m, n)Z.
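As a quick arithmetic sanity check of the last identity, the following sketch (plain Python, standard library only) factors m and n, forms ∏ pi^min(ki, mi), and compares the result with gcd(m, n):

    import math

    def prime_factorization(n: int) -> dict:
        """Return {prime: exponent} for n >= 2 by trial division (and {} for n = 1)."""
        factors, p = {}, 2
        while p * p <= n:
            while n % p == 0:
                factors[p] = factors.get(p, 0) + 1
                n //= p
            p += 1
        if n > 1:
            factors[n] = factors.get(n, 0) + 1
        return factors

    def gcd_from_factorizations(m: int, n: int) -> int:
        fm, fn = prime_factorization(m), prime_factorization(n)
        result = 1
        for p in set(fm) | set(fn):
            result *= p ** min(fm.get(p, 0), fn.get(p, 0))
        return result

    for m, n in [(12, 18), (7, 20), (100, 85), (36, 48)]:
        assert gcd_from_factorizations(m, n) == math.gcd(m, n)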