Homework 15, Mathematics 1
submit by 1.2.
Only problems 1b, 2b, 3bc, and 4a will be graded.
Problem 1: Let F be a field. Prove that
(a) The multiplicative identity 1 is unique in F, i.e., if for some α ∈ F, α·β = β
for all β ∈ F, then α = 1.
(b) [3 points] For any α ∈ F, its multiplicative inverse α⁻¹ ∈ F is unique.
Problem 2: Consider the field Z11 = {0, 1, 2, . . . , 10}, where a + b (resp. a · b)
is defined as the usual addition (resp. multiplication) of integers, minus some
multiple of 11 such that the sum (resp. product) falls into {0, 1, . . . , 10}. For
each a = 1, 2, . . . , 10, find
(a) −a;
(b) [3 points] a⁻¹.
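The answers to Problem 2 are meant to be found by hand, but they can be verified by a brute-force search over Z11; the following sketch (function names are illustrative, not part of the assignment) checks each candidate against the definition of an inverse.

```python
# Brute-force verification of inverses in Z_11.
# Assumes the field definition from Problem 2: arithmetic of integers
# reduced modulo 11. Since 11 is prime, every nonzero element has a
# multiplicative inverse.
p = 11

def additive_inverse(a, p=11):
    """Return -a in Z_p, i.e. the b in {0, ..., p-1} with (a + b) % p == 0."""
    return (-a) % p

def multiplicative_inverse(a, p=11):
    """Return a^(-1) in Z_p by exhaustive search over nonzero elements."""
    for b in range(1, p):
        if (a * b) % p == 1:
            return b
    raise ValueError(f"{a} has no multiplicative inverse mod {p}")

for a in range(1, p):
    print(a, additive_inverse(a), multiplicative_inverse(a))
```

For example, 3⁻¹ = 4 in Z11, since 3 · 4 = 12 = 11 + 1.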
Problem 3: Let V be a vector space over a field F. Prove that
(a) For all α ∈ F , α · ~0 = ~0.
(b) [3 points] For all u ∈ V , 0 · u = ~0.
Hint: Start with rewriting 0 · u + (−u) according to the axioms.
(c) [2 points] For all u ∈ V , (−1) · u = −u.
Problem 4: Prove Proposition 2.2.2:
(a) [3 points] If a system of vectors U = {u1, . . . , un} is complete and some
vector ui ∈ U can be expressed as a linear combination of the vectors in U \ {ui},
then the system of vectors U \ {ui} is also complete.
(b) If a system of vectors U = {u1, . . . , un} is linearly independent and some
vector u ∉ U cannot be expressed as a linear combination of the vectors in U,
then the system of vectors U ∪ {u} is also linearly independent.
(c) One can generalize the notions of linear independence and completeness
even to infinite systems of vectors in the following way. We say that the set
(system) of vectors U is linearly independent, if every finite subset of vectors
from U is linearly independent. We say that the set of vectors U is complete
in the vector space V , if every vector from V can be expressed as a linear
combination of vectors from some finite subset of U . Prove the statements (a)
and (b) even for infinite systems U .
Problem 5: Using the definitions from Problem 4c (and recalling that a
system of vectors is a basis when it is both linearly independent and complete,
also in the case of infinite systems), prove that {1, x, x², . . . , xⁿ, . . . }
is an (infinite) basis of the vector space of all polynomials with real coefficients
(over the field of real numbers).
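To illustrate the definition of completeness from Problem 4c in this setting: although the system is infinite, any single polynomial only ever uses a finite subset of it. For instance,

p(x) = 2 − 3x + 5x³ = 2 · 1 + (−3) · x + 0 · x² + 5 · x³

is a linear combination of the finite subset {1, x, x², x³}.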