1300 Linear Algebra and Vector Geometry
R. Craigen
Office: MH 523
Email: [email protected]
Feb. 7, 2017
Properties of matrix arithmetic §1.4
Our operations allow matrices to be handled algebraically
almost as if they were single objects ... almost like numbers
We enumerate their properties ... and their NON-properties.
These are important tools ... and pitfalls to avoid!
Theorem 1.4.1 For matrices A, B, C and scalars a, b (whenever the
operations shown are defined):
(a) A + B = B + A                Commutative law for (matrix) +
(b) A + (B + C) = (A + B) + C    Associative law for +
(c) A(BC) = (AB)C                Associative law for ·
(d) A(B + C) = AB + AC           Left distributive law
(e) (B + C)A = BA + CA           Right distributive law
(f) A(B − C) = AB − AC
(g) (B − C)A = BA − CA
(h) a(B + C) = aB + aC
(i) a(B − C) = aB − aC
(j) (a + b)C = aC + bC
(k) (a − b)C = aC − bC
(l) a(bC) = (ab)C
(m) a(BC) = (aB)C
Proving the basic properties
To prove a claim that two matrices of the same size are equal,
show that every entry is equal. This works for all parts of
Theorem 1.4.1. Typically:
(f) A(B − C) = AB − AC
Proof: If either side is defined then A must be m × n while B
and C are both n × q (for some m, n, q). So both expressions
give an m × q matrix. Now,
(A(B − C))ij = ai1 (b1j − c1j ) + ai2 (b2j − c2j ) + · · · + ain (bnj − cnj )
= ai1 b1j − ai1 c1j + ai2 b2j − ai2 c2j + · · · + ain bnj − ain cnj
= (ai1 b1j + ai2 b2j + · · · + ain bnj ) − (ai1 c1j + ai2 c2j + · · · + ain cnj )
= (AB)ij − (AC)ij
It follows that A(B − C) = AB − AC.
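Properties like (f) are easy to spot-check numerically. A minimal sketch using NumPy (the random matrices here are arbitrary illustrative choices, not from the lecture):

```python
# Spot-check Theorem 1.4.1(f): A(B - C) = AB - AC, on random integer matrices.
import numpy as np

rng = np.random.default_rng(seed=1)
A = rng.integers(-5, 6, size=(2, 3))   # 2x3
B = rng.integers(-5, 6, size=(3, 4))   # 3x4
C = rng.integers(-5, 6, size=(3, 4))   # 3x4

# Both sides are 2x4 matrices and agree entry by entry.
assert np.array_equal(A @ (B - C), A @ B - A @ C)
```

A check like this does not prove the law, but it is a quick way to catch a wrong identity before trusting it.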
Proving Associativity of matrix multiplication
The hardest ... and the most amazing ... of these properties!
(c) A(BC ) = (AB)C
Proof: If A is m × n then B must be n × p (say) and C must be
p × q, for some m, n, p, q. Both sides are thus defined and yield
m × q matrices. Now,
(A(BC ))ij = ai1 (BC )1j + ai2 (BC )2j + · · · + ain (BC )nj
= ai1 (b11 c1j + · · · + b1p cpj ) + · · · + ain (bn1 c1j + · · · + bnp cpj )
= ai1 b11 c1j + · · · + aih bhk ckj + · · · + ain bnp cpj
where h runs over 1, 2, . . . , n and k runs over 1, 2, . . . , p.
Expanding ((AB)C )ij similarly gives the same sum
It follows that A(BC ) = (AB)C .
Example: matrix associativity

Let A = (1 2 3 4),

        ( 1  1 )
    B = ( 1 -1 ) ,    C = ( -1  2 )
        ( 0  1 )          (  5 -2 )
        ( 1  0 )

We calculate the product in two different ways. First,

         ( 1  1 )               (  4  0 )
    BC = ( 1 -1 ) ( -1  2 )  =  ( -6  4 )
         ( 0  1 ) (  5 -2 )     (  5 -2 )
         ( 1  0 )               ( -1  2 )

so A(BC) = (1 2 3 4)(BC) = (3 10).

Whereas AB = (1 2 3 4)B = (7 2), so

    (AB)C = (7 2) ( -1  2 )  =  (3 10)
                  (  5 -2 )

Either grouping gives (3 10).
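The same worked example can be re-run by machine; this sketch just redoes the two groupings in NumPy:

```python
# Verify the worked example: both groupings of the triple product give (3 10).
import numpy as np

A = np.array([[1, 2, 3, 4]])                     # 1x4 row vector
B = np.array([[1, 1], [1, -1], [0, 1], [1, 0]])  # 4x2
C = np.array([[-1, 2], [5, -2]])                 # 2x2

assert np.array_equal(A @ (B @ C), (A @ B) @ C)
print(A @ (B @ C))  # [[ 3 10]]
```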
Notes about matrix multiplication
Since multiplication is associative, it is safe to write ABC for either A(BC) or (AB)C.
Similarly, we can write A + B + C for either A + (B + C) or (A + B) + C.
However, do not jump to the conclusion that matrix
multiplication behaves as with scalars!
Commutativity fails!
For example, in most cases AB ≠ BA !!
Matrix multiplication is, in general, not commutative, not even
when AB and BA are the same size!
    (  1 -1 ) ( 2 -2 )   ( 0  0 )
    ( -1  1 ) ( 2 -2 ) = ( 0  0 ) ,  but

    ( 2 -2 ) (  1 -1 )   ( 4 -4 )
    ( 2 -2 ) ( -1  1 ) = ( 4 -4 )
Always keep track of which matrix is on the left and which is on
the right. In the product AB we say B is premultiplied by A
and that A is postmultiplied by B.
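Non-commutativity is easy to confirm numerically; a sketch using the two matrices from the example above:

```python
# AB and BA are both defined and both 2x2, yet AB != BA.
import numpy as np

A = np.array([[1, -1], [-1, 1]])
B = np.array([[2, -2], [2, -2]])

assert np.array_equal(A @ B, np.zeros((2, 2), dtype=int))  # AB is the zero matrix
assert (B @ A).tolist() == [[4, -4], [4, -4]]              # BA is not
```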
Cancellation fails!
What if AB = AC? What can we say about B and C?
Be careful! In general you cannot cancel A in such cases!
EG:
    ( 1 2 ) ( 4  5 )   (  6 3 )   ( 1 2 ) ( 2 1 )
    ( 2 4 ) ( 1 -1 ) = ( 12 6 ) = ( 2 4 ) ( 2 1 )
But
    ( 4  5 )    ( 2 1 )
    ( 1 -1 ) ≠  ( 2 1 )
So we cannot cancel
    ( 1 2 )
    ( 2 4 )
Similarly you cannot conclude that B = C simply because
BA = CA—we cannot cancel a common factor in a product
either from the left or from the right!
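A numerical sketch of the cancellation failure, using the matrices from the example above:

```python
# AB = AC does not imply B = C.
import numpy as np

A = np.array([[1, 2], [2, 4]])
B = np.array([[4, 5], [1, -1]])
C = np.array([[2, 1], [2, 1]])

assert np.array_equal(A @ B, A @ C)   # both products equal [[6, 3], [12, 6]]
assert not np.array_equal(B, C)       # ...yet B != C, so A cannot be cancelled
```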
The zero matrix
(of order n ... or size m × n)
The m × n matrix with all entries equal to zero is called the zero matrix (of
size m × n, or order n if m = n), denoted 0 = 0m×n (or 0n if m = n).
EG:
    01×3 = (0 0 0);   02 = ( 0 0 ) ;   02×3 = ( 0 0 0 )
                           ( 0 0 )            ( 0 0 0 )
Theorem 1.4.2 If c is a scalar, A is any matrix and 0 is a
matrix of the appropriate size, then
1. A + 0 = 0 + A = A
[0 is an additive identity]
2. A − 0 = A
3. A − A = A + (−A) = 0
[−A = (−1)A is the additive inverse]
4. 0A = 0 (whether the first “0” is taken as a scalar or a matrix;
the matrix product A0 is also always 0)
5. If cA = 0 then either c = 0 or A = 0.
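A quick NumPy sketch of Theorem 1.4.2 (the matrix A here is an arbitrary choice):

```python
# Check the zero-matrix properties of Theorem 1.4.2 on a sample matrix.
import numpy as np

A = np.array([[1, -2], [3, 0]])
Z = np.zeros((2, 2), dtype=int)   # the zero matrix 0_2

assert np.array_equal(A + Z, A)   # 1. A + 0 = A
assert np.array_equal(A - Z, A)   # 2. A - 0 = A
assert np.array_equal(A - A, Z)   # 3. A - A = 0
assert np.array_equal(0 * A, Z)   # 4. 0A = 0 (scalar zero)
assert np.array_equal(A @ Z, Z)   #    and the matrix product A0 = 0
```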
The product of nonzero matrices can be zero
As we have just seen, if the product of a scalar and a matrix is
zero, then either the scalar or the matrix is zero.
cA = 0 =⇒ c = 0 or A = 0
But the same is not true for the product of two matrices!
EG:
    (  1 -1 ) ( 2 2 )   ( 0 0 )
    ( -1  1 ) ( 2 2 ) = ( 0 0 )
and

                    (  1  1 )
    ( 1 1  1  1 )   ( -1 -1 )   ( 0 0 )
    ( 1 1 -1 -1 )   (  1 -1 ) = ( 0 0 )
                    ( -1  1 )
When AB = 0 but A ≠ 0 and B ≠ 0, A and B are zero divisors.
Zero divisors are the basic reason why cancellation fails in
matrix arithmetic. Note that if A, B are zero divisors then
AB = 0 = 0B, so AB = 0B but obviously A ≠ 0: cancellation fails!
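The rectangular zero-divisor example above can be checked the same way:

```python
# Nonzero A (2x4) and B (4x2) whose product is the 2x2 zero matrix.
import numpy as np

A = np.array([[1, 1, 1, 1],
              [1, 1, -1, -1]])
B = np.array([[1, 1],
              [-1, -1],
              [1, -1],
              [-1, 1]])

assert A.any() and B.any()                                 # neither factor is zero
assert np.array_equal(A @ B, np.zeros((2, 2), dtype=int))  # yet AB = 0
```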
The identity matrix
(of order n)
Another special matrix, called the (multiplicative) identity
matrix (of a given order n) is defined as follows

    I = In = ( 1 0 ... 0 )
             ( 0 1 ... 0 )
             ( . .     . )
             ( 0 0 ... 1 )
I.e., the n × n matrix with diagonal entries 1; all other entries are 0
EG:
    I1 = (1);   I2 = ( 1 0 ) ;   I3 = ( 1 0 0 )
                     ( 0 1 )          ( 0 1 0 )
                                      ( 0 0 1 )

           ( 1 0 0 ) ( a b )   ( a b )   ( a b ) ( 1 0 )
    I3 A = ( 0 1 0 ) ( c d ) = ( c d ) = ( c d ) ( 0 1 ) = A I2
           ( 0 0 1 ) ( e f )   ( e f )   ( e f )
I acts as a multiplicative identity from both left and right.
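A NumPy sketch of the identity-matrix facts, with an arbitrary 3 × 2 matrix A:

```python
# I_n acts as a multiplicative identity from both sides.
import numpy as np

A = np.array([[1, 2], [3, 4], [5, 6]])  # a 3x2 matrix
I3 = np.eye(3, dtype=int)
I2 = np.eye(2, dtype=int)

assert np.array_equal(I3 @ A, A)  # premultiplying by I3 leaves A unchanged
assert np.array_equal(A @ I2, A)  # postmultiplying by I2 does too
```

Note the sizes: a 3 × 2 matrix takes I3 on the left but I2 on the right.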
The inverse of a matrix
and singular matrices
A (multiplicative) inverse of matrix A is a matrix B such that
AB = BA = I
EG:
    ( 1 -1 ) (  1/2  1/2 )   ( 1 0 )   (  1/2  1/2 ) ( 1 -1 )
    ( 1  1 ) ( -1/2  1/2 ) = ( 0 1 ) = ( -1/2  1/2 ) ( 1  1 )
Not all matrices have inverses.
If B is an inverse of A we say A is invertible, or nonsingular.
If not, then we say that A is singular (or non-invertible).
Note that if B is an inverse of A then A is an inverse of B.
Also note that a matrix commutes with its inverse.
Notation: If B and A are mutual inverses then we write
B = A−1. (Also A = B−1.)
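The 2 × 2 inverse pair from the example can be verified numerically; np.linalg.inv recovers the same inverse:

```python
# A and B are mutual inverses: AB = BA = I.
import numpy as np

A = np.array([[1.0, -1.0], [1.0, 1.0]])
B = np.array([[0.5, 0.5], [-0.5, 0.5]])

assert np.allclose(A @ B, np.eye(2))     # AB = I
assert np.allclose(B @ A, np.eye(2))     # BA = I, so B = A^(-1)
assert np.allclose(np.linalg.inv(A), B)  # NumPy computes the same inverse
```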
Feb. 7 summary
• Read §1.4 and most sections about Matrices on Math 1300 Wiki
• Do all recommended homework for §1.4. Note you may be
  working ahead of lecture; some topics will be covered Thursday.
  Also attempt Q's 51–58 and the T/F questions.
• Terms learned: Commutative law for +; associative laws for +
  and ·; left and right distributive laws; premultiplying (vs.)
  postmultiplying by a matrix; the zero matrix of size (order)
  m × n (n); additive identity; additive inverse; zero divisors; the
  identity matrix of order n; (multiplicative) inverse of a matrix
• Key concepts: Many rules for working with matrices are just
  what you'd expect. MANY ARE NOT; BE CAUTIOUS!
  KNOW THE RESULTS AND RELY ON THEM. Also your
  fluency will be improved by working through proofs of the rules.
  Matrices behave "sort of" like numbers, but multiplication is not
  commutative, cancellation does not work in general, and unlike
  with scalars you can have nonzero "zero divisors". 0m×n is an
  additive identity and In is a multiplicative identity.
• Methods learned: Proving matrix identities by (a) showing two
  expressions must produce matrices of the same size; and (b)
  examining corresponding entries to show them to be equal