Proceedings of the American Mathematical Society, 3, 1952, pp. 382-388.

A NOTE ON BOOLEAN MATRIX THEORY¹

R. DUNCAN LUCE

Received by the editors December 28, 1950 and, in revised form, August 17, 1951.

¹ The work included in this paper was carried out during the summer of 1949 when the author was a member of the Research Center for Group Dynamics, University of Michigan.
1. Introduction. Let $U$ be a Boolean algebra of at least two elements. If $a, b \in U$, intersection, union, complementation, and inclusion are denoted by $a \wedge b$, $a \vee b$, $a'$, and $a \subset b$ respectively. Form the set $V$ of matrices $A, B, \ldots$ of order $n$ having entries $A_{ij}, B_{ij}, \ldots$, $i, j = 1, 2, \ldots, n$, which are members of $U$. The restriction to square matrices is inessential to some of the theorems; in particular, Theorems 5.1 and 5.2 are true for rectangular matrices subject to the usual restrictions on the range of the indices to make the indicated multiplication and equality meaningful.
In the set $V$ define intersection, union, complementation, and inclusion by:

$$(A \cap B)_{ij} = A_{ij} \wedge B_{ij}, \qquad (A \cup B)_{ij} = A_{ij} \vee B_{ij}, \qquad (A')_{ij} = A_{ij}',$$

$$A \subset B \text{ if and only if for every } i, j, \; A_{ij} \subset B_{ij}.$$

Under these definitions $V$ forms a Boolean algebra, which is known as the algebra of Boolean matrices.²

² G. Birkhoff, Lattice theory, Amer. Math. Soc. Colloquium Publications, vol. 25, rev. ed., New York, 1948, p. 213.
A multiplication may be defined in $V$ as follows:

$$(AB)_{ik} = \bigcup_j (A_{ij} \wedge B_{jk}).$$

If $o$ and $e$ are the null and universal elements of $U$, then the zero (null), identity, and universal matrices $O$, $I$, and $E$ are defined as:

$$O_{ij} = o, \qquad I_{ij} = o \text{ if } i \ne j \text{ and } e \text{ if } i = j, \qquad E_{ij} = e.$$

Under these definitions and the given multiplication, $V$ forms a lattice-ordered semigroup with zero.³

³ Ibid., pp. 201-213.
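For a concrete check, here is a minimal sketch of these operations over the two-element Boolean algebra $U = \{o, e\}$, modelled as {False, True} with numpy; the helper name bmul and the test matrices are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def bmul(A, B):
    """Boolean matrix product: (AB)_ik = U_j (A_ij ^ B_jk)."""
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

n = 3
O = np.zeros((n, n), dtype=bool)   # null matrix: every entry o
I = np.eye(n, dtype=bool)          # identity: e on the diagonal, o elsewhere
E = np.ones((n, n), dtype=bool)    # universal matrix: every entry e

A = np.array([[0, 1, 0], [1, 1, 0], [0, 0, 1]], dtype=bool)
B = np.array([[1, 0, 0], [0, 0, 1], [1, 0, 1]], dtype=bool)

print(A & B, A | B, ~A, sep="\n")                           # intersection, union, complement
assert (bmul(A, bmul(B, A)) == bmul(bmul(A, B), A)).all()   # A(BA) = (AB)A
assert (bmul(I, A) == A).all() and (bmul(A, E) <= E).all()  # IA = A, AE <= E
assert not bmul(O, A).any()                                 # OA = O
```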
As in ordinary matrix theory the transpose of $A$, $A^T$, is defined by $(A^T)_{ij} = A_{ji}$, and the elements of the transpose will be denoted by $A^T_{ij}$. The following properties follow either immediately from the definitions or by virtue of $V$ being a lattice-ordered semigroup:

$$A(BC) = (AB)C, \qquad A \subset B \text{ implies } AC \subset BC \text{ and } CA \subset CB \text{ for all } C,$$

$$O \cap A = OA = AO = O \text{ and } IA = AI = A \text{ for all } A,$$

$$(A^T)^T = A, \qquad (AB)^T = B^T A^T, \qquad (A \cup B)^T = A^T \cup B^T, \qquad (A \cap B)^T = A^T \cap B^T, \qquad (A')^T = (A^T)'.$$
2. Symmetry and skew-symmetry. In analogy to ordinary matrix theory we wish to define concepts of symmetry and skew-symmetry; it is clear that the formal definition of symmetry may be carried over without change, but this is not true for skew-symmetry. Our choice for the latter (Definition 2) has in its favor a form parallel to one definition of symmetry and the fact that it allows Theorem 2.1 to be proved.
DEFINITION 1. A Boolean matrix $A$ is said to be symmetric if $A^T \cap A' = O$.
DEFINITION 2. A Boolean matrix $A$ is said to be skew-symmetric if $A^T \cap A = O$.
Evidently the property $A = A^T$ implies $A$ is symmetric, and conversely it is implied by symmetry, for $A^T \cap A' = O$ implies $A^T \subset A$. Taking the transpose, $A \subset A^T$, whence $A = A^T$. It is equally easy to show that $A$ is skew-symmetric if and only if $A = A \cap A^{T\prime}$. Moreover, $A$ is skew-symmetric if for some $B$, $A = B \cap B^{T\prime}$. For if there exists such a $B$, then $A' = B' \cup B^T$ and so $A^T = B^T \cap B' \subset B^T \cup B' = A'$, whence $A^T = A^T \cap A'$, or, what is the same, $A = A \cap A^{T\prime}$.
By the properties of the transpose mentioned in §1 one easily proves that symmetry is a property invariant under union, intersection, and complementation. Skew-symmetry is invariant under intersection, for if $A \cap A^T = O$ and $B \cap B^T = O$, then $(A \cap B) \cap (A \cap B)^T = (A \cap A^T) \cap (B \cap B^T) = O$. It is not, however, invariant under complementation, as

$$A = \begin{pmatrix} o & e \\ o & o \end{pmatrix}$$

shows, nor under union, as $A$ and

$$\begin{pmatrix} o & o \\ e & o \end{pmatrix}$$

show.
THEOREM 2.1. Any Boolean matrix can be uniquely decomposed into the disjoint union of a symmetric and a skew-symmetric matrix.

PROOF. If $A$ is so decomposed, the decomposition is unique. For if $A = S \cup Q$ where $S \cap Q = O$, $S = S^T$, and $Q^T \cap Q = O$, then $S = A \cap A^T$ and $Q = A \cap A^{T\prime}$, since $Q \subset Q^{T\prime}$ and $Q \subset S'$. Selecting $S = A \cap A^T$ and $Q = A \cap A^{T\prime}$ proves that the decomposition exists.
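Theorem 2.1 is easy to check computationally; a minimal sketch over $U = \{o, e\}$ (treated as {False, True} in numpy, with an arbitrary illustrative matrix $A$):

```python
import numpy as np

A = np.array([[1, 1, 0], [0, 1, 1], [1, 0, 0]], dtype=bool)

S = A & A.T      # symmetric part S = A n A^T
Q = A & ~A.T     # skew-symmetric part Q = A n A^T'

assert ((S | Q) == A).all()   # A is the union of the two parts
assert not (S & Q).any()      # the union is disjoint: S n Q = O
assert (S == S.T).all()       # S is symmetric
assert not (Q & Q.T).any()    # Q is skew-symmetric: Q n Q^T = O
```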
3. Consistent matrices.
DEFINITION 3. A Boolean matrix $A$ is said to be row (column) consistent if $AE = E$ ($EA = E$).
LEMMA 3.1. $A$ is row (column) consistent if and only if $I \subset AA^T$ ($I \subset A^TA$).

PROOF. By definition $I \subset AA^T$ if and only if $e = \bigcup_j (A_{ij} \wedge A^T_{ji}) = \bigcup_j A_{ij} = \bigcup_j (A_{ij} \wedge E_{jk})$ for every $i, k$. This, by definition, is equivalent to $E = AE$.
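A sketch of Lemma 3.1 in the two-element model (bmul implements the matrix product of §1; the test matrix is an arbitrary assumption):

```python
import numpy as np

def bmul(A, B):
    # Boolean matrix product: (AB)_ik = U_j (A_ij ^ B_jk)
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

A = np.array([[0, 1], [1, 1]], dtype=bool)
n = A.shape[0]
E = np.ones((n, n), dtype=bool)
I = np.eye(n, dtype=bool)

row_consistent = (bmul(A, E) == E).all()             # AE = E: every row contains an e
assert row_consistent == (I <= bmul(A, A.T)).all()   # Lemma 3.1: iff I <= AA^T
```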
LEMMA 3.2. (1) If $A$ and $B$ are row (column) consistent, then $AB$ is row (column) consistent. (2) If $AB$ is row (column) consistent, then $A$ is row consistent ($B$ is column consistent). (3) Neither converse is true.

PROOF. (1) By hypothesis $AE = E$ and $BE = E$; so $(AB)E = A(BE) = AE = E$. (2) By hypothesis $(AB)E = E$ and it is always true that $AE \subset E$ and $BE \subset E$; so $E = (AB)E = A(BE) \subset AE$, whence $AE = E$.
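Part (3) of the lemma can be seen with a small example in the same two-element model; the matrices are illustrative choices:

```python
import numpy as np

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

E = np.ones((2, 2), dtype=bool)
rc = lambda M: (bmul(M, E) == E).all()       # row consistency: ME = E

A = np.array([[1, 0], [1, 0]], dtype=bool)   # row consistent
B = np.array([[1, 1], [0, 0]], dtype=bool)   # not row consistent
assert rc(A) and not rc(B)
assert rc(bmul(A, B))   # yet AB is row consistent, so the converse of (2) fails
```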
This lemma states, in the terminology introduced by Dubreil for semigroup theory,⁴ that under multiplication the set of row (column) consistent matrices forms a left (right) consistent subsemigroup of $V$.

⁴ P. Dubreil, Contribution à la théorie des demi-groupes, Mémoires de l'Académie des Sciences de l'Institut de France (2) vol. 63 (1941).

The introduction of the concept of a consistent matrix suggests that at the other extreme of the relation $A \subset AE \subset E$ we might consider matrices $A$ such that $AE = A$. Such a definition would be a slight generalization of the concept of a right ideal element in a relation algebra;⁵ however, these matrices do not seem to be as important as consistent matrices. It may be mentioned, nonetheless, that the class of all ideal matrices (i.e., both left and right ideal) is a Boolean subalgebra isomorphic to the given algebra $U$.

⁵ Birkhoff, op. cit., p. 212.
4. Inverses. We shall denote the inverse of a matrix $A$, if it exists, by $A^{-1}$. Following the terminology of ordinary matrix theory, a matrix $A$ is called orthogonal if it has an inverse which is $A^T$. For example, the permutation matrix

$$\begin{pmatrix} o & e \\ e & o \end{pmatrix}$$

is orthogonal. It is immediate that the product of two orthogonal matrices is orthogonal.
LEMMA 4.1. A Boolean matrix $A$ is orthogonal if and only if $A$ is both row and column consistent and for every $i, j, k$, $i \ne j$, $A_{ik} \wedge A_{jk} = o = A_{ki} \wedge A_{kj}$.

PROOF. If $A$ is orthogonal, Lemma 3.1 implies that $A$ is row and column consistent. For every $i, j$, $i \ne j$, $AA^T = I$ implies

$$\bigcup_k (A_{ik} \wedge A^T_{kj}) = \bigcup_k (A_{ik} \wedge A_{jk}) = o,$$

whence for every $k$, $A_{ik} \wedge A_{jk} = o$. Similarly, $A_{ki} \wedge A_{kj} = o$ for every $i, j, k$, $i \ne j$, follows from $A^TA = I$.

Conversely, if $A_{ik} \wedge A_{jk} = o$ for every $i, j, k$, $i \ne j$, then $\bigcup_k (A_{ik} \wedge A^T_{kj}) = \bigcup_k (A_{ik} \wedge A_{jk}) = o$, whence $AA^T \subset I$. Since $A$ is row consistent, Lemma 3.1 implies $I \subset AA^T$, and so $AA^T = I$. Similarly $A^TA = I$.
THEOREM 4.2. A Boolean matrix has an inverse if and only if it is orthogonal.⁶

⁶ This result is stated by J. H. M. Wedderburn, Ann. of Math. vol. 35 (1934) pp. 185-194.
PROOF. The sufficiency is true by definition.

By Lemma 3.2, if $A$ has an inverse, both $A$ and $A^{-1}$ are row and column consistent. Thus, using Lemma 4.1, it is sufficient to show $A_{ik} \wedge A_{jk} = o = A_{ki} \wedge A_{kj}$ for every $i, j, k$, $i \ne j$. We shall carry this out only for the first case as the other is essentially the same. $AA^{-1} = I$ implies that for every $i, j$, $i \ne j$, $\bigcup_k (A_{ik} \wedge A^{-1}_{kj}) = o$, whence, for every $k$, $A_{ik} \wedge A^{-1}_{kj} = o$; summing over $j$, $j \ne i$, and holding $k$ fixed, we have $\bigcup_{j \ne i} (A_{ik} \wedge A^{-1}_{kj}) = o$. Thus,

$$A_{ik} = A_{ik} \wedge \bigcup_j A^{-1}_{kj} = \bigcup_j (A_{ik} \wedge A^{-1}_{kj}) = A_{ik} \wedge A^{-1}_{ki},$$

since $\bigcup_j A^{-1}_{kj} = e$ because $A^{-1}$ is row consistent. Now, for every $i, j, k$, $i \ne j$,

$$A_{ik} \wedge A_{jk} = (A_{ik} \wedge A^{-1}_{ki}) \wedge (A_{jk} \wedge A^{-1}_{kj}) = (A_{ik} \wedge A^{-1}_{kj}) \wedge (A_{jk} \wedge A^{-1}_{ki}) = o.$$

Hence $A$ is orthogonal.
COROLLARY. $A$ is involutory ($A^2 = I$) if and only if $A$ is both symmetric and orthogonal.

PROOF. Immediate.
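Over the two-element algebra the orthogonal matrices include the permutation matrices; a quick numpy check of the sufficiency in Theorem 4.2 and of the corollary (the particular matrices are illustrative):

```python
import numpy as np

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

I = np.eye(3, dtype=bool)

P = np.array([[0, 1, 0], [0, 0, 1], [1, 0, 0]], dtype=bool)     # a 3-cycle
assert (bmul(P, P.T) == I).all() and (bmul(P.T, P) == I).all()  # P^{-1} = P^T

T = np.array([[0, 1, 0], [1, 0, 0], [0, 0, 1]], dtype=bool)     # a transposition
assert (T == T.T).all()            # symmetric
assert (bmul(T, T) == I).all()     # involutory: T^2 = I
```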
5. Linear matrix equations. In this section we shall consider the problem of finding the class of matrices $X$ such that $XA = B$ ($AX = B$) when $A$ and $B$ are given Boolean matrices.⁷ This is clearly equivalent to finding the intersection of the two classes of matrices $X$ satisfying $XA \subset B$ and $XA \supset B$. The former case is relatively simple and is completely solved; however, the latter is much more difficult. We have been able only to give a sufficiency condition on $X$, which is not necessary, for $X$ to be a solution of $XA \supset B$. We also present a necessary and sufficient condition on $A$ and $B$ that there be solutions to $XA = B$.

⁷ The results of this section are generalizations of work by D. O. Ellis and J. W. Gaddum as stated in Bull. Amer. Math. Soc. Abstract 56-5-448.

Utilizing the solutions to $XA \subset O$ (hence in this special case the complete set of solutions to $XA = O$), we completely characterize the divisors of zero occurring in the algebra of Boolean matrices.

It is clear from the previous section that if $A$ is orthogonal, $X = BA^T$ is the unique solution of $XA = B$, but in general a solution will not be unique.
Toward our ends we prove the following lemma.
LEMMA 5.1. Let $A$, $B$, and $C$ be Boolean matrices. $ABC \subset I'$ if and only if $A' \supset (BC)^T$.

PROOF. By definition $ABC \subset I'$ if and only if, for every $i$, $\bigcup_j \bigcup_k (A_{ij} \wedge B_{jk} \wedge C_{ki}) = o$ or, what is the same, $A_{ij} \wedge B_{jk} \wedge C_{ki} = o$ for every $i, j, k$. Rewriting, we have $A'_{ij} \supset B_{jk} \wedge C_{ki} = C^T_{ik} \wedge B^T_{kj}$. Since the left side is independent of the index $k$, this is equivalent to $A'_{ij} \supset \bigcup_k (C^T_{ik} \wedge B^T_{kj})$, that is, $A' \supset C^TB^T = (BC)^T$.
Birkhoff defines for any multiplicative lattice⁸ the right (left) residual $B:A$ ($B::A$) of $B$ by $A$ as the largest $X$ (if it exists) satisfying $XA \subset B$ ($AX \subset B$). A multiplicative lattice in which such residuals always exist is called a residuated lattice.⁹

⁸ Birkhoff, op. cit., p. 200.
⁹ Ibid., p. 201.

THEOREM 5.2. In the algebra of Boolean matrices $XA \subset B$ if and only if $X \subset (B'A^T)'$, and $AX \subset B$ if and only if $X \subset (A^TB')'$; hence the algebra is residuated and $B:A = (B'A^T)'$ and $B::A = (A^TB')'$.
PROOF. $XA \subset B$ means by definition that $\bigcup_j (X_{ij} \wedge A_{jk}) \subset B_{ik}$ for every $i, k$, and this is equivalent to $X_{ij} \wedge A_{jk} \subset B_{ik}$, $i, j, k = 1, 2, \ldots, n$. This in turn is true if and only if $X_{ij} \wedge A_{jk} \wedge B'_{ik} = o$. Summing on $j$ and $k$, $\bigcup_j \bigcup_k (X_{ij} \wedge A_{jk} \wedge B'_{ik}) = o$, which means, by definition, that $XAB'^T = XAB^{T\prime} \subset I'$. According to Lemma 5.1, this is true if and only if $X' \supset (AB'^T)^T$, that is, if and only if $X \subset (B'A^T)'$. A similar proof holds for $AX \subset B$.
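In the two-element model the right residual of Theorem 5.2 can be computed directly and checked by brute force over all $2 \times 2$ matrices; right_residual is our own name for $(B'A^T)'$, and the test matrices are illustrative:

```python
import numpy as np
from itertools import product

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

def right_residual(B, A):
    # B:A = (B'A^T)', the largest X with XA <= B (Theorem 5.2)
    return ~bmul(~B, A.T)

A = np.array([[1, 0], [1, 1]], dtype=bool)
B = np.array([[1, 0], [1, 1]], dtype=bool)
X = right_residual(B, A)
assert (bmul(X, A) <= B).all()               # X itself satisfies XA <= B

# X is the largest such matrix: every solution Y is contained in X.
for bits in product([False, True], repeat=4):
    Y = np.array(bits, dtype=bool).reshape(2, 2)
    if (bmul(Y, A) <= B).all():
        assert (Y <= X).all()
```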
COROLLARY 1. The following are equivalent: $XA = O$, $X \subset (AE)^{T\prime}$, $EX \subset (AE)^{T\prime}$. A similar statement holds for $AX = O$.

PROOF. Since it is always true that $O \subset XA$, it is sufficient to consider $XA \subset O$, which by the theorem is equivalent to $X \subset (O'A^T)' = (EA^T)' = (AE)^{T\prime}$. This is equivalent to $X' \supset (AE)^T$ and hence to $X'_{ij} \supset \bigcup_k (E_{ik} \wedge A^T_{kj}) = \bigcup_k A^T_{kj}$ for every $i, j$. Since the right side of this expression is independent of $i$, it is the same as $\bigcup_k A^T_{kj} \subset \bigcap_k X'_{kj} = \bigcap_k (O_{ik} \vee X'_{kj}) = \bigl[\bigcup_k (E_{ik} \wedge X_{kj})\bigr]'$. Rewriting, we have $EA^T \subset (EX)'$ and so $EX \subset (AE)^{T\prime}$.
As was mentioned in the introduction, this theorem and its corollaries are not restricted to square matrices; however, the usual restrictions on the ranges of the indices are required. With this in mind, we may restrict $X$ and $B$ to be vectors; then Corollary 1 becomes, essentially, the first principal result of Ellis and Gaddum.⁷ Their second result is included in the following corollary.
COROLLARY 2. $XA = B$ has a solution if and only if $B \subset (B'A^T)'A$.

PROOF. If there is a solution $X$, the theorem implies $X \subset (B'A^T)'$; however, $XA = B$ also implies $B = XA \subset (B'A^T)'A$.

Conversely, if $B \subset (B'A^T)'A$, then $X = (B'A^T)'$ is a solution, for the hypothesis implies $XA \supset B$ and the theorem $XA \subset B$.
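Corollary 2 yields an immediate solvability test: $XA = B$ is solvable exactly when the candidate $X = (B'A^T)'$ works. A sketch in the two-element model (solve_right is our own name, and the inputs are illustrative):

```python
import numpy as np

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

def solve_right(A, B):
    # By Corollary 2, XA = B is solvable iff X = (B'A^T)' is a solution.
    X = ~bmul(~B, A.T)
    return X if (bmul(X, A) == B).all() else None

A = np.array([[1, 0], [1, 1]], dtype=bool)
B = np.array([[1, 1], [1, 0]], dtype=bool)
print(solve_right(A, B))   # a solution exists here; None would signal B ⊄ (B'A^T)'A
```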
In the first part of their third result Ellis and Gaddum note that if $X$ solves $XA \subset B$, then, since $X \subset (B'A^T)'$, $X$ may be written in parametric form $X = R \cap (B'A^T)'$ with $R$ an arbitrary matrix. If, however, $X$ is to solve $XA = B$, $R$ may no longer be arbitrary. The second part of their third result is a statement of conditions on $R$ necessary and sufficient for $X$ to be a solution. These are trivial, for they may easily be shown equivalent (in the case $X$ and $B$ are vectors) to the following trivial result:

COROLLARY 3. $XA = B$ has the solution $X = R \cap (B'A^T)'$ if and only if $B \subset [R \cap (B'A^T)']A$.
THEOREM 5.3. Let $A$ and $B$ be Boolean matrices of the same order. If there exists a matrix $C$ such that $C \subset A$ and $B \subset EC$, then all matrices $X$ such that $X \supset BC^T$ are solutions of $XA \supset B$.

PROOF. If such a $C$ exists, it is sufficient to show $BC^TA \supset B$ since $XA \supset BC^TA$. For every $i, l$,

$$(BC^TA)_{il} \supset \bigcup_k (B_{il} \wedge C^T_{lk} \wedge A_{kl}) = \bigcup_k (B_{il} \wedge C_{kl}) = B_{il} \wedge \bigcup_k C_{kl},$$

since $C \subset A$. From the assumption $B \subset EC$ we obtain $B_{il} \subset \bigcup_k C_{kl}$, so the result follows by definition.
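A small check of Theorem 5.3 in the same model (the particular $A$, $B$, $C$ are illustrative choices satisfying the hypotheses):

```python
import numpy as np

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

A = np.array([[1, 1], [0, 1]], dtype=bool)
C = np.eye(2, dtype=bool)                    # C <= A
B = np.array([[1, 1], [1, 0]], dtype=bool)
E = np.ones((2, 2), dtype=bool)

assert (C <= A).all() and (B <= bmul(E, C)).all()   # hypotheses: C <= A, B <= EC
X = bmul(B, C.T)                                    # smallest X with X >= BC^T
assert (B <= bmul(X, A)).all()                      # conclusion: XA >= B
```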
The condition of this theorem is by no means necessary, for if $A = B = E$, then $X = I$ is a solution. Now suppose the $C$ of the theorem satisfies $I \supset BC^T = EC^T$. Then for every $i, j$, $i \ne j$, $o = \bigcup_k (E_{ik} \wedge C^T_{kj}) = \bigcup_k C_{jk}$, and so for every $j$ and $k$, $C_{jk} = o$. $C = O$ contradicts the condition $B = E \subset EC$.
In the theory of semigroups with a zero element, an element $A$ is a right divisor of zero if $A \ne O$ and there exists an element $B \ne O$ such that $BA = O$.
THEOREM 5.4. A Boolean matrix $A$ is a right (left) divisor of zero if and only if $A$ is not row (column) consistent.

PROOF. Suppose $BA = O$ and $A$ is row consistent; then, by Corollary 1 of Theorem 5.2, $B \subset (AE)^{T\prime} = E^{T\prime} = O$, so $A$ is not a right divisor of zero.

Conversely, if $A$ is not row consistent, there exists some integer $p$ such that $\bigcup_q A_{pq} \ne e$. Define $B_{pp} = (\bigcup_q A_{pq})'$ and $B_{ij} = o$ otherwise, whence $B \ne O$. Consider $BA$: if $i \ne p$, $\bigcup_k (B_{ik} \wedge A_{kj}) = o$ since $B_{ik} = o$. If $i = p$, $\bigcup_k (B_{pk} \wedge A_{kj}) = B_{pp} \wedge A_{pj} = (\bigcup_q A_{pq})' \wedge A_{pj} \subset A'_{pj} \wedge A_{pj} = o$. Thus $BA = O$ and $A, B \ne O$, so $A$ is a right divisor of zero.
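The zero-divisor construction in this proof is directly executable in the two-element model (the matrix $A$ below is an illustrative non-row-consistent choice):

```python
import numpy as np

def bmul(A, B):
    return (A[:, :, None] & B[None, :, :]).any(axis=1)

A = np.array([[0, 0], [1, 1]], dtype=bool)   # row p = 0 contains no e
p = 0

B = np.zeros_like(A)                         # B_ij = o except ...
B[p, p] = not A[p].any()                     # ... B_pp = (U_q A_pq)'

assert B.any() and A.any()                   # A, B != O
assert not bmul(B, A).any()                  # BA = O: A is a right divisor of zero
```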