ON THE NORMALITY OF INDEPENDENT RANDOM VARIABLES
IMPLIED BY INTRINSIC GRAPH-INDEPENDENCE
WITHOUT RESIDUES

By

G. B. Tranquilli
University of North Carolina and University of Rome

Institute of Statistics Mimeo Series No. 430

May 1965

This research was supported by the Air Force Office of Scientific Research Grant No. AF-AFOSR-760-65.

DEPARTMENT OF STATISTICS
UNIVERSITY OF NORTH CAROLINA
Chapel Hill, N. C.
SUMMARY

In the present paper we prove the normality of n (≥ 2) mutually independent random variables (r.v.'s) given that two linear functions of all the n r.v.'s are independent, without any assumption about their distribution patterns (Part I).

Having introduced the definitions of "graph-independence," "intrinsic graph-independence" and "residual r.v. with regard to an intrinsic graph-independence," we prove the normality of n (≥ 2) mutually independent r.v.'s when they have any intrinsic graph-independence without residues (Part II).
PART I

I.1 - Let X ≡ X_(n) = (X_1, X_2, ..., X_n)' and Y ≡ Y_(m) = (Y_1, Y_2, ..., Y_m)' be two r.v.'s (random variables) of rank respectively n and r ≤ min(m,n) (i.e. two multivariate r.v.'s represented as random column vectors) which are in correspondence through a linear relation of matrix C ≡ C_{m×n} of rank r; that is*

(1)  Y = C X  or  Y_i = Σ_j c_{ij} X_j  with C = ((c_{ij})),  i = 1,2,...,m;  j = 1,2,...,n.

Let φ_X(t) ≡ φ_{X_1,X_2,...,X_n}(t_1,t_2,...,t_n) and φ_Y(u) ≡ φ_{Y_1,Y_2,...,Y_m}(u_1,u_2,...,u_m) be the c.f.'s (characteristic functions) of the r.v.'s X and Y: t ≡ t_(n) = (t_1,t_2,...,t_n)', u ≡ u_(m) = (u_1,u_2,...,u_m)'. Since to a linear correspondence between two r.v.'s X and Y corresponds the cogradient linear correspondence between the arguments of their c.f.'s, that is [12]

(2)  t = C' u,

then (1) expressed in terms of the c.f.'s becomes

(3)  φ_Y(u) = φ_X(C' u)  and  φ_{Y_i}(u_i) = φ_Y(0,...,0,u_i,0,...,0) = φ_X(c_{i1}u_i, c_{i2}u_i, ..., c_{in}u_i).

According to the kind of problem which employs the relation (1) [(3)], such a relation becomes a matric (a functional) equation with initial conditions when some of the elements X, Y, C (φ_X, φ_Y, C') are the unknowns in the equation (1) [(3)]. Obviously the initial conditions, the assumptions taken on the unknowns, have to be compatible with equation (1) [(3)].

* x ≡ x_(n) is a column vector (x_1,x_2,...,x_n)', the transpose of the row vector (x_1,x_2,...,x_n) ≡ x' ≡ x'_(n). C ≡ C_{m×n} = ((c_{ij})), i = 1,2,...,m; j = 1,2,...,n, is a matrix with m rows and n columns, and C' ≡ C'_{n×m} is its transpose.
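As a quick numeric illustration of (3) (a sketch of ours, not part of the original text): for n independent standard normal components, φ_X(t) = exp(-t't/2), so the cogradient relation t = C'u gives φ_Y(u) = exp(-u'(CC')u/2). The matrix C below is an arbitrary example.

```python
import numpy as np

# c.f. of n independent standard normal r.v.'s: phi_X(t) = exp(-t't/2)
def phi_X(t):
    return np.exp(-0.5 * t @ t)

# c.f. of Y = C X through the cogradient relation t = C'u:
# phi_Y(u) = phi_X(C'u); for this X it must equal exp(-u'(CC')u/2)
C = np.array([[1.0, 2.0, 0.5],
              [0.0, 1.0, -1.0]])   # arbitrary C, m = 2, n = 3
u = np.array([0.3, -0.7])
lhs = phi_X(C.T @ u)
rhs = np.exp(-0.5 * u @ (C @ C.T) @ u)
assert np.isclose(lhs, rhs)
```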
An interesting general problem is the following: Given the equivalence class {X} of all the r.v.'s X which have the same set Γ_X of specified general characteristics about the structure of their distribution function (or their c.f.), to find the sub-class {X}* ⊂ {X} and the matrix class {C} such that all the r.v.'s Y = C X, with any X ∈ {X}* and C ∈ {C}, form the sub-class {Y}* belonging to the equivalence class {Y} of all the r.v.'s Y which have the same set Δ_Y of prefixed general characteristics about the structure of their distribution function (or their c.f.). The assumption set Γ_X then becomes the assumption set Δ_Y under any linear transformation Y = C X with X ∈ {X}*, Y ∈ {Y}*, C ∈ {C}: the set Γ_X is intrinsic to the class {X}* and becomes extrinsic, linearly carried into Δ_Y, after such a linear transformation.

Here, we are interested in the case where Γ_X is the assumption of mutual independence among the n r.v.'s X_1, X_2, ..., X_n and Δ_Y is the assumption of independence between pairs of r.v.'s Y_{j_1}, Y_{j_2}, so that for each Y_{j_1} (j_1 = 1,2,...,m) there exists at least one r.v. Y_{j_2} (j_2 ≠ j_1) such that Y_{j_1} and Y_{j_2} are independent r.v.'s (the matrix class {C} shall be at least compatible with those assumptions). We will call "general graph-independence" such a relation of independence among marginal components of a multivariate r.v. Y = C X and say that the r.v. X has "intrinsic general graph-independence," which will be "without residues" according to a class {C}* ⊂ {C} of "non-trivial matrices." We shall give the reasons for such definitions in Part II.
The problem is that of the restricted converse of an extension of a fundamental proposition enunciated and utilized by R. A. Fisher [1]:

(a) - Given a multinormal r.v. X (of rank n), if the marginal components X_1, X_2, ..., X_n are mutually independent (φ_X(t) = Π_{j=1}^n φ_{X_j}(t_j)) there always exists a linear transformation C X, with matrix C ≡ C_{m×n} of rank m (2 ≤ m ≤ n), such that the r.v. C X = Y ≡ Y_(m) has marginal components Y_1, Y_2, ..., Y_m mutually independent (φ_Y(u) = Π_{i=1}^m φ_{Y_i}(u_i)). Incidentally, Y is a normal r.v. of rank m. There exist infinitely many linear transformations with such a property; they are given by the solutions of the matric equation C S_X C' = S_Y with C and S_Y unknown, where S_X and S_Y are the diagonal covariance matrices of the r.v.'s X and Y.
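Proposition (a) can be checked on a small numeric instance (our sketch; the matrices are arbitrary examples): choosing the rows of C orthogonal with respect to S_X makes C S_X C' diagonal, so the components of Y = C X are uncorrelated, hence independent in the multinormal case.

```python
import numpy as np

# For multinormal X with diagonal covariance S_X, the components of
# Y = C X are independent iff C S_X C' is diagonal (a numeric instance
# of the matric equation C S_X C' = S_Y; values are arbitrary).
S_X = np.diag([1.0, 2.0, 3.0])
C = np.array([[1.0, 1.0, 1.0],
              [3.0, 0.0, -1.0]])   # rows S_X-orthogonal: sum_j c1j*c2j*sxj = 0
S_Y = C @ S_X @ C.T
assert np.isclose(S_Y[0, 1], 0.0)  # off-diagonal vanishes: Y1, Y2 independent
```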
The restricted converse proposition which we will prove true is:

(a') - If C_{m×n} is restricted to the "non-trivial" matrix class {M}_m of rank m < n, in each row and column of which there are at least two non-zero elements, then Fisher's proposition is true only if the r.v. X is normal; that is*

(4)  C_{m×n} ∈ {M}_m, Y = C X and φ_Y(u) = Π_{i=1}^m φ_{Y_i}(u_i)  ==>  X is a normal r.v.

Given any r.v. Y ≡ Y_(m), let I = I(γ,q) be the set of the subscript pairs (i,h) such that the marginal r.v.'s Y_i and Y_h are independent (i ≠ h): q = 0,2,3,...,m is the number of different r.v.'s Y_i which are independent of some r.v. Y_h; γ is the number of different subscript pairs (i,h) ∈ I(γ,q). According to the definition introduced above, if q = m the m r.v.'s Y_i are graph-independent (see II.1).

If Y_(m) = C X_(n), we indicate by Ī(γ,m) the set of subscript pairs (i,h) such that Y_i and Y_h are independent r.v.'s and

rank | c_{i1}  c_{i2}  ...  c_{in} |  = 2.
     | c_{h1}  c_{h2}  ...  c_{hn} |

Let γ(i) be the number of r.v.'s Y_h which are independent of Y_i, and r ≤ min(m,n) the rank of the r.v. Y_(m): we shall prove (II.2, Theorem 1) that necessarily q ≤ r. Then any Ī(γ,m) must satisfy this condition.

* "==>" stands for "implies," "∃" for "there exists at least," ":" for "such that."
After these conventions Fisher's proposition can have the following extensions.

(b) - Given a multinormal r.v. X (of rank n) and a set Ī(γ,m) fixed in advance, if the marginal components X_1, X_2, ..., X_n are mutually independent there always exists a linear transformation C X = Y, with matrix C ≡ C_{m×n} of rank r ≤ m and with other specific restrictions, such that I_Y = Ī(γ,m) (Y has graph-independent marginal components). Incidentally, Y is a normal r.v. of rank r. There exist infinitely many linear transformations C X with such a property: they are given by the solutions of the matric equation C S_X C' = S_Y with C and S_Y unknown, where S_X and S_Y are the covariance matrices of the r.v.'s X and Y, and where S_Y = ((cov(Y_{i_1},Y_{i_2}))) (i_1,i_2 = 1,2,...,m) is such that cov(Y_{i_1},Y_{i_2}) = 0 for each pair (i_1,i_2) ∈ Ī(γ,m).
(c) - Given a multinormal r.v. W ≡ W_(n) of rank n and a set Ī(γ,m), there always exist two linear transformations A W_(n) = X and B W_(n) = Y, with A ≡ A_{n×n} a nonsingular square matrix and B ≡ B_{m×n} a matrix of rank r and with other specific restrictions, such that C ≡ C_{m×n} = B A^{-1}, the marginal r.v.'s X_1, X_2, ..., X_n are mutually independent and I_Y = Ī(γ,m) fixed in advance. Incidentally, X = A W and Y = C X are normal r.v.'s of rank n and r respectively. There exist infinitely many pairs of linear transformations A W_(n) and B W_(n) with such a property: they are given by the solutions of the matric equations A S_W A' = S_X, B S_W B' = S_Y with A, B, S_X and S_Y indeterminates, where the same definitions and remarks considered after the proposition (b) are valid here.

Here we will prove true the following restricted converses of (b) and (c).
(b') - If C_{m×n} is restricted to the "non-trivial" matrix class {M}_{Ī(γ,m)} of rank r such that

i) for each i = 1,2,...,m  ∃ i_1, j, j_1 :  c_{ij} c_{i_1 j} c_{i j_1} c_{i_1 j_1} ≠ 0;

ii) for each j = 1,2,...,n  ∃ j_1, i, i_1 :  c_{ij} c_{i j_1} c_{i_1 j} c_{i_1 j_1} ≠ 0;

then the proposition (b) is true only if the r.v. X is normal; that is

(5)  ( C_{m×n} ∈ {M}_{Ī(γ,m)}, φ_X(t) = Π_{j=1}^n φ_{X_j}(t_j), I_Y = Ī(γ,m) )  ==>  X is a normal r.v.

(c') - If B A^{-1} ≡ C_{m×n} ∈ {M}_{Ī(γ,m)}, then the proposition (c) is true only if the r.v. W_(n) is normal; that is

(6)  ( ∃ B, A = (a_1, a_2, ..., a_n)' :  B A^{-1} ∈ {M}_{Ī(γ,m)}, φ_{A W}(t) = Π_{j=1}^n φ_{a_j' W}(t_j), I_{B W} = Ī(γ,m) )  ==>  W_(n) is a normal r.v.
These results will be proved without any assumption about the distribution patterns of the marginal r.v.'s X_j, provided that these r.v.'s are neither trivial constant r.v.'s nor completely degenerate r.v.'s (still these cases can be considered as limit cases of normal r.v.'s when the variance tends to either 0 or ∞).

In regard to the relations among the propositions (a'), (b'), (c') we remark that (a') ⊂ (b') ⊂ (c'), and that

i) proposition (a') ≡ proposition (b'), when either m = 2 and γ = 1, or γ = m(m-1)/2 for m > 2;

ii) proposition (b') ≡ proposition (c'), when A = I_{n×n}, the unity matrix, that is W = X.

But it is clear that the proposition (c') is immediately implied by the proposition (b') also in the general considered case.
I.2 - As we shall later show in more detail, the result (b') is an implicit consequence of the very particular result (a') when m = 2, which establishes the simplest basic theorem about the necessity for n mutually independent r.v.'s X_1, X_2, ..., X_n to be normal when there exist two independent random linear functions Y_1 = Σ_{j=1}^n c_{1j} X_j and Y_2 = Σ_{j=1}^n c_{2j} X_j, provided the assumption c_{ij} ≠ 0, i = 1,2; j = 1,2,...,n, is satisfied. This general theorem has been solved under additional restrictive assumptions by the following authors:

1) S. Bernstein (vide M. Fréchet [6]): n = 2, c_11 = c_21 = c_12 = -c_22 = 1, absolutely continuous distribution functions and equal finite non-zero variances of the r.v.'s X_1 and X_2 exist.

2) M. Fréchet [6]: n = 2, c_11 = c_21 = c_12 = -c_22 = 1, equal finite non-zero variances of the r.v.'s X_1 and X_2 exist.

3) D. Basu [7]: i) n ≥ 2, r.v.'s X_j identically distributed (X_j ~ X) and having finite moments of all orders (Var X_j = Var X ≠ 0), provided at least one c_{1j} c_{2j} ≠ 0*, j = 1,2,...,n;

ii) n = 2, finite non-zero variances of the r.v.'s X_1 and X_2 exist, c_11 c_12 c_21 c_22 ≠ 0. [This result is the particular case m = n = 2 of the more general Basu's Theorem [7] established for any m = n ≥ 2: "If X_1, X_2, ..., X_n are mutually independent r.v.'s with finite nonzero variances and if there exists a nontrivial linear transformation C_{n×n} X_(n) = Y_(n) for which the marginal r.v.'s Y_i (i = 1,2,...,n) also are mutually independent, then the X_j's must be normal r.v.'s, j = 1,2,...,n." A nonsingular matrix C_{n×n} is nontrivial if there are at least two nonzero coefficients c_{ij} (≠ 0) in each row and each column of C_{n×n}. The author gives an alternative proof of this theorem [10], but by assuming that the r.v.'s X_j have finite moments of all orders.]

4) G. Pompilj [13]: n = 2, c_{ij} ≠ 0 (i,j = 1,2), the r.v.'s X_1 and X_2 have finite cumulants of all orders (λ_2^{(j)} = Var X_j ≠ 0). This result is the particular case m = n = 2 of the Basu's Theorem with the assumptions considered in the alternative proof. Pompilj's result has been obtained independently by different considerations.

* We remark that it is necessary to provide at least two subscripts j_1 and j_2 such that c_{1j_1} c_{2j_1} c_{1j_2} c_{2j_2} ≠ 0: in fact the independence between Y_1 and Y_2 and the existence of Var(X) ≠ 0 imply cov(Y_1,Y_2) = Var(X) Σ_{j=1}^n c_{1j} c_{2j} = 0, and the null sum Σ_{j=1}^n c_{1j} c_{2j} = 0, without c_{1j} c_{2j} = 0 for all the subscripts j, is possible if and only if at least two components of such a sum are different from zero. If c_{1j} c_{2j} ≠ 0 for only one subscript j the theorem is true only if Var X = 0 (trivial solution).

The above basic theorem would seem to have been completely proved without any assumption by G. Darmois [8], [9]. But the proof is open to criticism not only for formal reasons (for n = 2 [8], [9] the proof lacks substantial details and some intermediate steps are assumed without proof; for n > 2 [9] the proof is only described as a generalization of the case n = 2) but for more substantial reasons. In fact the necessary assumption c_{ij} ≠ 0 for all pairs (i,j) is not explicitly made and, moreover, it is not possible to see if this condition is implied by the proof, on account of its insufficient rigor.
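The counting argument in the footnote to 3-i) can be made concrete (an illustrative sketch of ours, with arbitrary coefficients): a vanishing sum Σ c_{1j} c_{2j}, with every product c_{1j} c_{2j} nonzero, requires at least two terms of opposite sign.

```python
from fractions import Fraction

# Footnote illustration: for i.i.d. X_j with Var X = v,
# cov(Y1, Y2) = v * sum_j c1j*c2j; a zero sum with c1j*c2j != 0 for
# every j needs at least two nonzero components of opposite sign.
v = Fraction(1, 3)
c1 = [Fraction(1), Fraction(1), Fraction(2)]
c2 = [Fraction(1), Fraction(1), Fraction(-1)]   # 1*1 + 1*1 + 2*(-1) = 0
cov = v * sum(a * b for a, b in zip(c1, c2))
assert cov == 0
assert all(a * b != 0 for a, b in zip(c1, c2))
```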
In this context the following conjecture of Basu [7] is still more remarkable: "The theorem considered under 3-i) is true even when: a) only the existence of the 2nd moment is assumed, b) the r.v.'s X_j do not follow the same distribution, provided c_{1j} c_{2j} ≠ 0 for each j = 1,2,...,n."

So we note that it is necessary for the condition c_{1j} c_{2j} ≠ 0 to be operative in order to prove that X_j is a normal r.v.

In this order of ideas we now demonstrate the above-mentioned basic theorem, which we will call the Basu-Darmois Theorem. In any case, our proof will be an alternative to Darmois' proof.
I.3 - Basu-Darmois Theorem. Mutual independence among the r.v.'s X_1, X_2, ..., X_n (n ≥ 2) and independence between two random linear combinations of them,

Y_1 = Σ_{j=1}^n c_{1j} X_j  and  Y_2 = Σ_{j=1}^n c_{2j} X_j,

together imply the normality of the r.v.'s X_j, provided that c_{1j} c_{2j} ≠ 0, j = 1,2,...,n.

We will first prove that the stated independence conditions imply that the cumulants of all orders of the r.v.'s X_j are finite. First, let us state some general propositions.
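As a contrapositive sanity check of the theorem (our illustration, not from the text): for independent X_1, X_2 uniform on (-1,1), Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2 are uncorrelated but not independent, as a fourth-moment comparison shows exactly; uniform r.v.'s therefore cannot satisfy the independence hypothesis, in accordance with the theorem.

```python
from fractions import Fraction

# moments of U(-1,1): E[X^2] = 1/3, E[X^4] = 1/5
m2, m4 = Fraction(1, 3), Fraction(1, 5)
# Y1*Y2 = X1^2 - X2^2, so E[Y1^2 Y2^2] = E[(X1^2 - X2^2)^2] = 2*m4 - 2*m2^2
e_joint = 2 * m4 - 2 * m2**2
# if Y1 and Y2 were independent we would need E[Y1^2]E[Y2^2] = (2*m2)^2
e_indep = (2 * m2)**2
assert e_joint != e_indep   # uncorrelated, yet dependent
```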
Each non-degenerate r.v. X possesses a c.f. φ_X(t) which is a complex valued function continuous on the real field -∞ < t < +∞, with φ_X(0) = 1, |φ_X(t)| ≤ 1. For the complex function ψ_X(t) = log φ_X(t) there is always formally defined a formal McLaurin expansion Σ_{r=1}^∞ (i^r/r!) λ_r t^r, where λ_r = i^{-r} [d^r log φ_X(t)/dt^r]_{t=0} is the r-th cumulant of the r.v. X. Each cumulant λ_r is formally expressed by a polynomial function of the first r moments of X; so, as well as the moments, λ_r can be finite, infinite or indeterminate (of the form ∞ - ∞). In this way the formality of the McLaurin expansion of ψ_X(t) has to be considered not only with regard to the possibility of the nonexistence of a convergence radius for the series Σ_{r=1}^∞ (i^r/r!) λ_r t^r when all the cumulants λ_r are finite, but even with regard to the possibility that some or all the cumulants λ_r may be infinite or indeterminate. In this wider sense of the formality of the McLaurin expansion of ψ_X(t), the series Σ_{r=1}^∞ (i^r/r!) λ_r t^r stands just for the sequence of formal cumulants (λ_1, λ_2, ..., λ_r, ...).
Now, because of the characteristic additive property of the cumulants of ψ_X(t), the formal expansion Σ_{r=1}^∞ (i^r/r!) λ_r t^r, considered without any specific assumption about the existence of the λ_r's, satisfies by definition the following:

Principle of the formal additivity of cumulants - If Σ_{j=1}^n a_j ψ_{X_j}(c_j t) = α(t) and Σ_{r=1}^∞ (i^r/r!) γ_r t^r is the formal McLaurin expansion of the function α(t), then formally

γ_r = Σ_{j=1}^n a_j c_j^r λ_r^{(j)}

under any assumption about the existence of each λ_r^{(j)}, r-th cumulant of the r.v. X_j.

The importance of this principle arises from the following:

Lemma 1 - If the functions ψ_{X_j} are unknown, without any assumption about the existence of the cumulants λ_r^{(j)}, but the function α(t) is known and its derivatives γ_r = [d^r α(t)/dt^r]_{t=0} of all orders are finite, then the Principle of the formal additivity of cumulants implies the cumulants λ_r^{(j)} of all orders are finite for each j = 1,2,...,n:

(7)  λ_r^{(j)} finite,  r = 1,2,...;  j = 1,2,...,n.

In fact, if the r-th cumulant is infinite (indeterminate) for some subscripts j, necessarily γ_r would be either infinite or indeterminate (indeterminate), in disagreement with the assumption that γ_r is finite.
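The additive property underlying the Principle can be verified numerically (a sketch of ours; the discrete distributions are arbitrary examples): cumulants computed from raw moments via the standard moment-cumulant recursion add under convolution of independent r.v.'s.

```python
from math import comb

def raw_moments(dist, nmax):
    # dist: {value: probability}; returns [m_0, m_1, ..., m_nmax]
    return [sum(p * v**k for v, p in dist.items()) for k in range(nmax + 1)]

def cumulants(m):
    # standard recursion: m_n = sum_{j=1}^{n} C(n-1, j-1) k_j m_{n-j}
    k = [0.0] * len(m)
    for n in range(1, len(m)):
        k[n] = m[n] - sum(comb(n - 1, j - 1) * k[j] * m[n - j]
                          for j in range(1, n))
    return k[1:]

def convolve(dx, dy):
    # distribution of X + Y for independent discrete X, Y
    out = {}
    for vx, px in dx.items():
        for vy, py in dy.items():
            out[vx + vy] = out.get(vx + vy, 0.0) + px * py
    return out

X = {0: 0.5, 1: 0.5}            # Bernoulli(1/2)
Y = {0: 0.75, 2: 0.25}          # another arbitrary discrete r.v.
kX = cumulants(raw_moments(X, 4))
kY = cumulants(raw_moments(Y, 4))
kS = cumulants(raw_moments(convolve(X, Y), 4))
# cumulants of the independent sum equal the sums of the cumulants
assert all(abs(ks - (kx + ky)) < 1e-12 for ks, kx, ky in zip(kS, kX, kY))
```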
The Principle just considered and the associated Lemma can be extended to the case of multidimensional functions ψ_X(t) = ψ_{X_1,X_2,...,X_n}(c_1 t_1, c_2 t_2, ..., c_n t_n).

Proof of the Basu-Darmois Theorem. With the assumptions about the independence between the two r.v.'s Y_1 and Y_2 and among the r.v.'s X_1, X_2, ..., X_n, by applying (2) and (3) and remembering that the c.f. of a set of mutually independent r.v.'s is the product of the c.f.'s of the individual r.v.'s, we have

(10)  φ_{Y_1}(u_1) φ_{Y_2}(u_2) = φ_{X_1,X_2,...,X_n}(c_{11}u_1 + c_{21}u_2, c_{12}u_1 + c_{22}u_2, ..., c_{1n}u_1 + c_{2n}u_2) = Π_{j=1}^n φ_{X_j}(c_{1j}u_1 + c_{2j}u_2).

By passing to the function ψ = log φ, (10) becomes

(11)  Σ_{j=1}^n ψ_{X_j}(c_{1j}u_1 + c_{2j}u_2) = Σ_{j=1}^n [ψ_{X_j}(c_{1j}u_1) + ψ_{X_j}(c_{2j}u_2)].

This result does not necessarily imply that ψ_{X_j}(c_{1j}u_1 + c_{2j}u_2) = ψ_{X_j}(c_{1j}u_1) + ψ_{X_j}(c_{2j}u_2) for each subscript j. If this were so, we would have for each subscript j the well-known linear functional equation ψ_{X_j}(v_1 + v_2) = ψ_{X_j}(v_1) + ψ_{X_j}(v_2), which implies ψ_{X_j}(t) = i m_j t, that is, the trivial solution X_j = m_j constant. Let us assume then

ψ_{X_j}(c_{1j}u_1 + c_{2j}u_2) - ψ_{X_j}(c_{1j}u_1) - ψ_{X_j}(c_{2j}u_2) = e_j(u_1,u_2)

almost everywhere in the neighborhood of (u_1,u_2) = (0,0); e_j(0,0) = 0. The functions e_j(u_1,u_2) are connected through (11) by the following relationship, that is,

(12)  Σ_{j=1}^n [ψ_{X_j}(c_{1j}u_1 + c_{2j}u_2) - ψ_{X_j}(c_{1j}u_1) - ψ_{X_j}(c_{2j}u_2)] = e(u_1,u_2) (≡ 0).
Letting Σ_{r=1}^∞ (i^r/r!) λ_r^{(j)} t^r be the formal expansion of the function ψ_{X_j}(t), the formal McLaurin expansion of the left-hand side of (12) is*

(13)  Σ_{r=2}^∞ (i^r/r!) Σ_{j=1}^n λ_r^{(j)} [(c_{1j}u_1 + c_{2j}u_2)^r - (c_{1j}u_1)^r - (c_{2j}u_2)^r].

Putting r - p = q, from which (2 ≤ r ≤ ∞, 1 ≤ p ≤ r-1) <--> (1 ≤ p ≤ ∞, 1 ≤ q ≤ ∞), (13) formally becomes

(14)  Σ_{p=1}^∞ Σ_{q=1}^∞ (i^{p+q}/p!q!) [Σ_{j=1}^n c_{1j}^p c_{2j}^q λ_{p+q}^{(j)}] u_1^p u_2^q.

But the formal McLaurin expansion of the right-hand side of (12), the known function e(u_1,u_2), which is identically zero, is

(15)  Σ_{p=0}^∞ Σ_{q=0}^∞ (i^{p+q}/p!q!) γ_{p,q} u_1^p u_2^q  with  γ_{p,q} = 0 for each p,q = 0,1,2,...

* For r = 1 the coefficient of λ_1^{(j)} is zero: (c_{1j}u_1 + c_{2j}u_2) - c_{1j}u_1 - c_{2j}u_2 = 0; so 0·λ_1^{(j)} = 0 in any case since, a λ_1 being the 1st cumulant of the r.v. aX and X being a non-degenerate r.v., lim_{a->0} aX = 0 is the constant r.v. whose 1st cumulant is 0 (= lim_{a->0} a λ_1).
Then by the Principle of the formal additivity of cumulants, by comparing (14) and (15) term by term, we have

(16)  Σ_{j=1}^n c_{1j}^p c_{2j}^q λ_{p+q}^{(j)} = 0  for each p,q = 1,2,...

Therefore by the Lemma proved above (7), the cumulants λ_r^{(j)} are finite for r ≥ 2: hence also the λ_1^{(j)} are finite. We will now prove that λ_r^{(j)} = 0 for r ≥ 3.

For each p + q = r fixed in advance, corresponding to p = 1,2,...,r-1 we have r-1 homogeneous linear equations with n unknowns λ_r^{(1)}, λ_r^{(2)}, ..., λ_r^{(n)}: the following homogeneous linear system is defined,

(17)  Σ_{j=1}^n c_{1j}^p c_{2j}^{r-p} λ_r^{(j)} = 0,  p = 1,2,...,r-1,

whose matrix is C_{(r-1)×n} = (( c_{1j}^p c_{2j}^{r-p} )), p = 1,2,...,r-1; j = 1,2,...,n. Let us take r ≥ n+1: the system matrix will be

(18)  C_{(r-1)×n} =
| c_{11} c_{21}^{r-1}    c_{12} c_{22}^{r-1}    ...  c_{1n} c_{2n}^{r-1}   |
| c_{11}^2 c_{21}^{r-2}  c_{12}^2 c_{22}^{r-2}  ...  c_{1n}^2 c_{2n}^{r-2} |
| ....................................................................... |
| c_{11}^{r-1} c_{21}    c_{12}^{r-1} c_{22}    ...  c_{1n}^{r-1} c_{2n}   |

Such a matrix is decomposable into the product of two factors,
C_{(r-1)×n} = V_{(r-1)×n} D_{n×n}

where, with k_j = c_{1j}/c_{2j},

V_{(r-1)×n} = (( k_j^{p-1} )) =
| 1          1          ...  1          |
| k_1        k_2        ...  k_n        |
| ..................................... |
| k_1^{r-2}  k_2^{r-2}  ...  k_n^{r-2}  |

and  D_{n×n} = diag( c_{11} c_{21}^{r-1}, c_{12} c_{22}^{r-1}, ..., c_{1n} c_{2n}^{r-1} );

the first factor V_{(r-1)×n} is a rectangular Vandermonde matrix and the second factor D_{n×n} is a diagonal matrix.

Since the rank of a rectangular Vandermonde matrix like V_{(r-1)×n} with r-1 ≥ n is the rank of the square Vandermonde matrix V_{n×n} formed by the n first rows of V_{(r-1)×n}, the analysis of the system (17) with r > n+1 reduces to the analysis of the system (17) with r = n+1, whose matrix is C_{n×n} = V_{n×n} D_{n×n}.
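A numeric instance of the factorization (our sketch; the coefficients below are arbitrary, chosen with distinct ratios c_{1j}/c_{2j}):

```python
import numpy as np

# n = 3, r = n + 1 = 4: entry (p, j) of C_{(r-1) x n} is c1j^p * c2j^(r-p)
c1 = np.array([1.0, 2.0, 1.0])
c2 = np.array([1.0, 1.0, -1.0])   # ratios c1j/c2j = 1, 2, -1, all distinct
r, n = 4, 3
Cmat = np.array([[c1[j]**p * c2[j]**(r - p) for j in range(n)]
                 for p in range(1, r)])
# Vandermonde factor: rows (c1j/c2j)^(p-1); diagonal factor: c1j * c2j^(r-1)
V = np.vander(c1 / c2, N=r - 1, increasing=True).T
D = np.diag(c1 * c2**(r - 1))
assert np.allclose(Cmat, V @ D)
assert np.linalg.matrix_rank(Cmat) == n   # distinct ratios give full rank
```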
Then two cases can arise.

i) The rank of C_{n×n} is ρ_C = n: this is possible if and only if the ranks ρ_V and ρ_D of the matrices V_{n×n} and D_{n×n} are ρ_V = ρ_D = n. Now by the assumption (c_{ij} ≠ 0, i = 1,2; j = 1,2,...,n) we have always ρ_D = n. So ρ_V = n if and only if c_{1j_1} c_{2j_2} ≠ c_{1j_2} c_{2j_1} for each pair of subscripts (j_1,j_2); j_1 ≠ j_2; j_1,j_2 = 1,2,...,n.

Let us suppose so. Then ρ_C = n and the homogeneous linear system (17) with r-1 = n allows only the null solution (λ_{n+1}^{(1)}, λ_{n+1}^{(2)}, ..., λ_{n+1}^{(n)}) = (0,0,...,0). Now such a system is equivalent to any system (17) with r-1 > n. So

(20)  λ_r^{(j)} = 0 for each r ≥ n+1, j = 1,2,...,n.

But according to Marcinkiewicz's theorem [4] (vide D. Dugué [5]) a function ψ_{X_j} which reduces to a polynomial can be the logarithm of a c.f. only if its degree is at most 2, so that λ_r^{(j)} = 0 also for 3 ≤ r ≤ n and each X_j is a normal r.v., unless Var X_j = 0 (this case has already been considered as trivial solution).

Therefore, excluding trivial solutions, the assumptions of the Basu-Darmois Theorem and the additional assumption ρ_V = n together imply the normality of the r.v.'s X_j, j = 1,2,...,n. Let us now show this additional assumption is not necessary to obtain such a result.
ii) The rank of C_{n×n} is ρ_C < n: it means ρ_V < n, that is c_{1j_1} c_{2j_2} = c_{1j_2} c_{2j_1} for some pairs of subscripts (j_1,j_2). In such a situation let us suppose, in the most general case, to have arranged the columns of the matrix C_{2×n} = (( c_{ij} ))_{2×n} in such a way as to obtain the matrix

(21)  C_{2×n} = ( C^{(1)}  C^{(2)}  ...  C^{(N)} ),   C^{(h)} = | c_{11}^{(h)}  c_{12}^{(h)}  ... |
                                                                | c_{21}^{(h)}  c_{22}^{(h)}  ... |

with h = 1,2,...,N and Σ_{h=1}^N n_h = n, where the columns of each block C^{(h)} are proportional, with common ratio k_h = c_{2j}^{(h)}/c_{1j}^{(h)}, and k_{h_1} ≠ k_{h_2} for each pair (h_1,h_2), h_1 ≠ h_2; h_1,h_2 = 1,2,...,N. We remark that in any case N ≥ 2: in fact N = 1 implies Y_2 = k_1 Y_1, contradicting the assumption that Y_1 and Y_2 are independent r.v.'s.
Let X_j^{(h)} (j = 1,2,...,n_h) be the r.v. X whose coefficient in the linear combination (1) lies in the h-th block (within block h, c_{2j}^{(h)} = k_h c_{1j}^{(h)}). Let us now put

Z_h = Σ_{j=1}^{n_h} c_{1j}^{(h)} X_j^{(h)},  h = 1,2,...,N.

Then we obtain the linear system

(22)  Y_1 = Σ_{h=1}^N Z_h,   Y_2 = Σ_{h=1}^N k_h Z_h,

where the initial assumptions of the Basu-Darmois Theorem remain unchanged (Y_1 and Y_2 independent, the Z_h's mutually independent as well as the X_j's, k_{h_1} ≠ k_{h_2}, h = 1,2,...,N). So in the same way as in case i) we obtain the homogeneous linear system

(23)  Σ_{h=1}^N k_h^p Λ_r^{(h)} = 0,  p = 1,2,...,r-1,

for the cumulants Λ_r^{(h)} of the Z_h's, whose matrix for r-1 = N is

(24)  C_{N×N} = (( k_h^p )) = V_{N×N} D_{N×N} =
| 1          1          ...  1          |   | k_1  0    ...  0   |
| k_1        k_2        ...  k_N        |   | 0    k_2  ...  0   |
| ..................................... | × | .................. |
| k_1^{N-1}  k_2^{N-1}  ...  k_N^{N-1}  |   | 0    0    ...  k_N |
Since k_h ≠ 0 and k_{h_1} ≠ k_{h_2}, the rank of C_{N×N} is N, and so we are in the case i) before examined. Therefore the Z_h's are normal r.v.'s. But the set of the X_j^{(h)}'s (j = 1,2,...,n_h; h = 1,2,...,N) is the same set of the r.v.'s X_j (j = 1,2,...,n). Now since Z_h is a normal r.v. and the X_j^{(h)}'s are mutually independent r.v.'s, according to an immediate extension of the Lévy-Cramér Theorem [2], [3] (see also [5], [11], [14]) also the X_j^{(h)}'s are normal r.v.'s. So the normality of the r.v.'s X_j is proved also in the case of ρ_C < n. Besides, we remark that the "normal" solution holds for any n ≥ 2. With this last observation the Basu-Darmois Theorem is completely proved.

Corollary. The Basu-Darmois Theorem is valid even if c_{ij} = 0 for some pairs of subscripts (i,j), provided that

i) c_{ij} = 0 ==> ∃ j_1 : c_{1j_1} c_{2j_1} ≠ 0 and X_j = a_j X_{j_1} + b_j;

ii) ∃ (j_1 ≠ j_2) : c_{1j_1} c_{2j_1} c_{1j_2} c_{2j_2} ≠ 0.

Let us put:

a) X_j = X̃_m, c_{ij} = c̃_{im} if c_{1j} c_{2j} ≠ 0 : m = 1,2,...,M;

b) X_j = X_{h_1}^{(1)}, c_{1j} = c_{1,h_1}^{(1)} if c_{2j} = 0 : h_1 = 1,2,...,H_1;

c) X_j = X_{h_2}^{(2)}, c_{2j} = c_{2,h_2}^{(2)} if c_{1j} = 0 : h_2 = 1,2,...,H_2;

d) Z_i = Y_i - Σ_{h_i=1}^{H_i} c_{i,h_i}^{(i)} X_{h_i}^{(i)} = Σ_{m=1}^M c̃_{im} X̃_m  (i = 1,2;  H_1 + H_2 + M = n).

Since each X_{h_1}^{(1)} is independent of each X_{h_2}^{(2)}, then Z_1 and Z_2 are independent r.v.'s, and since by the assumption ii) M is greater than or equal to 2, the conditions of the Basu-Darmois Theorem are satisfied. So the X̃_m's are normal r.v.'s. But, by the assumption i), for every X_{h_i}^{(i)} there is a r.v. X̃_m such that X_{h_i}^{(i)} = a_{h_i}^{(i)} X̃_m + b_{h_i}^{(i)}. So also each r.v. X_{h_i}^{(i)}, which is a linear function of a normal r.v., is a normal r.v. Therefore, all the X_j's are normal r.v.'s.
PART II

Before applying the Basu-Darmois Theorem to prove (in II.4) the propositions (a'), (b') and (c') considered in I.1, we will give in II.1, II.2 and II.3 some general definitions, which are new definitions of recognized forms of independence relations within the structure of multivariate r.v.'s. Even though these are not essential for the proofs, they are useful for a deeper examination of the assumptions implying normality in the propositions (a'), (b'), (c').

II.1 - Graph-Independence of General Type.

We will say that m r.v.'s Y_1, Y_2, ..., Y_m are graph-independent of general type, or also that the multivariate r.v. Y_(m) has graph-independence of general type, if there are γ different pairs (Y_i, Y_h) of independent r.v.'s (i ≠ h) and each r.v. Y_i (i = 1,2,...,m) belongs at least to one of these γ pairs.

The reasons for such a definition are connected with the terminology employed in "Graph Theory" (see Berge [16]).

Let {Y} be the set of m r.v.'s Y_1, Y_2, ..., Y_m. Let Γ be the multi-valued mapping of {Y} into {Y} which to each Y_i (i = 1,2,...,m) associates the sub-set {Y}_i ⊂ {Y} of the r.v.'s independent of Y_i:

{Y}_i = { Y_1^{(i)}, Y_2^{(i)}, ..., Y_{γ(i)}^{(i)} },  Y_i × Y_h^{(i)} for each h = 1,2,...,γ(i).

Obviously Y_i ∉ {Y}_i, and {Y}_i ≠ ∅ by assumption (1 ≤ γ(i) ≤ m-1).
Let U ≡ U(γ,m) be the set {[Y_i,Y_h]} ⊂ {Y} × {Y} of all the γ pairs (Y_i,Y_h) such that Y_i × Y_h:

[Y_i,Y_h] ∈ U(γ,m)  <==>  Y_i × Y_h.(*)

Since Γ ≡ U(γ,m) we can write G = ({Y},Γ) = ({Y},U(γ,m)) ≡ G(m,γ): [16] G(m,γ) is a graph with m vertexes v_i ≡ (Y_i), 2γ (= Σ_{i=1}^m γ(i)) arcs a_{i,h} = (v_i,v_h) and γ edges e_{i,h} ≡ [v_i,v_h]; U(γ,m) is the set of the γ edges joining the m vertexes. Since Y_i × Y_h <==> Y_h × Y_i, to each edge e_{i,h} ∈ U(γ,m) there correspond the two arcs a_{i,h} and a_{h,i}: thus the graph G(m,γ) is symmetric, and each edge e_{i,h} can be traversed in both directions (v_i,v_h) and (v_h,v_i). So the edge set U(γ,m) establishes a set of chains [v_{i_1}, v_{i_2}, ..., v_{i_r}] formed of adjacent edges e_{i_1,i_2}, e_{i_2,i_3}, ..., e_{i_{r-1},i_r} (Y_{i_1} × Y_{i_2} × ... × Y_{i_r}) which join certain pairs of non-adjacent vertexes v_{i_1}, v_{i_r} ([v_{i_1},v_{i_r}] ∉ U(γ,m), that is Y_{i_1} and Y_{i_r} are not independent r.v.'s).

(*) We introduce here the useful symbol × to indicate the independence relation between two r.v.'s Y_1 and Y_2: Y_1 × Y_2. We note that this symbol reminds us of the cartesian product [Y_1] × [Y_2] of the supports [Y_1] and [Y_2] of the r.v.'s Y_1 and Y_2, which is the support [Y_1,Y_2] of the r.v. (Y_1,Y_2) when Y_1 × Y_2. We remark that the relation × possesses the commutative property: Y_2 × Y_1 <==> Y_1 × Y_2, and an irreversible distributive property: Y_1 × (Y_2,Y_3) ==> Y_1 × Y_2, Y_1 × Y_3, but does not possess the transitive property: in general Y_1 × Y_2, Y_2 × Y_3 do not imply Y_1 × Y_3.
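The two graph notions at work here can be sketched in code (ours; the helper names are hypothetical): graph-independence asks that every vertex meet at least one edge of U(γ,m), while connectedness of the symmetric graph G(m,γ) can be checked by a breadth-first search.

```python
from collections import deque

# graph-independence: every vertex belongs to at least one edge of U
def is_graph_independent(m, edges):
    covered = set()
    for i, h in edges:
        covered.update((i, h))
    return covered == set(range(1, m + 1))

# connectedness of the symmetric graph G(m, gamma)
def is_connected(m, edges):
    adj = {v: set() for v in range(1, m + 1)}
    for i, h in edges:
        adj[i].add(h); adj[h].add(i)   # each edge yields the two arcs
    seen, queue = {1}, deque([1])
    while queue:
        v = queue.popleft()
        for w in adj[v] - seen:
            seen.add(w); queue.append(w)
    return len(seen) == m

E = [(1, 2), (2, 3), (3, 4)]          # a chain: connected, size gamma = 3
assert is_graph_independent(4, E) and is_connected(4, E)
E2 = [(1, 2), (3, 4)]                 # disconnected type [G_{2,1}, G_{2,1}]
assert is_graph_independent(4, E2) and not is_connected(4, E2)
```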
Those chains [v_{i_1}, v_{i_2}, ..., v_{i_r}], to which there correspond two simple elementary paths (v_{i_1}, v_{i_2}, ..., v_{i_r}) and (v_{i_r}, v_{i_{r-1}}, ..., v_{i_1}), are called simple elementary chains of the graph G(m,γ).

Since U(γ,m) and G(m,γ) depend upon the independencies existing between pairs of marginal components of the r.v. Y_(m), we will also write U_Y ≡ U(γ,m), G_Y ≡ G(m,γ) and say G(m,γ) is of order m (no. of vertexes) and size γ (no. of edges).

Sometimes in the following it will be useful to refer to the subscript pairs (i,h) instead of the corresponding edges e_{i,h} ≡ [(Y_i),(Y_h)]. For this we consider the set Γ(γ,m) corresponding to the set U(γ,m) such that

(26)  (i,h) ∈ Γ(γ,m)  <==>  e_{i,h} ∈ U(γ,m).

We remark that there is always the correspondence G(m,γ) <--> U(γ,m) <--> Γ(γ,m).

With our graph G(m,γ) there is associated a symmetric square matrix G_{m×m} = ((g_{ih})) of rank m called the "incidence matrix" of the graph G(m,γ):

g_{ih} = 1 if Y_i × Y_h, g_{ih} = 0 otherwise;  Σ_{h=1}^m g_{ih} = γ(i),  Σ_{i=1}^m Σ_{h=1}^m g_{ih} = 2γ

(γ an integer with m/2 ≤ γ ≤ m(m-1)/2).

A r.v. Y_(m) can have a graph-independence of type G(r,γ) with r < m if only r of the m marginal r.v.'s Y_i are graph-independent: in this case we say Y_(m) has a marginal graph-independence (of order r).

Assuming G_Y ≡ G(m,γ), there exist three sorts of graph-independence, as shown in the following definitions A, B and C.
A) Connected graph-independence (of type G_{m,γ}) - For any two vertexes v_i and v_h of G(m,γ), if there exists always a simple elementary chain [v_i, v_k, ..., v_h] which joins such two vertexes, then the symmetric graph G(m,γ) is said to be strongly connected. We write in this case G(m,γ) ≡ G_{m,γ}, U(γ,m) ≡ U_{γ,m}, and say that the graph-independence possessed by Y_(m) is connected, i.e., of connected type G_{m,γ}. In other words, the r.v. Y_(m) has connected graph-independence of type G_{m,γ} if there exists an arrangement (i_1, i_2, ..., i_m) of the subscripts i = 1,2,...,m such that each r.v. Y_{i_k} (k = 2,3,...,m) is independent of at least one preceding r.v. Y_{i_h} (k > h = 1,2,...,m-1).

B) Complete graph-independence - A r.v. Y_(m) has complete graph-independence if its m marginal components Y_i are two by two independent r.v.'s. That is, if Y_(m) has connected graph-independence of complete type G_{m,m(m-1)/2}. Here γ = m(m-1)/2 means g_{ih} = 1 always when i ≠ h, i.e., there exists an edge e_{ih} ∈ U_Y for each pair of vertexes of G_Y: in this case the graph G_Y ≡ G_{m,m(m-1)/2} is said to be complete.

Obviously a sufficient condition for Y_(m) to have complete graph-independence is that its m marginal components Y_1, Y_2, ..., Y_m are all mutually independent r.v.'s: in this case we will simply say that Y_(m) has complete independence. From known properties a necessary condition for Y_(m) to have complete independence is that Y_(m) must be of rank m. We shall prove that the same necessary condition holds for any r.v. Y_(m) to have complete graph-independence. In fact let us assume Y_1 × Y_h, h = 2,3,...,m, and that Y_(m) has rank r = m-1 without constant marginal components. Then necessarily the m r.v.'s Y_i are bound through a linear function: i.e., Σ_{i=1}^m a_i Y_i = c where a_i ≠ 0, i = 1,2,...,m, and c is a constant.

Having recourse to the Principle of the formal additivity of cumulants (I.3) we find
I
I
and, by Lemma 1,
m
-...,>
= 0
m
m
m
1: a. Y.) = 1:
E aia. cov(Y. , Y1J + E a 2 var (Y.) =
i=l ~ ~
i=l k=l.it
~ ~
i=l i
~
A 2 = va.r (
cov(Yi' Yk ) and va.r(Y ) exi.st, i,k = 1,2, ... ,m •
i
M:>reover when cov(Yi' Yk ) exists we know tbat Y ~ Y ==> cov(Y , Y ) = 0 •
i
k
i k
Then, considering Y_1 = Σ_{h≠1} (-a_h/a_1) Y_h + c/a_1, we have by assumption

cov(Y_i, Y_1) = cov( Y_i, Σ_{h≠1} (-a_h/a_1) Y_h + c/a_1 ) = Σ_{h≠1, h≠i} (-a_h/a_1) cov(Y_i, Y_h) - (a_i/a_1) var(Y_i) = -(a_i/a_1) var(Y_i) ≠ 0   (i ≠ 1).

This result implies that the two above considered assumptions are in disagreement with each other, and if we want to retain the assumption Y_i ⊥ Y_1 (⟹ cov(Y_i, Y_1) = 0), i = 2, 3, ..., m, necessarily Y_(m) must be of rank r = m. Obviously if we assume r = m-2 we shall find the same kind of disagreement between assumptions.
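The covariance computation above can be verified exactly on a small discrete example. The following sketch is illustrative code, not part of the paper; the three-variable case (m = 3), the particular discrete laws, and the coefficients a_1, a_2, a_3, c are all assumptions chosen for the demonstration. Y_2 and Y_3 are independent, Y_1 is forced by the linear constraint a_1 Y_1 + a_2 Y_2 + a_3 Y_3 = c, and the covariances are computed by exact enumeration.

```python
from fractions import Fraction as F

# Assumed independent discrete laws for Y2 and Y3 (value -> probability).
law2 = {F(0): F(1, 2), F(2): F(1, 2)}
law3 = {F(-1): F(1, 3), F(1): F(1, 3), F(3): F(1, 3)}
a1, a2, a3, c = F(2), F(1), F(-3), F(5)   # assumed coefficients, all nonzero

# Joint law of (Y1, Y2, Y3) under the constraint a1*Y1 + a2*Y2 + a3*Y3 = c.
joint = {}
for y2, p2 in law2.items():
    for y3, p3 in law3.items():
        y1 = (c - a2 * y2 - a3 * y3) / a1
        key = (y1, y2, y3)
        joint[key] = joint.get(key, F(0)) + p2 * p3

def mean(f):
    """E[f(Y1, Y2, Y3)] by exact enumeration of the joint law."""
    return sum(p * f(y) for y, p in joint.items())

def cov(i, j):
    """Covariance of coordinates i and j (0 -> Y1, 1 -> Y2, 2 -> Y3)."""
    mi = mean(lambda y: y[i])
    mj = mean(lambda y: y[j])
    return mean(lambda y: (y[i] - mi) * (y[j] - mj))
```

The enumeration reproduces cov(Y_i, Y_1) = -(a_i/a_1) var(Y_i) exactly, confirming that the linear bond is incompatible with Y_i ⊥ Y_1 for non-constant marginals.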
So we have proved the following

Theorem 1 - Given Y_(m), if there exists a marginal component Y_j such that Y_j ⊥ Y_i, i = 1, 2, ..., m (i ≠ j), then Y_(m) is necessarily of rank r = m.

C) Disconnected graph-independence (of type G(m,γ) = [G_{m_1,γ_1}, G_{m_2,γ_2}, ..., G_{m_r,γ_r}]) - A r.v. Y_(m) has disconnected graph-independence of type [G_{m_1,γ_1}, G_{m_2,γ_2}, ..., G_{m_r,γ_r}] if Y_(m) is composed of r multivariate marginal components Y^(k)_(m_k) = (Y_1^(k), Y_2^(k), ..., Y_{m_k}^(k)), each having connected graph-independence of type G_{m_k,γ_k}, and in such a way that no two marginal r.v.'s Y_i^(k), Y_j^(k') are independent when k ≠ k'. That is, the graph G(m,γ) is composed of r strongly connected graphs G_{m_k,γ_k} without connection edges among them. Obviously m_k ≥ 2, k = 1, 2, ..., r, and Σ_{k=1}^{r} m_k = m; if, for instance, m_r = 1 then the vertex is isolated and G_Y = G(m-1,γ).

Corollary 1.b - r = rank(Y_(m)) ≥ m̄ (≥ 2) for every strongly connected sub-graph G(m̄,γ̄) of the graph G_Y of Y_(m).
II.2 - Intrinsic Graph-Independence.

We will say a r.v. X_(n) (n ≥ 2) has intrinsic graph-independence (either connected or complete or disconnected) of type G(m,γ) (either G_{m,γ} or G_{m,m(m-1)/2} or [G_{m_1,γ_1}, ..., G_{m_r,γ_r}]) if there exists a matrix C_{m_1×n} such that the linearly transformed r.v. Y_(m_1) = C_{m_1×n} X_(n) has graph-independence of type G(m_1,γ_1) with γ_1 ≥ γ, where γ_1 - γ is the number of pairs [(Y_i),(Y_h)], i.e., of edges e*_{i,h}, whose independence relation Y_i ⊥ Y_h is not intrinsic. Also, as we shall see later, the relation Y_i ⊥ Y_h is not intrinsic to X_(n) when it necessarily implies for the r.v. X_(n) a condition of graph-independence which must be compatible with the other assumptions about X_(n) in each particular case. Moreover, the assumption of such a graph-independence for X_(n) in its turn implies explicitly the condition Y_i ⊥ Y_h. So that in such cases there is an extrinsic bond between particular assumptions of graph-independence for the r.v. Y_(m) and the r.v. X_(n).
Removing from G(m_1,γ_1) the set U* of the γ_1 - γ edges e*_{i,h} which do not carry intrinsic independence, we still obtain a symmetric graph: U^(γ_1,m_1) - U* = U^(γ,m) ↔ G(m,γ), where m_1 - m ≥ 0 is the number of vertexes v*_i which are extremities only of edges e*_{i,h} and thus remain isolated after the removal of such edges. Obviously γ_1 = γ ⟹ m_1 = m.

So, to obtain directly for X_(n) the graph-independence of type G(m,γ), we can remove from the matrix C_{m_1×n} the m_1 - m rows corresponding to the subscripts of the m_1 - m vertexes v*_i. We obtain a matrix C_{m×n} such that the r.v. Y_(m) = C_{m×n} X_(n) has graph-independence of type G(m,γ') with γ ≤ γ' (≤ γ_1), where γ' - γ is the number of edges of type e*_{i,h} belonging to the set U^(γ',m). Removing from G(m,γ') these γ' - γ edges we still obtain the graph G(m,γ) (↔ U^(γ,m) ⊆ U^(γ',m)), so that if (i,h) ∈ I^(γ,m) the independence Y_i ⊥ Y_h is intrinsic.

We remark first of all that if X_(n) has intrinsic graph-independence of type G(m,γ) with respect to the matrix C_{m×n}, it has also γ intrinsic complete graph-independencies of type G_{2,1} (the simplest one of the complete type), each with respect to one of the γ linear transformations

(Y_i, Y_h)' = C^(i,h)_{2×n} X_(n) ,

where (i,h) ∈ I^(γ,m) and

C^(i,h)_{2×n} = ( c_{i1} c_{i2} ... c_{in} )
                ( c_{h1} c_{h2} ... c_{hn} )

is the matrix formed with the i-th and h-th rows of C_{m×n}. Then the independence Y_i ⊥ Y_h is not intrinsic if the matrix C^(i,h)_{2×n} satisfies one of the two following cases.
A) Assuming for simplicity i = 1, h = 2, let us suppose that c_{1j} = 0 for j = n_1+1, ..., n and c_{2j} = 0 for j = 1, 2, ..., n_1 (n = n_1 + n_2). In such conditions we have the following

Theorem 2 - If (and only if) Y_1 ⊥ Y_2 then X^(1) ⊥ X^(2), i.e., X_j ⊥ X_{j'} for each pair (j,j') with j = 1, 2, ..., n_1; j' = n_1+1, n_1+2, ..., n.

X^(1) ⊥ X^(2) ⟹ Y_1 ⊥ Y_2, so we have only to prove Y_1 ⊥ Y_2 ⟹ X^(1) ⊥ X^(2). Let φ_{X^(1),X^(2)}(t^(1), t^(2)) be the c.f. of the r.v. (X^(1), X^(2)) = (X_1, X_2, ..., X_{n_1}; X_{n_1+1}, X_{n_1+2}, ..., X_n), with t^(1) = (t_1, t_2, ..., t_{n_1}) and t^(2) = (t_{n_1+1}, t_{n_1+2}, ..., t_n); let φ_{Y_1,Y_2}(u_1, u_2) = φ_{Y_1}(u_1) φ_{Y_2}(u_2) be the c.f. of the r.v. (Y_1, Y_2). Remembering (2) and (3) (t^(1) = c^(1) u_1, t^(2) = c^(2) u_2), we obtain by successive steps and comparisons that φ_{X^(1),X^(2)}(t^(1), t^(2)) = φ_{X^(1)}(t^(1)) φ_{X^(2)}(t^(2)), i.e., X^(1) ⊥ X^(2).

So the matrix

C^(1,2)_{2×n} = ( c_{11} c_{12} ... c_{1n_1}      0      ...    0    )
                (    0     0   ...    0      c_{2,n_1+1} ... c_{2n} )

and the assumption Y_1 ⊥ Y_2 necessarily imply graph-independence of type G_{n_1+n_2, n_1·n_2} for the r.v. X_(n_1+n_2). Conversely, this graph-independence implies extrinsically the condition Y_1 ⊥ Y_2.
B) Let us suppose that X_(n) has complete graph-independence and

c_{1j} ≠ 0, c_{2j} = k c_{1j} ≠ 0 for j = 1, 2, ..., n_1;
c_{1j} ≠ 0, c_{2j} = 0 for j = n_1+1, ..., n_1+n_2;
c_{1j} = 0, c_{2j} ≠ 0 for j = n_1+n_2+1, n_1+n_2+2, ..., n_1+n_2+n_3 = n.

Setting Σ_{j=1}^{n_1} c_{1j} X_j = X, Σ_{j=n_1+1}^{n_1+n_2} c_{1j} X_j = Y, Σ_{j=n_1+n_2+1}^{n} c_{2j} X_j = Z, the linear function (Y_1, Y_2)' = C^(1,2)_{2×n} X_(n) becomes

( Y_1 )   (  X + Y )
( Y_2 ) = ( kX + Z ) ,

where X, Y and Z are mutually independent r.v.'s, since so are the r.v.'s X_j. In this case we have the following.
Theorem 3 - Assuming Y ⊥ Z, if (and only if) Y_1 ⊥ Y_2 then Y ⊥ X, Z ⊥ X and X is a constant r.v.

This result is a consequence of a more general Lemma (see Appendix) about the existence of the moments of all orders for the r.v. (X,Y,Z) when we assume only the independence Y_1 ⊥ Y_2.

So the matrix

C^(1,2)_{2×n} = (  c_{11}  c_{12} ...  c_{1n_1}  c_{1,n_1+1} ... c_{1,n_1+n_2}       0        ...    0    )
                ( kc_{11} kc_{12} ... kc_{1n_1}      0       ...      0       c_{2,n_1+n_2+1} ... c_{2n} ) ,

whose common columns form a block of rank 1, and the assumptions Y_1 ⊥ Y_2 and Y ⊥ X, Z ⊥ X imply that X is a constant r.v. Thus each X_j (j = 1, 2, ..., n_1) is a constant r.v.

Conversely, if X is a constant r.v., then the assumption Y ⊥ Z implies extrinsically the condition Y_1 ⊥ Y_2.
In the remainder of this Part II we will be interested in the case of complete graph-independence for the r.v. X_(n) without constant marginal r.v.'s. Then, if Y_i ⊥ Y_h, a necessary and sufficient condition for this independence to be intrinsic to X_(n) is rank(C^(i,h)_{2×n}) = 2; if the rank is 1 we are necessarily in the Case A).

Since, given a r.v. X_(n) of rank n, the rank of the linearly transformed r.v. Y_(m) = C_{m×n} X_(n) for any matrix C_{m×n} is r = rank(C_{m×n}) ≤ min{m,n}, then if we assume that Y_(m) has complete graph-independence it follows from known properties that r = m ≤ n (it is impossible for a r.v. Y_(m) of rank r < m to have complete graph-independence: see Corollary 1.a).

So, in general, if X_(n) has intrinsic graph-independence of type G(m,γ), then, remembering that X_(n) also has intrinsic graph-independence of any type given by a sub-graph G(m',γ') ⊆ G(m,γ), for any such sub-graph we must have m' ≤ n.
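The rank condition on row pairs of C can be tested directly. The sketch below is illustrative code, not from the paper; the particular 3×4 matrix is an assumed example, and exact rational arithmetic is used so that the rank decision is not blurred by floating point. A 2×n matrix has rank 1 exactly when all its 2×2 minors vanish, i.e., when the two rows are proportional.

```python
from fractions import Fraction as F

def rank2(row_i, row_h):
    """Rank of the 2 x n matrix formed by two rows (exact arithmetic)."""
    ri = [F(x) for x in row_i]
    rh = [F(x) for x in row_h]
    if all(x == 0 for x in ri) and all(x == 0 for x in rh):
        return 0
    n = len(ri)
    # rank 1 iff every 2x2 minor vanishes, i.e. the rows are proportional
    for j in range(n):
        for k in range(j + 1, n):
            if ri[j] * rh[k] - ri[k] * rh[j] != 0:
                return 2
    return 1

C = [[1, 2, 0, 0],    # assumed 3 x 4 coefficient matrix, for illustration
     [2, 4, 0, 0],    # proportional to the first row: rank-1 pair
     [0, 0, 1, -1]]
```

Row pairs of rank 2 are candidates for intrinsic independence; a rank-1 pair falls into the degenerate situation excluded above.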
In this connection we introduce now the name of aggregation degree μ of a graph G(m,γ), to denote the maximum of the order possessed by any connected sub-graph G(m̄,γ̄) = [(Y_i),(e_ih)] ⊆ G(m,γ) (see II.1). Remembering Corollary 1.b, in any case we have

(28)  μ ≤ r = rank(C_{m×n}) ≤ min(m,n)   (μ ≥ 2).

So, given the linear transformation Y_(m) = C_{m×n} X_(n), for any graph G(m,γ) representing the type of intrinsic graph-independence of X_(n), its aggregation degree μ must be at most equal to n. Thus the assumption of an intrinsic graph-independence for a r.v. X_(n) of rank n must satisfy such a condition.
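The aggregation degree of a given graph is the largest order among its connected sub-graphs, i.e., the size of its biggest connected component of edges. A minimal sketch (illustrative code; the graph below and the shape of C are assumed for the example) checks condition (28) by ordinary component labelling:

```python
def aggregation_degree(m, edges):
    """Largest order of a connected sub-graph of the symmetric graph
    on vertices 1..m: the size of its biggest connected component."""
    adj = {v: set() for v in range(1, m + 1)}
    for i, h in edges:
        adj[i].add(h)
        adj[h].add(i)
    seen, best = set(), 0
    for start in adj:
        if start in seen or not adj[start]:
            continue                  # isolated vertices carry no edge
        comp, stack = set(), [start]
        while stack:                  # depth-first component labelling
            v = stack.pop()
            if v not in comp:
                comp.add(v)
                stack.extend(adj[v] - comp)
        seen |= comp
        best = max(best, len(comp))
    return best

m, n = 5, 4                           # assumed shape of C (5 x 4)
edges = [(1, 2), (2, 3), (4, 5)]      # two connected blocks: orders 3 and 2
mu = aggregation_degree(m, edges)
```

Here μ = 3 ≤ min(m, n) = 4, so the graph is admissible as a type of intrinsic graph-independence under (28).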
When Y_(n) = C_{n×n} X_(n) has complete independence, one will say simply that X_(n) has intrinsic independence with regard to the matrix C_{n×n}. We remark that this case is related to the well-known case of a r.v. X_(n) of rank n for which, in the n-dimensional linear space R_n = [x_1, x_2, ..., x_n], there exist n axes of independence [y_1, y_2, ..., y_n] referred to the axes [x_1, x_2, ..., x_n] by the non-singular linear function Y_(n) = C_{n×n} X_(n): that is, the r.v. Y_(n) = C_{n×n} X_(n) has complete independence. A necessary and sufficient condition for this complete independence to be intrinsic to X_(n) is that each pair of rows of C_{n×n} satisfies neither the case A) nor the case B) (if X_(n) has complete graph-independence). The notion of intrinsic independence has been introduced by G. Pompilj [10], [13].

In contradistinction to the independence situation of intrinsic type, we can say that if X_(n) has complete independence (mutual independence among the marginal r.v.'s X_j) then such an independence is extrinsic.
II.3 - Residues of a r.v. having Intrinsic Graph-Independence Subordinately to a Trivial Matrix.

If the r.v. X_(n) has intrinsic graph-independence of type G(m,γ) subordinately to the matrix C_{m×n}, we will say that the marginal r.v. X_j is a residue of X_(n) with regard to G(m,γ), and the matrix C_{m×n} is trivial with regard to both the j-th column and G(m,γ), if

(30)  S_j = Σ_{(i,h) ∈ I^(γ,m)} ε_ij ε_hj = 0 ,

where the ε_ij's are the elements of the matrix ((ε_ij))_{m×n} associated to the matrix C_{m×n}, such that

ε_ij = 0 if c_ij = 0 ;  ε_ij = 1 if c_ij ≠ 0 .

Since the incidence matrix of the graph G^(γ,m) is

(31)  g_ih = 1 if (i,h) ∈ I^(γ,m) (Y_i ⊥ Y_h, and this independence is intrinsic) ;  g_ih = 0 if (i,h) ∉ I^(γ,m) (otherwise),

then (30) can be expressed by

(32)  S_j = Σ_{i=1}^{m} Σ_{h=1}^{m} g_ih ε_ij ε_hj = 0   (⟺ X_j is a residue).
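Condition (32) is a finite sum and can be evaluated directly. The sketch below is illustrative code, not from the paper; the particular incidence matrix g and coefficient matrix C are assumed for the example. Column j is marked as a residue exactly when S_j = Σ g_ih ε_ij ε_hj vanishes.

```python
def residues(C, g):
    """Indices j (0-based) with S_j = sum_{i,h} g[i][h]*eps[i][j]*eps[h][j] == 0."""
    m, n = len(C), len(C[0])
    eps = [[1 if C[i][j] != 0 else 0 for j in range(n)] for i in range(m)]
    out = []
    for j in range(n):
        S = sum(g[i][h] * eps[i][j] * eps[h][j]
                for i in range(m) for h in range(m))
        if S == 0:
            out.append(j)
    return out

# Assumed example: m = 2 rows, n = 3 columns, with the single intrinsic
# independence pair (Y_1, Y_2), so g is the 2 x 2 incidence matrix below.
g = [[0, 1],
     [1, 0]]
C = [[1, 1, 0],
     [1, 0, 2]]
```

Only the first column (a common component of Y_1 and Y_2) has S_j ≠ 0; the second and third columns, each entering only one of the two forms, are residues.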
Let us assume now that X_(n) has both complete independence (extrinsic independence) and intrinsic graph-independence of type G(m,γ) with k residues with regard to the trivial matrix C_{m×n}. Without loss of generality, let us suppose such residues are the last k r.v.'s: X_{n-k+1} = X*_1, X_{n-k+2} = X*_2, ..., X_n = X*_k. Thus S_j = 0 for j = n-k+1, n-k+2, ..., n.

Consider each r.v. X_j as a linear component of the r.v.'s Y_i (= Σ_j c_ij X_j): if c_ij = 0 then the r.v. X_j is not a component of the r.v. Y_i, so that the r.v. Y_i is independent of X_j. Moreover, if X_j is a residual component of the r.v. Y_i (c_ij ≠ 0), then by definition the same X_j cannot be a residual component of any other r.v. Y_h such that (i,h) ∈ I^(γ,m); for if X_j is a common component of Y_i and Y_h with (i,h) ∈ I^(γ,m), then X_j is not residual (⟹ S_j ≠ 0). Then, putting X*_j = X_p^(i) and c_ij = c_p^(i) (≠ 0), where X_p^(i) is the p-th residual component of the r.v. Y_i (p = 1, 2, ..., p_i ≤ k), we finally find Y_h ⊥ (X_1^(i), X_2^(i), ..., X_{p_i}^(i)) and (X_1^(h), X_2^(h), ..., X_{p_h}^(h)) ⊥ Y_i for each (i,h) ∈ I^(γ,m). So that, setting

W_i = Σ_{j=1}^{n-k} c_ij X_j ,   R_i = Y_i - W_i ,

in correspondence with each pair (i,h) ∈ I^(γ,m) we find R_i ⊥ W_h and Y_i ⊥ W_h. By assumption Y_i ⊥ Y_h; thus W_i ⊥ W_h.

That is, to each r.v. pair [Y_i, Y_h], (i,h) ∈ I^(γ,m), there corresponds the r.v. pair (W_i, W_h) with W_i ⊥ W_h. Therefore the r.v. X_(n-k) = (X_1, X_2, ..., X_{n-k}) has intrinsic graph-independence without residues of the same type G(m,γ) as X_(n), with regard to the non-trivial matrix C_{m×(n-k)} formed by the first (n-k) columns of C_{m×n}:

(34)  the two r.v.'s Y_(m) and W_(m) are equivalent with respect to the equivalence relation established by the common graph-independence of type G(m,γ).

If we indicate by r' ≤ min(m, n-k) the rank of the matrix C_{m×(n-k)}, then for the aggregation degree of G(m,γ) (see (28)) we must find

μ(G(m,γ)) ≤ r' ≤ min(m, n-k) .

Thus, to be in agreement with the assumption of intrinsic graph-independence, the number k of residues must satisfy the following inequality:

(35)  k ≤ n - μ(G(m,γ)) .

So, for instance, when X_(n) has intrinsic complete graph-independence (type G_{n,n(n-1)/2}), then this one is always without residues, since μ = μ(G_{n,n(n-1)/2}) = n.
II.4 - Now we shall prove the proposition (b') and therefore the propositions (a') and (c') (I.1).

For let us assume X_(n) has both (extrinsic) complete independence and intrinsic graph-independence of general type G(m,γ) without residues with respect to the non-trivial matrix C_{m×n}. From the definitions and properties considered above this means:

i) X_(n) has marginal components X_j (j = 1, 2, ..., n) mutually independent;

ii) C_{m×n} X_(n) = Y_(m) has graph-independence of type G(m,γ_1) ⊇ G(m,γ);

iii) [Y_i, Y_h] ∈ I^(γ,m) ⟺ Y_i ⊥ Y_h intrinsically;

iv) rank ( c_{i1} c_{i2} ... c_{in} ; c_{h1} c_{h2} ... c_{hn} ) = 2 for each pair (i,h) ∈ I^(γ,m);

v) S_j = Σ_{i,h} g_ih ε_ij ε_hj ≠ 0 for each j = 1, 2, ..., n (see (32));

(36)  that is, the matrix C_{m×n} satisfies the conditions iv) and v).

When, in particular, we consider X_(n) as having each of the γ intrinsic graph-independences of type G_{2,1} established by the linear transformations (Y_i, Y_{i_1})' = C^(i,i_1)_{2×n} X_(n), in general we can have c_ij c_{i_1 j} = 0 for some A subscripts j (A = A(i,i_1) ≤ n-2); that is, in general, each of the γ intrinsic graph-independences of type G_{2,1} possessed by X_(n) with regard to C^(i,i_1)_{2×n} can be with residues.
Considering the sequence {X_j} ≡ {X_1, X_2, ..., X_n}, let us assume:

i) X_j = X'_a and c_rj = c'_ra (r = i, i_1) if X_j is the a-th r.v. in the sequence {X_j} which is not residual: c'_ia c'_{i_1 a} ≠ 0, with a = a(i,i_1) = 1, 2, ..., n-A and A = A(i,i_1);

ii) X_j = X_b and c_ij = c_ib if X_j is the b-th residual r.v. in the sequence {X_j} such that c_ij = c_ib ≠ 0 (c_{i_1 j} = 0), with b = b(i,i_1) = 1, 2, ..., B (B = B(i,i_1)): the B r.v.'s X_b are not components of Y_{i_1};

iii) X_j = X_{b_1} and c_{i_1 j} = c_{i_1 b_1} if X_j is the b_1-th residual r.v. in the sequence {X_j} such that c_{i_1 j} = c_{i_1 b_1} ≠ 0 (c_ij = 0), with b_1 = b_1(i,i_1) = 1, 2, ..., B_1 (B_1 = B_1(i,i_1)): the B_1 r.v.'s X_{b_1} are not components of Y_i.

Obviously A + B + B_1 ≤ n, and n - A - B - B_1 is the number of subscripts j such that c_ij = c_{i_1 j} = 0.
Let us now assume

R_i = Σ_{b=1}^{B} c_ib X_b   and   R_{i_1} = Σ_{b_1=1}^{B_1} c_{i_1 b_1} X_{b_1} ,

from which

Z_i = Y_i - R_i ,   Z_{i_1} = Y_{i_1} - R_{i_1} .

Now the residual sets {X_b} and {X_{b_1}} do not have common components (any common component is not residual), so that {X_b} ⊥ {X_{b_1}} and (Y_i does not depend on the r.v.'s X_{b_1}, and Y_{i_1} does not depend on the r.v.'s X_b) Y_i ⊥ R_{i_1}, Y_{i_1} ⊥ R_i, R_i ⊥ R_{i_1}. Moreover, by assumption, Y_i ⊥ Y_{i_1}. Therefore we have Z_i ⊥ Z_{i_1}.
So the linear transformation (Z_i, Z_{i_1})' satisfies the assumptions of the Basu-Darmois Theorem. Thus the r.v.'s X'_a are normal r.v.'s.

Now, since the intrinsic graph-independence of X_(n) of type G(m,γ) is assumed without residues, each r.v. X_j (j = 1, 2, ..., n) must appear as a r.v. of type X'_a for at least one pair (i,i_1) ∈ I^(γ,m): a = a(i,i_1). In fact, by definition of S_j (30) it must be

S_j ≠ 0 ⟹ ∃ (i,i_1) ∈ I^(γ,m): ε_ij ε_{i_1 j} ≠ 0 ⟹ c_ij c_{i_1 j} = c'_ia c'_{i_1 a} ≠ 0, with a = a(i,i_1),

so that X_j = X'_a. Therefore, considering in turn all the γ intrinsic graph-independences of type G_{2,1}, we find from the Basu-Darmois Theorem that all the r.v.'s X_j (j = 1, 2, ..., n) are normal r.v.'s. In this way we finally obtain the following

Theorem 4 - If the r.v. X_(n) has both complete independence and intrinsic graph-independence without residues, then X_(n) is a normal r.v.
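The Basu-Darmois mechanism behind Theorem 4 can be glimpsed in a small exact computation. The sketch below is illustrative code, not from the paper; the centered, skewed discrete law for the i.i.d. pair X_1, X_2 is an assumption made for the example. For centered i.i.d. variables with third central moment μ_3, expanding (X_1+X_2)(X_1-X_2)² gives cov(X_1+X_2, (X_1-X_2)²) = 2μ_3, which is nonzero whenever the law is skewed; hence Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2 cannot be independent unless the skewness vanishes, as it does for the normal law.

```python
from fractions import Fraction as F

# Assumed centered, skewed discrete law for X1 and X2 (i.i.d.): mean 0.
law = {F(-1): F(2, 3), F(2): F(1, 3)}

def E(f):
    """Expectation of f(x1, x2) under the product law (exact enumeration)."""
    return sum(p1 * p2 * f(x1, x2)
               for x1, p1 in law.items() for x2, p2 in law.items())

mu3 = sum(p * x**3 for x, p in law.items())         # third central moment

# cov(Y1, Y2^2) with Y1 = X1 + X2, Y2 = X1 - X2; here E[Y1] = 0.
cov_y1_y2sq = (E(lambda a, b: (a + b) * (a - b)**2)
               - E(lambda a, b: a + b) * E(lambda a, b: (a - b)**2))
```

The enumeration confirms cov(Y_1, Y_2²) = 2μ_3 ≠ 0 for this skewed law, an elementary obstruction to the independence of the two linear forms.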
Now we remark that (36) is the condition considered in proposition (b') (I.1). Thus the matrix C_{m×n}, with respect to which X_(n) has intrinsic graph-independence without residues, belongs to the class (M)^(γ,m) defined in proposition (b'). Therefore Theorem 4 coincides with the proposition (b'), which is thus proved true.

We have already said in I.1 that proposition (a') is a particular case of proposition (b'), and that proposition (c') is an immediate consequence of proposition (b'). With the new terminology, propositions (a') and (c') become respectively the following:

Corollary 4.a - If the r.v. X_(n) has both complete independence and intrinsic complete graph-independence (rank m ≤ n) without residues, then X_(n) is a normal r.v.
Corollary 4.b - If the r.v. X_(n) has both intrinsic independence and intrinsic graph-independence of type G(m_1,γ_1), the former with respect to a non-singular matrix A_{n×n} and the latter with regard to a matrix B_{m×n}, and if A_{n×n} X_(n) has intrinsic graph-independence of type G(m,γ) ⊆ G(m_1,γ_1) without residues with regard to the non-trivial matrix B_{m×n} A_{n×n}^{-1}, then X_(n) is a normal r.v.

a) If X_(n) has intrinsic complete graph-independence without residues with regard to a matrix C = C_{m×n} of rank m ≤ n, then for every subscript pair (i,i_1) it is Y_i ⊥ Y_{i_1}, i.e., γ = m(m-1)/2. Thus the condition (36) becomes:

i) for each pair (i,i_1): rank(C^(i,i_1)_{2×n}) = 2;

ii) for each j: c_ij c_{i_1 j} ≠ 0 for at least one pair (i,i_1);

which is the condition considered in proposition (a') (I.1) and is a particular case of the condition (36) leading to Theorem 4. So proposition (a') is proved.

b) Since (B_{m×n} A_{n×n}^{-1})(A_{n×n} X_(n)) = Y_(m) has graph-independence of type G(m,γ) without residues which is intrinsic to the r.v. Z_(n) = A_{n×n} X_(n), and since Z_(n) has complete independence (which is intrinsic to the r.v. X_(n)), then by Theorem 4 Z_(n) is a normal r.v. But by inversion X_(n) = A_{n×n}^{-1} Z_(n): so X_(n) is a normal r.v., too. Proposition (c') is thus proved (in proposition (c') X_(n) was represented by ξ_(n)).
Remembering propositions (a), (b) and (c) (I.1), from the preceding results we obtain the following

Theorem 5 - If a r.v. X_(n) satisfies the assumptions of either Theorem 4 or Corollary 4.a or Corollary 4.b, then X_(n) has infinitely many intrinsic graph-independences. That is, if a r.v. X_(n) has complete independence (either extrinsic or intrinsic), then either this complete independence is the only one possessed by X_(n) or, if there exists another graph-independence which is intrinsic and without residues with regard to the former complete independence, then X_(n) has infinitely many intrinsic graph-independences, since necessarily it is a normal r.v. This proposition generalizes a result of G. Pompilj [10], [13] given for n = m = 2 in terms of axes of independence (conjugated diameters).
II.5 - Let us now consider the case of a r.v. X_(n) which has complete independence and intrinsic graph-independence of type G(m,γ) with residues with regard to the trivial matrix C_{m×n}. We have already considered this case in II.3, finding that the r.v. X_(n-k) obtained by removing from X_(n) the k residual marginal r.v.'s X*_j still has intrinsic graph-independence of the same type G(m,γ) (34).

Applying Theorem 4 to the results obtained in II.3 we have the following:

Theorem 6 - If the r.v. X_(n) has both complete independence and intrinsic graph-independence with residues, then the marginal components X_j which are non-residual r.v.'s are normal.

This Theorem can have a Corollary similar to Corollary 4.a (there, by (35), the residues are at most n-m) but cannot have one similar to Corollary 4.b. Still remembering that linear functions of one or more normal r.v.'s are normal r.v.'s, from Theorem 6 we obtain the following:

Corollary 6.a - Theorem 4, Corollary 4.a, Corollary 4.b and Theorem 5 are still true even when the assumption "without residues" is changed into the assumption "with residues", if it is also assumed that each residual r.v. has the same distribution function as some linear function of some of the non-residual r.v.'s.
APPENDIX

In connection with Theorem 3 (II.2) we now prove the following

Lemma 2 - Given two independent r.v.'s W_1 and W_2, if we assume W_1 = X + Y, W_2 = X + Z, then necessarily the r.v. (X,Y,Z) has finite moments m_{a,b,c} = E(X^a Y^b Z^c) of all orders, and for the cumulants λ_{a,b,c} the following relation holds:

Σ_{b=0}^{h} Σ_{c=0}^{k} (h choose b)(k choose c) λ_{h+k-b-c,b,c} = 0   (h ≥ 1, k ≥ 1).

Remembering the definitions and properties in I.1 and I.3, and considering the function π = log φ, since

π_{W_1,W_2}(u,v) = π_{X,Y,Z}(u+v, u, v)

and

π_{W_1}(u) + π_{W_2}(v) = π_{X,Y,Z}(u,u,0) + π_{X,Y,Z}(v,0,v) ,

that is, setting π_{X,Y,Z} = π,

(39)  π(u+v,u,v) - π(u,u,0) - π(v,0,v) = ε(u,v)

with ε(u,v) = 0 everywhere on the plane [u,v].

Having recourse to the Principle of the formal additivity of the cumulants (I.3), let us consider the formal McLaurin expansion of the function

π_{X,Y,Z}(t,u,v) = Σ_{a=0}^{∞} Σ_{b=0}^{∞} Σ_{c=0}^{∞} (i^{a+b+c} / (a! b! c!)) λ_{a,b,c} t^a u^b v^c .

Then formally we have:
i) π(u+v,u,v) = Σ_a Σ_b Σ_c (i^{a+b+c}/(a!b!c!)) λ_{a,b,c} Σ_{r=0}^{a} (a choose r) u^{r+b} v^{a-r+c} = Σ_r Σ_s Σ_b Σ_c (i^{r+s+b+c}/(r!s!b!c!)) λ_{r+s,b,c} u^{r+b} v^{s+c} ;

ii) π(u,u,0) = Σ_r Σ_b (i^{r+b}/(r!b!)) λ_{r,b,0} u^{r+b} = Σ_{h=0}^{∞} (i^h/h!) [ Σ_{b=0}^{h} (h choose b) λ_{h-b,b,0} ] u^h   (a = r, s = 0, c = 0) ;

iii) π(v,0,v) = Σ_s Σ_c (i^{s+c}/(s!c!)) λ_{s,0,c} v^{s+c} = Σ_{k=0}^{∞} (i^k/k!) [ Σ_{c=0}^{k} (k choose c) λ_{k-c,0,c} ] v^k   (a = s, r = 0, b = 0).

Then, setting h = r+b and k = s+c, and observing that the terms of i) with k = 0 reproduce ii) and those with h = 0 reproduce iii), we obtain

ε(u,v) = Σ_{h=1}^{∞} Σ_{k=1}^{∞} (i^{h+k}/(h!k!)) [ Σ_{b=0}^{h} Σ_{c=0}^{k} (h choose b)(k choose c) λ_{h+k-b-c,b,c} ] u^h v^k .

But

ε(u,v) = Σ_{h=1}^{∞} Σ_{k=1}^{∞} (i^{h+k}/(h!k!)) ε_{h,k} u^h v^k = 0 ⟺ ε_{h,k} = 0 for every pair (h,k).

So, by equating corresponding terms of the two formal McLaurin expansions of the opposite hand-sides of (39), we obtain

(40)  Σ_{b=0}^{h} Σ_{c=0}^{k} (h choose b)(k choose c) λ_{h+k-b-c,b,c} = 0   for each pair (h,k), h ≥ 1, k ≥ 1.

By Lemma 1 (7), necessarily the cumulants λ_{a,b,c} of the first and second order are finite; then all the cumulants λ_{a,b,c} are finite. Hence all the moments m_{a,b,c} are finite.
In particular, putting h = 1, k = 1, we have

(41)  λ_{2,0,0} + λ_{1,1,0} + λ_{1,0,1} + λ_{0,1,1} = var(X) + cov(X,Y) + cov(X,Z) + cov(Y,Z) = cov(W_1, W_2) = 0 .

If we assume that X, Y, Z are mutually independent r.v.'s, then necessarily cov(X,Y) = cov(X,Z) = cov(Y,Z) = 0, and this implies var(X) = 0 ⟺ X ≡ constant r.v. So Theorem 3 is proved, putting W_1 = Y_1, W_2 = Y_2.
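The decomposition of cov(W_1, W_2) appearing in (41) can be confirmed by direct enumeration. The sketch below is illustrative code, not from the paper; the discrete laws for X, Y, Z are assumptions chosen for the example. With X, Y, Z mutually independent, the three covariance terms vanish and cov(W_1, W_2) reduces exactly to var(X).

```python
from fractions import Fraction as F
from itertools import product

# Assumed independent discrete laws (value -> probability).
lawX = {F(0): F(1, 2), F(1): F(1, 2)}
lawY = {F(-1): F(1, 3), F(2): F(2, 3)}
lawZ = {F(1): F(1, 4), F(5): F(3, 4)}

def E(f):
    """Expectation of f(x, y, z) under the product law (exact enumeration)."""
    return sum(px * py * pz * f(x, y, z)
               for (x, px), (y, py), (z, pz)
               in product(lawX.items(), lawY.items(), lawZ.items()))

def cov(f, g):
    return E(lambda x, y, z: f(x, y, z) * g(x, y, z)) - E(f) * E(g)

W1 = lambda x, y, z: x + y      # W1 = X + Y
W2 = lambda x, y, z: x + z      # W2 = X + Z
varX = cov(lambda x, y, z: x, lambda x, y, z: x)
```

So cov(W_1, W_2) = var(X) here, and W_1 ⊥ W_2 would force var(X) = 0, i.e., X constant, exactly as in the proof of Theorem 3.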
BIBLIOGRAPHY

[1] R. A. Fisher, "Applications of Student's distribution," Metron, Vol. 5, fasc. 3 (1925), p. 90.

[2] P. Lévy, "Propriétés asymptotiques des sommes des variables aléatoires indépendantes ou enchaînées," Jour. Math. pures et appl., Vol. 14, Series 9 (1935), p. 347.

[3] H. Cramér, "Über eine Eigenschaft der normalen Verteilungsfunktion," Math. Z., Vol. 41 (1936), p. 405.

[4] J. Marcinkiewicz, "Un théorème sur les fonctions caractéristiques," Bull. Académie Polonaise Sc. et Lettres (1940-41), p. 1.

[5] D. Dugué, "Analyticité et convexité des fonctions caractéristiques," Ann. Inst. H. Poincaré, Vol. 12 (1951), p. 45.

[6] D. Basu, "On the independence of linear functions of independent chance variables," Bull. Intern. Stat. Inst. (Intern. Stat. Conferences, Dec. 1951, India), Vol. 33, Part 2 (1953), p. 83.

[7] G. Darmois, "Sur diverses propriétés caractéristiques de la loi de probabilité de Laplace-Gauss," Bull. Intern. Stat. Inst. (Intern. Stat. Conferences, Dec. 1951, India), Vol. 33, Part 2 (1953), p. 79.

[8] M. Fréchet, "Généralisation de la Loi de Probabilité de Laplace," Ann. Inst. H. Poincaré, Vol. 12, fasc. 1 (1951), p. 1.

[9] G. Darmois, "Analyse générale des liaisons stochastiques - Etude particulière de l'Analyse Factorielle Linéaire," Review Intern. Stat. Inst., Vol. 21 (1953), p. 2.

[10] G. Pompilj, "On Intrinsic Independence," Bull. Intern. Stat. Inst. (Session 29, 1955, Rio de Janeiro), Vol. 35, Part 2 (1957), p. 91.

[11] H. Cramér, "Random Variables and Probability Distributions," Cambridge Tracts in Math., No. 36 (1937).

[12] H. Cramér, Mathematical Methods of Statistics, Princeton Univ. Press (1946).

[13] G. Pompilj, Teoria dei Campioni, Ed. Univ. Veschi, Rome (1952).

[14] D. Dugué, Arithmétique des lois de probabilité, Mémorial Sc. Math., No. 137, Gauthier-Villars, Paris (1957).

[15] D. Dugué, Traité de statistique théorique et appliquée, Masson, Paris (1958).

[16] C. Berge, The Theory of Graphs and Its Applications, Methuen, London; Wiley, New York (1962) (translated from Théorie des graphes et ses applications, Dunod, Paris (1958)).