Part I. DYNKIN’S LEMMA AND ITS APPLICATIONS
In this part we are going to prove a technical lemma that is used throughout mathematical probability theory whenever one needs to prove that some statement holds for all sets in some σ-algebra. Dynkin’s lemma allows one to do this in two steps:
• First we show the statement for a class of simple sets which generates the σ-algebra and is closed under finite intersections.
• Then we prove that the collection of all sets for which the statement holds satisfies
three simple conditions (conditions a) - c) below).
While the statement of the lemma does not look very impressive, its applications are.
Some are included below; you will see many more during this course.
Definitions:
• Let S be a set and let D be a collection of subsets of S. Then D is called a d-system (to honor E. B. Dynkin) if:
a) S ∈ D.
b) if A, B ∈ D and A ⊆ B then B\A ∈ D.
c) if An is an increasing sequence of elements of D, then ⋃n An ∈ D.
• A collection I of subsets of S is called a π-system if it is closed under finite intersections (for example, the intervals (−∞, x], x ∈ R, form a π-system generating the Borel σ-algebra of R).
1. Dynkin’s lemma: if I is a π-system and D is the smallest d-system containing I, then D is the σ-algebra generated by I.
Here is a way to organize a proof:
• Show that a collection of subsets of S is a σ-algebra if and only if it is both a π-system and a d-system. This is obvious in one direction and fairly easy in the other (a sketch of the needed identities follows after this outline). We thus only need to show that D is a π-system. This is done in two steps:
• First, prove that D1 = {B ∈ D : B ∩ C ∈ D for all C ∈ I} is a d-system and, consequently, D1 = D.
• Next, let D2 = {A ∈ D : A ∩ B ∈ D for all B ∈ D}. Prove that D2 is a d-system, which, by the previous step, contains I.
• Conclude that D2 = D and show that this ends the proof of Dynkin’s lemma.
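As a hint for the first bullet of this outline (an added sketch, not part of the original text), the identities one needs in the nontrivial direction are roughly the following:

```latex
% Added sketch: a collection D that is both a pi-system and a d-system
% on S is a sigma-algebra.  For A, B, A_1, A_2, ... in D:
\begin{align*}
  A^{c} &= S \setminus A \in D
      && \text{by a) and b);}\\
  A \cup B &= S \setminus (A^{c} \cap B^{c}) \in D
      && \text{complements as above, the intersection by the $\pi$-system property;}\\
  \bigcup_{n} A_{n} &= \bigcup_{n} (A_{1} \cup \dots \cup A_{n}) \in D
      && \text{an increasing union of elements of $D$, hence in $D$ by c).}
\end{align*}
```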
2. Applications: Let I be a π-system of subsets of some space Ω and let µ1 and µ2 be two probability measures on the σ-algebra σ(I). Prove that if µ1 = µ2 on I, then µ1 = µ2 on σ(I); a sketch of the intended d-system argument appears after the three cases below. Apply this to the following specific cases:
a) Uniqueness of the Lebesgue measure.
Prove that there is at most one probability measure λ on the Borel σ-algebra of the unit interval such that for any a, b ∈ [0, 1], a < b, λ((a, b)) = b − a. This is, of course, the Lebesgue measure. The proof of its existence requires other methods (the Carathéodory extension theorem or the Riesz representation theorem).
b) Definition of a product measure.
Let (Ωj, Fj, Pj), j = 1, 2, be two probability spaces. Let Ω = Ω1 × Ω2 and consider the σ-algebra F on Ω generated by rectangles, i.e. by sets of the form A1 × A2, where Aj ∈ Fj. Prove that there is at most one probability measure P on F such that P[A1 × A2] = P1[A1] × P2[A2].
Such a measure (whose existence has to be shown by other means) is called the product
of measures P1 and P2 .
c) Uniqueness of the Kolmogorov extension.
Prove that the measure whose existence is asserted by the Kolmogorov extension theorem stated in class is unique. Again, this says nothing about existence.
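Here is a hedged sketch of the route problem 2 presumably has in mind (an addition, not the author’s text): apply Dynkin’s lemma to the class of sets on which the two measures agree. In case b), note that the rectangles form a π-system, since (A1 × A2) ∩ (B1 × B2) = (A1 ∩ B1) × (A2 ∩ B2).

```latex
% Added sketch for problem 2: the agreement class
%   D = \{ A \in \sigma(I) : \mu_1(A) = \mu_2(A) \}
% contains I by hypothesis and is a d-system:
\begin{align*}
  &\Omega \in D, \text{ since } \mu_{1}(\Omega) = 1 = \mu_{2}(\Omega); \\
  &A, B \in D,\ A \subseteq B \ \Rightarrow\ \mu_{i}(B \setminus A)
      = \mu_{i}(B) - \mu_{i}(A), \text{ so } B \setminus A \in D; \\
  &A_{n} \in D,\ A_{n} \uparrow A \ \Rightarrow\ \mu_{i}(A)
      = \lim_{n} \mu_{i}(A_{n}) \text{ (continuity from below), so } A \in D.
\end{align*}
% Dynkin's lemma then gives D = \sigma(I), i.e. \mu_1 = \mu_2 on \sigma(I).
```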
PART II. INDEPENDENCE
Definition: Let (Ω, F, P) be a probability space. A finite collection of sub-σ-algebras F1, F2, . . . , Fn of F is called independent if for every choice of Aj ∈ Fj, j = 1, 2, . . . , n,
P[A1 ∩ A2 ∩ . . . ∩ An] = ∏_{j=1}^{n} P[Aj].
An infinite collection of σ-algebras is defined to be independent if all its finite subcollections are independent.
1. a) Prove that random variables Xj : Ω → R are independent (in the sense of the
definition given in class) if and only if the generated σ-algebras are independent.
b) For any event A, consider the σ-algebra σ(A) generated by the single set A. How many elements does σ(A) have? Show that σ(A) = σ(IA)—the σ-algebra generated by the indicator function of A, defined by IA(ω) = 0 when ω ∉ A and IA(ω) = 1 when ω ∈ A. Prove that events Aδ, δ ∈ ∆, are independent if and only if the σ-algebras σ(Aδ) are independent.
c) Prove that events Aδ are independent if and only if their complements Acδ = Ω\Aδ are
independent.
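For orientation on 1 b) and c) (an added hint, not part of the original problem set), σ(A) is small enough to list explicitly, and the complement statement reduces to a single identity:

```latex
% Added hint: for an event A with \emptyset \neq A \neq \Omega,
\[ \sigma(A) = \{\, \emptyset,\ A,\ A^{c},\ \Omega \,\} \]
% (four elements); for c) the key computation, applied one event at a time, is
\[ P[A^{c} \cap B] = P[B] - P[A \cap B] = (1 - P[A])\, P[B] = P[A^{c}]\, P[B] \]
% whenever P[A \cap B] = P[A]\, P[B].
```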
2. Independence of random variables is not always obvious. This problem gives an
example, which is of importance in mathematical physics. Consider the two-dimensional
integer lattice Z2 . The points of the lattice are called sites. A segment connecting two
nearest-neighbor sites (x, y ∈ Z2 , |x − y| = 1) is called a bond. Bonds divide the plane R2
into squares of unit area, called plaquettes. This weird terminology comes from statistical
mechanics/quantum field theory. In graph theory sites are called vertices, bonds are edges
and plaquettes become faces. Now, assume that to each bond b we assign a random variable
Jb with the symmetric Bernoulli distribution:
P[Jb = 1] = P[Jb = −1] = 1/2.
The Jb are defined on some probability space and they are independent. For each plaquette p we define
Sp = ∏_{b∈∂p} Jb ,
where ∂p is the boundary of p, which consists of four bonds.
Prove that the random variables (Sp ) are independent and have the symmetric Bernoulli
distribution.
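As an informal numerical sanity check (an addition; the bond encoding and all names below are illustrative choices, not from the text), one can simulate independent signs Jb on the bonds of a small block of Z2 and estimate the distribution of the plaquette variables Sp:

```python
import itertools
import random

def estimate_plaquettes(n=3, trials=100_000, seed=0):
    """Monte Carlo sketch for Part II, problem 2.

    Bonds of Z^2 are encoded as
      ('h', x, y): horizontal bond from (x, y) to (x+1, y)
      ('v', x, y): vertical   bond from (x, y) to (x, y+1)
    The plaquette with lower-left corner (x, y) is bounded by the bonds
    ('h', x, y), ('h', x, y+1), ('v', x, y) and ('v', x+1, y).
    """
    rng = random.Random(seed)
    plaquettes = list(itertools.product(range(n), range(n)))
    ones = {p: 0 for p in plaquettes}   # how often S_p = 1
    pair = 0                            # how often S_(0,0) = S_(1,0) = 1

    for _ in range(trials):
        J = {}                          # lazily sampled bond signs

        def bond(b):
            if b not in J:
                J[b] = rng.choice((-1, 1))   # symmetric Bernoulli sign
            return J[b]

        S = {}
        for (x, y) in plaquettes:
            S[(x, y)] = (bond(('h', x, y)) * bond(('h', x, y + 1))
                         * bond(('v', x, y)) * bond(('v', x + 1, y)))
            if S[(x, y)] == 1:
                ones[(x, y)] += 1
        if S[(0, 0)] == 1 and S[(1, 0)] == 1:
            pair += 1

    print("P[S_p = 1] estimates (should all be near 1/2):")
    print({p: round(ones[p] / trials, 3) for p in plaquettes})
    print("P[S_(0,0) = 1 and S_(1,0) = 1] ~", round(pair / trials, 3),
          "(should be near 1/4 if the two are independent)")

estimate_plaquettes()
```

Note that the two adjacent plaquettes share the bond ('v', 1, 0), so their independence is not obvious a priori; the simulation only provides evidence, not a proof.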