1 Local martingales and Girsanov’s theorem
The purpose of this section is to formulate and prove Girsanov’s theorem on
absolutely continuous changes of measure for discrete time processes.
1.1 Local martingales
As always we have a probability space (Ω, F, P) on which all random variables
are defined. Before defining local martingales, we relax the notion of conditional
expectation of a random variable X by dropping the requirement that E|X| < ∞. Suppose X ≥ 0 and that G is a sub-σ-algebra of F. Then there exists a G-measurable random variable X̂ such that E[X 1_G] = E[X̂ 1_G] for all G ∈ G (Exercise 1.1); we again adopt the notation X̂ = E[X|G] for any version X̂. If X is such that E[X^+|G] < ∞ a.s. or E[X^−|G] < ∞ a.s., we define X̂ = E[X^+|G] − E[X^−|G], the generalized conditional expectation of X given G, denoted (again) E[X|G]. Note that also here we have different versions, and that these are a.s. equal.
In what follows we work with a fixed filtration F, without explicitly mentioning it further.
Definition 1.1 Let X be an adapted process. If there exists a sequence of stopping times T^n such that T^n → ∞ and such that the stopped processes X^{T^n} are all martingales, we call the process X a local martingale. The sequence (T^n) is called a fundamental or localizing sequence.
It is obvious that any martingale is a local martingale, but the converse is not true, see Exercise 1.2. Note also that the definition implies that X_0 = X_0^{T^n} belongs to L^1(Ω, F, P). There exist alternative definitions of local martingales that also allow for non-integrable X_0.
Proposition 1.2 A real valued process X is a local martingale iff the following two assertions hold true: (i) X_0 ∈ L^1(Ω, F_0, P) and (ii) E[|X_{n+1}| | F_n] < ∞ a.s. and E[X_{n+1} | F_n] = X_n for all n, in the sense of generalized conditional expectations.
Proof Suppose X is a local martingale and (T^n) a localizing sequence. Then (i) immediately follows and we also have E[X_k^{T^n} | F_{k−1}] = X_{k−1}^{T^n}, which yields 1_{{T^n > k−1}} E[X_k | F_{k−1}] = X_{k−1} 1_{{T^n > k−1}}. Letting n → ∞ gives E[X_k | F_{k−1}] = X_{k−1}. From this generalized martingale property we see that the conditional expectation E[X_k | F_{k−1}] is finite a.s. As a consequence, one automatically has E[|X_k| | F_{k−1}] < ∞ a.s.
Conversely, we assume (i) and (ii). Let T^n = inf{m : Σ_{k=1}^{m+1} E[|X_k| | F_{k−1}] ≥ n}. Then T^n is a stopping time and E|X_k^{T^n}| ≤ E|X_0| + n < ∞, so X_k^{T^n} ∈ L^1(Ω, F, P). Since {T^n ≥ k} ∈ F_{k−1}, we also have from (ii), with k instead of n + 1, that E[X_k^{T^n} | F_{k−1}] = X_{k−1}^{T^n}, which yields the martingale property of X^{T^n}.
Proposition 1.3 The following results on martingale transforms hold true.
(i) If Y is a predictable process and M a martingale, then X = Y · M is a
local martingale.
(ii) If Y is a predictable process and M a local martingale, then Y · M is a
local martingale.
(iii) If X is a local martingale, then there exist a predictable process Y and a martingale M such that X = Y · M.
Proof (i) Let T^n = inf{k ≥ 0 : |Y_{k+1}| > n}. Then (T^n) is a fundamental sequence of stopping times and Y^{T^n} is a bounded process for every n. Moreover, the stopped processes M^{T^n} are martingales as well, and one easily verifies X^{T^n} = Y^{T^n} · M^{T^n}. It follows from RESULT that X^{T^n} is a martingale for every n and so X is a local martingale.
(ii) follows from (iii). Indeed, M, being a local martingale, must be of the form M = Y′ · M′ for a predictable process Y′ and a martingale M′. This implies X = Y · (Y′ · M′) = (Y Y′) · M′ and the result follows from (i).
(iii) Let A_k^n = {E[|X_n| | F_{n−1}] ∈ [k, k + 1)}. For every n, the A_k^n are disjoint and by Proposition 1.2 we have P(∪_k A_k^n) = 1. Put u_n := Σ_{k≥0} (k + 1)^{−3} ∆X_n 1_{A_k^n}. Then u_n is F_n-measurable, E|u_n| < ∞ and E[u_n | F_{n−1}] = 0. Thus (M_n) with M_n = Σ_{i=1}^n u_i is a martingale with M_0 = 0. Put then Y_n = Σ_{k≥0} (k + 1)^3 1_{A_k^n} and we see that ∆X_n = Y_n ∆M_n.
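The defining property of a martingale transform can be checked by exact enumeration on a small example. The sketch below uses a fair ±1 random walk M and a hypothetical bounded predictable integrand Y (both are illustrative choices, not taken from the text) and verifies that Y · M has the martingale property.

```python
from itertools import product
from fractions import Fraction

# Exact check that a martingale transform Y . M is again a martingale, for a
# fair +/-1 random walk M and a hypothetical bounded predictable integrand Y.
N = 4
half = Fraction(1, 2)

def M(path, n):
    """Partial sum of the first n steps."""
    return sum(path[:n])

def Y(path, k):
    """Predictable integrand: Y_k depends only on the first k-1 steps."""
    return 1 if M(path, k - 1) >= 0 else -1

def YM(path, n):
    """(Y . M)_n = sum_{k<=n} Y_k Delta M_k."""
    return sum(Y(path, k) * path[k - 1] for k in range(1, n + 1))

# For every history of length n-1: E[(Y.M)_n | F_{n-1}] = (Y.M)_{n-1}.
for n in range(1, N + 1):
    for hist in product([1, -1], repeat=n - 1):
        cond = sum(half * YM(hist + (x,), n) for x in (1, -1))
        assert cond == YM(hist, n - 1)
print("Y . M is a martingale for this bounded predictable Y")
```

The enumeration mirrors the proof of (i): Y_k is F_{k−1}-measurable, so the conditional expectation of Y_k ∆M_k given the history vanishes.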
Proposition 1.4 Suppose X is a nonnegative local martingale with E X_0 < ∞. Then it is a true martingale.
Proof We first show that E X_k < ∞ for all k. Let (T^n) be a localizing sequence. By Fatou’s lemma and by the martingale property of X^{T^n} we have

E X_k ≤ lim inf_{n→∞} E X_k^{T^n} = E X_0.

Furthermore, X_k^{T^n} ≤ Σ_{i=0}^k X_i, of which we now know that this upper bound has finite expectation. Hence we can apply the Dominated Convergence Theorem in E[X_k^{T^n} | F_{k−1}] = X_{k−1}^{T^n} to get E[X_k | F_{k−1}] = X_{k−1}.
1.2 Quadratic variation
If X and Y are two stochastic processes, their optional quadratic covariation process [X, Y] is defined as

[X, Y]_t = Σ_{s≤t} ∆X_s ∆Y_s,

where ∆X_0 = ∆Y_0 = 0. For X = Y we write [X] instead of [X, X]. Note the integration by parts formula

X_t Y_t = X_0 Y_0 + (X_− · Y)_t + (Y_− · X)_t + [X, Y]_t,

where the (predictable) process X_− at time t equals X_{t−1}, defined for t ≥ 1.
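Both the bracket and the integration by parts formula are pathwise identities, so they can be verified directly on arbitrary simulated paths. A minimal sketch (numpy and the random paths are implementation choices):

```python
import numpy as np

# Pathwise check of [X, Y] and the integration by parts formula on two
# arbitrary simulated paths.
rng = np.random.default_rng(0)
X = rng.normal(size=11).cumsum()
Y = rng.normal(size=11).cumsum()

def delta(P):
    """Increments Delta P_s, with the convention Delta P_0 = 0."""
    return np.diff(P, prepend=P[0])

def bracket(X, Y):
    """[X, Y]_t = sum_{s<=t} Delta X_s Delta Y_s."""
    return np.cumsum(delta(X) * delta(Y))

def transform(H, P):
    """(H_- . P)_t = sum_{s<=t} H_{s-1} Delta P_s."""
    H_minus = np.concatenate(([H[0]], H[:-1]))  # value at s=0 irrelevant: dP_0 = 0
    return np.cumsum(H_minus * delta(P))

# X_t Y_t = X_0 Y_0 + (X_- . Y)_t + (Y_- . X)_t + [X, Y]_t, for every t
lhs = X * Y
rhs = X[0] * Y[0] + transform(X, Y) + transform(Y, X) + bracket(X, Y)
assert np.allclose(lhs, rhs)
```

The identity holds path by path, with no probabilistic structure needed; the martingale assumptions only enter when one asks which of the terms are martingales.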
If X and Y are square integrable martingales, we have encountered in Section ?? their predictable covariation process ⟨X, Y⟩, which was the unique predictable process such that XY − ⟨X, Y⟩ is a martingale too. It follows from the above integration by parts formula, recall that (X_− · Y) and (Y_− · X) are martingales, that also [X, Y] − ⟨X, Y⟩ is a martingale. In fact, ⟨X, Y⟩ is also the unique predictable process that, subtracted from [X, Y], yields a martingale. Recall the property, already known from Corollary ??,

E[∆X_t ∆Y_t | F_{t−1}] = ∆⟨X, Y⟩_t.   (1.1)

This property carries over to the case where X and Y are local martingales under the extra conditions E[(∆X_t)^2 | F_{t−1}] < ∞ and E[(∆Y_t)^2 | F_{t−1}] < ∞ (Exercise 1.3), and under these conditions XY − ⟨X, Y⟩ is a local martingale.
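Property (1.1) and the martingale property of [X] − ⟨X⟩ can be illustrated on a martingale with i.i.d. mean-zero steps, for which ∆⟨M⟩_t is simply the constant step variance. The step distribution below is an arbitrary illustrative choice:

```python
from itertools import product
from fractions import Fraction

# For a martingale M with i.i.d. mean-zero steps, (1.1) gives
# Delta<M>_t = E[(Delta M_t)^2 | F_{t-1}] = step variance (a constant here).
# Illustrative step law: 2 with probability 1/3, -1 with probability 2/3.
steps = {2: Fraction(1, 3), -1: Fraction(2, 3)}
assert sum(p * x for x, p in steps.items()) == 0   # steps have mean zero
var = sum(p * x * x for x, p in steps.items())     # Delta<M>_t

def bracket(path, n):
    """[M]_n = sum of squared increments of the first n steps."""
    return sum(x * x for x in path[:n])

# Exact check that [M]_n - <M>_n = [M]_n - n*var is a martingale.
N = 4
for n in range(1, N + 1):
    for hist in product(steps, repeat=n - 1):
        cond = sum(p * (bracket(hist + (x,), n) - n * var) for x, p in steps.items())
        assert cond == bracket(hist, n - 1) - (n - 1) * var
print("[M] - <M> is a martingale; Delta<M>_t =", var)
```

Here [M] fluctuates with the path while ⟨M⟩ is deterministic, which makes the compensating role of the predictable bracket visible.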
1.3 Measure transformation
In this section we suppose that P and Q are probability measures on (Ω, F).
Their restrictions to Fn are denoted Pn and Qn respectively. We will always
assume that P0 = Q0 , which is automatically the case if F0 is trivial. We start
with a definition whose contents are supposed to be in force throughout this
section. See Exercise ?? for the more restrictive setting with Q ≪ P.
Definition 1.5 The probability measure Q is called locally absolutely continuous w.r.t. the probability measure P if Q_n ≪ P_n for all n ≥ 0. In this case we can define the F_n-measurable random variables (Radon-Nikodym derivatives, densities) Z_n = dQ_n/dP_n. The process Z = (Z_n)_{n≥0} is called the density process.
One easily verifies that Z is a martingale under the probability measure P. Note that Q(Z_n > 0) = 1, but P(Z_n > 0) may be less than 1, unless P_n ∼ Q_n, in which case P(F) = E_Q[(1/Z_n) 1_F] for F ∈ F_n. Furthermore, on the set {Z_{n−1} = 0} also Z_n = 0, as follows from the martingale property of the nonnegative process Z. Hence we can define Z_n/Z_{n−1} with the understanding that it is put equal to zero on {Z_{n−1} = 0}.
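The martingale property of the density process can be verified by exact enumeration in a small example. The sketch below takes P to be fair coin tossing and Q a hypothetical biased-coin measure with success probability p = 2/3; Z_n is then a product of one-step likelihood ratios.

```python
from itertools import product
from fractions import Fraction

# Finite example: Omega = {0,1}^N coin flips, P fair, Q biased (p = 2/3 is
# an illustrative choice). Q_n << P_n holds for every n.
N = 4
p = Fraction(2, 3)
P1 = {1: Fraction(1, 2), 0: Fraction(1, 2)}
Q1 = {1: p, 0: 1 - p}

def Z(omega, n):
    """Density Z_n = dQ_n/dP_n: product of likelihood ratios of the first n flips."""
    z = Fraction(1)
    for x in omega[:n]:
        z *= Q1[x] / P1[x]
    return z

# Martingale check under P: for each history of length n-1,
# E[Z_n | F_{n-1}] = sum_x P1[x] * Z(history + (x,), n) equals Z_{n-1}(history).
for n in range(1, N + 1):
    for hist in product([0, 1], repeat=n - 1):
        cond_exp = sum(P1[x] * Z(hist + (x,), n) for x in (0, 1))
        assert cond_exp == Z(hist, n - 1)
print("Z is a P-martingale on this example")
```

The check reduces to the fact that the one-step likelihood ratios have P-conditional expectation one.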
If M is a martingale under P, it is usually not a martingale under Q. We shall see below how to transform M parallel to the measure change from P to Q such that the changed process is a martingale again, or a local martingale in a more general setting. The following lemma generalizes Equation ?? to conditional expectations.
Lemma 1.6 Let Q ≪ P with Radon-Nikodym derivative Z := dQ/dP and G a sub-σ-algebra of F. Suppose X ∈ L^1(Ω, F, Q). Then Q(E[Z|G] > 0) = 1 and a.s. under Q

E_Q[X | G] = E[XZ | G] / E[Z | G] = E[X Z/E[Z|G] | G].   (1.2)
Proof Observe first that

Q(E[Z|G] = 0) = E_Q 1_{E[Z|G]=0}
= E[Z 1_{E[Z|G]=0}]
= E[E[Z 1_{E[Z|G]=0} | G]]
= E[E[Z|G] 1_{E[Z|G]=0}] = 0.

Take G ∈ G and consider the following chain of equalities:

E[XZ 1_G] = E_Q[X 1_G]
= E_Q E_Q[X 1_G | G]
= E(Z E_Q[X|G] 1_G)
= E(E[Z E_Q[X|G] 1_G | G])
= E(E[Z|G] E_Q[X|G] 1_G).

So, E[XZ 1_G] = E(E[Z|G] E_Q[X|G] 1_G). Since E[Z|G] E_Q[X|G] is G-measurable, G ∈ G is arbitrary, and we have the just proved property Q(E[Z|G] > 0) = 1, Equation (1.2) follows.
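The conditional Bayes formula (1.2) can be checked by hand on a tiny discrete space. The four-point space, the partition generating G, and the density Z below are all hypothetical choices for illustration:

```python
from fractions import Fraction

# Tiny discrete sketch of Lemma 1.6. Omega = {0,1,2,3}, P uniform;
# G is generated by the partition {0,1} | {2,3}; Z is a density (E Z = 1).
omega = [0, 1, 2, 3]
P = {w: Fraction(1, 4) for w in omega}
Z = {0: Fraction(2), 1: Fraction(1), 2: Fraction(1), 3: Fraction(0)}
X = {0: 3, 1: -1, 2: 5, 3: 7}
Q = {w: P[w] * Z[w] for w in omega}          # Q << P via dQ/dP = Z
atoms = [[0, 1], [2, 3]]

for A in atoms:
    PA = sum(P[w] for w in A)
    EZ_G = sum(P[w] * Z[w] for w in A) / PA          # E[Z|G] on A
    EXZ_G = sum(P[w] * X[w] * Z[w] for w in A) / PA  # E[XZ|G] on A
    QA = sum(Q[w] for w in A)
    if QA > 0:
        EQX_G = sum(Q[w] * X[w] for w in A) / QA     # E_Q[X|G] on A, directly
        assert EQX_G == EXZ_G / EZ_G
print("conditional Bayes formula verified on all atoms")
```

Note that the point with Z = 0 carries no Q-mass, which is why the formula only needs to hold a.s. under Q.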
Lemma 1.7 Let Q be locally absolutely continuous w.r.t. P with density process Z and let T be a stopping time. Then for F ∈ F_T it holds that

Q(F ∩ {T < ∞}) = E[Z_T 1_{F∩{T<∞}}].
Proof Exercise 1.4.
Proposition 1.8 Let X be an adapted process and assume Q locally absolutely
continuous w.r.t. P with density process Z.
(i) The process X is a martingale under Q iff the process XZ is a martingale
under P.
(ii) The process X is a local martingale under Q if the process XZ is a local
martingale under P.
(iii) If moreover P is also locally absolutely continuous w.r.t. Q, then the process XZ is a local martingale under P if the process X is a local martingale
under Q.
Proof To prove (i) we use Lemma 1.6 and recall that E[Z_n | F_{n−1}] = Z_{n−1}. Note that E_Q|X_n| < ∞ iff E[|X_n| Z_n] < ∞. We have

E_Q[X_n | F_{n−1}] = E[X_n Z_n | F_{n−1}] / Z_{n−1},

from which the assertion immediately follows.
(ii) Let (T^n) be a localizing sequence for XZ. Then P(lim_{n→∞} T^n < ∞) = 0 and then also Q(lim_{n→∞} T^n < ∞) = 0 by virtue of Lemma 1.7. Because

X_k^{T^n} Z_k = X_k^{T^n} Z_k^{T^n} + X_{T^n}(Z_k − Z_{T^n}) 1_{{k≥T^n}}   (1.3)

and both(!) terms on the right hand side are martingales (Exercise ??) under P, we can apply part (i) to deduce that X^{T^n} is a martingale under Q. Hence X is a local martingale under Q.
(iii) Let X be a local martingale under Q and (T n ) be a localizing sequence
for it. Then Q(limn→∞ T n < ∞) = 0 and by the argument in the proof of (ii)
also P(limn→∞ T n < ∞) = 0, since P is given to be locally absolutely continuous
w.r.t. Q. Similar to Equation (1.3) we have

X_k^{T^n} Z_k^{T^n} = X_k^{T^n} Z_k − X_{T^n}(Z_k − Z_{T^n}) 1_{{k≥T^n}},

again with two martingales under P on the right hand side, and so XZ is a local martingale under P.
Theorem 1.9^1 Let M be a local martingale under P and let Q be locally absolutely continuous w.r.t. P. The processes

M^Q := M − (1/Z) · [M, Z]   and   M̃^Q := M − (1/Z_−) · ⟨M, Z⟩

are well defined under Q and
(i) M^Q is a local martingale under Q, and
(ii) if moreover E[|∆M_n| Z_n | F_{n−1}] < ∞^2 a.s. for all n, then also M̃^Q is a local martingale under Q.
Proof The property Q(Z_n > 0) = 1 allows for division by Z_n a.s. under Q.
(i) Let us compute

∆M_n^Q = ∆M_n − ∆M_n ∆Z_n / Z_n
= (∆M_n / Z_n)(Z_n − ∆Z_n)
= (∆M_n / Z_n) Z_{n−1}.

Hence, by Lemma 1.6 with Z = Z_n, G = F_{n−1} and Z/E[Z|G] = Z_n/Z_{n−1}, we get

E_Q[∆M_n^Q | F_{n−1}] = E[∆M_n | F_{n−1}] = 0.

(ii) Recall that ∆⟨M, Z⟩_n = E[∆M_n ∆Z_n | F_{n−1}] and

∆(M̃^Q Z)_n = Z_{n−1} ∆M̃_n^Q + M̃_{n−1}^Q ∆Z_n + ∆M̃_n^Q ∆Z_n.

Hence

∆(M̃^Q Z)_n = Z_{n−1}(∆M_n − ∆⟨M, Z⟩_n / Z_{n−1}) + M̃_{n−1}^Q ∆Z_n + ∆M̃_n^Q ∆Z_n
= Z_{n−1} ∆M_n + (M̃_{n−1}^Q − ∆⟨M, Z⟩_n / Z_{n−1}) ∆Z_n + ∆M_n ∆Z_n − ∆⟨M, Z⟩_n.

The first two terms are obviously local martingale differences under P, whereas the same holds true for the difference ∆M_n ∆Z_n − ∆⟨M, Z⟩_n under the stated assumption. Hence M̃^Q Z is a local martingale under P, and Proposition 1.8(iii) yields that M̃^Q is a local martingale under Q.

^1 maybe martingale case only
^2 maybe E[|∆M_n| ∆Z_n | F_{n−1}] < ∞, seems to be equivalent
The second assertion of Theorem 1.9 is the most appealing one, since it allows us to write

M = M̃^Q + (1/Z_−) · ⟨M, Z⟩,

where the first term is a local martingale under Q and the second term is predictable. This gives the Doob decomposition of M under Q.
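Assertion (i) of Theorem 1.9 can be made concrete on a one-step scale. The sketch below takes M to be a ±1 random walk with fair steps under P, and a hypothetical Q giving probability p = 2/3... rather, p = 2/5 to an up-step; all numerical choices are illustrative. One checks by exact computation that the compensated increment ∆M^Q has Q-conditional expectation zero.

```python
from fractions import Fraction

# Sketch of Theorem 1.9(i) for a +/-1 random walk: under P the steps are fair,
# under Q (locally abs. continuous w.r.t. P) an up-step has probability p.
p = Fraction(2, 5)                     # illustrative choice
lam = {1: 2 * p, -1: 2 * (1 - p)}      # one-step likelihood ratio dQ/dP per step

# One step of the compensated process:
# Delta M^Q_n = Delta M_n - Delta M_n Delta Z_n / Z_n.
# With Z_n = Z_{n-1} * lam[xi], this simplifies to xi * Z_{n-1}/Z_n = xi / lam[xi].
def delta_MQ(xi):
    return Fraction(xi, 1) / lam[xi]

# The Q-conditional expectation of Delta M^Q given the past is history-free here:
cond_exp = p * delta_MQ(1) + (1 - p) * delta_MQ(-1)
assert cond_exp == 0
print("M^Q has Q-martingale increments:", cond_exp)
```

The simplification ∆M_n^Q = ∆M_n Z_{n−1}/Z_n is exactly the identity derived in the proof of part (i).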
1.4 Exercises
1.1 Consider a probability space (Ω, F, P). Suppose X ≥ 0 and that G is a sub-σ-algebra of F. Show that there exists a G-measurable random variable X̂ such that E[X 1_G] = E[X̂ 1_G] for all G ∈ G.
1.2 Consider a probability space (Ω, F, P) and let {A_n, n ≥ 1} be a measurable partition of Ω with P(A_n) = 2^{−n}. Let (Z_n) be a sequence of random variables, independent of the A_n, with P(Z_n = ±2^n) = 1/2. Put Y_n = Σ_{j=1}^n 1_{A_j} Z_j for n ≥ 1, Y_∞ the a.s. limit of the Y_n, as well as X_0 = 0 and X_n = Y_∞ if n ≥ 1, and T^n = ∞ · 1_{∪_{1≤k≤n} A_k}.
(a) Show that Y_∞ is well defined and finite a.s. What is it?
(b) Show that X_k^{T^n} = Y_n for k ≥ 1.
(c) Show that X^{T^n} is a martingale for every n.
(d) Show that X_1 is not integrable (and hence X is not a martingale).
1.3 Prove Equation (1.1) for local martingales X and Y satisfying E[(∆X_t)^2 | F_{t−1}] < ∞ and E[(∆Y_t)^2 | F_{t−1}] < ∞.
1.4 Prove Lemma 1.7.
1.5 Show that the process {X_{T^n}(Z_k − Z_{T^n}) 1_{{k≥T^n}}, k ≥ 0} in the proof of Proposition 1.8 is a martingale under P.