John von Neumann Institute
Vietnam National University-HCM
Stochastic Calculus applied in Finance, February 2014
We consider that we are on a filtered probability space (Ω, A, (F_t), IP).
(*) means the exercise is difficult to solve but its result is useful.
1 Prerequisites: conditional expectation, stopping time
0. Recall the Borel-Cantelli and Fatou lemmas.
1. Let G be a sub-σ-algebra of A and X an almost surely strictly positive random variable. Prove that the conditional expectation E[X/G] is also strictly positive. Prove that the converse is false by giving a counter-example (for instance use the trivial σ-algebra G).
2. Let G ⊂ H ⊂ A and X ∈ L²(Ω, A, IP). Prove (Pythagoras theorem):
E[(X − E[X/G])²] = E[(X − E[X/H])²] + E[(E[X/H] − E[X/G])²].
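On a finite probability space the Pythagoras identity can be checked exactly, since conditional expectation is just an average over partition cells. A minimal sketch (the space, the variable X and the partitions are our own choices, not part of the exercise):

```python
# Omega = {0,...,7} with uniform probability.  G is generated by the coarse
# partition {0..3},{4..7} and H by the finer partition {0,1},{2,3},...,
# so G ⊂ H.  E[X / sigma(partition)] replaces each value by its cell mean.
import numpy as np

omega = np.arange(8)
X = np.array([3.0, -1.0, 4.0, 1.0, 5.0, -9.0, 2.0, 6.0])  # arbitrary r.v.

def cond_exp(values, cell_of):
    """Conditional expectation w.r.t. the sigma-algebra of a partition."""
    out = np.empty_like(values)
    for c in set(cell_of):
        mask = (cell_of == c)
        out[mask] = values[mask].mean()
    return out

EG = cond_exp(X, omega // 4)   # coarse sigma-algebra G
EH = cond_exp(X, omega // 2)   # finer sigma-algebra H

lhs = np.mean((X - EG) ** 2)
rhs = np.mean((X - EH) ** 2) + np.mean((EH - EG) ** 2)
assert abs(lhs - rhs) < 1e-12
```

The cross term E[(X − E[X/H])(E[X/H] − E[X/G])] vanishes because the second factor is H-measurable, which is exactly why the two sides agree.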
3. Let O be an open set and X an F-adapted continuous process. One notes
T_O = inf{t : X_t ∈ O}.
Prove that T_O is a stopping time.
4. Let S and T be stopping times.
(i) Prove that S ∧ T is a stopping time.
(ii) Prove F_{S∧T} = F_S ∩ F_T.
5. Let T be a stopping time and A ∈ A. Prove that
T_A = T on A, = +∞ on A^c,
is a stopping time if and only if A ∈ F_T.
6. A real random variable X is F_T-measurable if and only if ∀t ≥ 0, X·1_{T≤t} is F_t-measurable.
7. Let X ∈ L¹ and a family of σ-algebras F^α, α ∈ A. Then the family of conditional expectations {E[X/F^α], α ∈ A} is uniformly integrable.
8. Let X be an F-progressively measurable process and T an (F_t) stopping time. Then
(i) the application ω ↦ X_{T(ω)}(ω) is F_T-measurable;
(ii) the process t ↦ X_{t∧T} is F-adapted.
9. If X is an adapted measurable process admitting càd or càg trajectories, it is progressively measurable.
2 Martingales
1. Let X be a martingale, φ a function such that ∀t, φ(X_t) ∈ L¹.
(i) If φ is a convex function, then φ(X) is a sub-martingale; if φ is a concave function, φ(X) is a super-martingale.
(ii) When X is a sub-martingale and φ an increasing convex function such that ∀t, φ(X_t) ∈ L¹, then φ(X) is a sub-martingale.
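Point (i) can be illustrated in discrete time (a sketch of ours, not part of the exercise): the simple symmetric random walk S_n is a martingale and φ(x) = |x| is convex, so |S_n| is a sub-martingale; in particular n ↦ E[|S_n|] is non-decreasing, which can be checked exactly from the binomial law of S_n.

```python
# Exact E[|S_n|] for the simple symmetric random walk: S_n = 2K - n with
# K ~ Binomial(n, 1/2).  A sub-martingale has non-decreasing expectations.
from math import comb

def abs_mean(n):
    """Exact value of E[|S_n|]."""
    return sum(comb(n, k) * abs(2 * k - n) for k in range(n + 1)) / 2 ** n

means = [abs_mean(n) for n in range(20)]
assert all(a <= b + 1e-12 for a, b in zip(means, means[1:]))
```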
2. Martingale convergence: admit the following: let X be a càd super (or sub)-martingale such that sup_t E[|X_t|] < ∞. Then lim_{t→∞} X_t exists almost surely and belongs to L¹(Ω, A, IP).
And deduce the corollary: if X is a càd super-martingale bounded from below, then lim_{t→∞} X_t exists almost surely and belongs to L¹(Ω, A, IP).
3. Let X be a martingale. Prove the following are equivalent:
(i) X is uniformly integrable.
(ii) X_t converges almost surely to Y (which belongs to L¹) when t goes to infinity, and {X_t, t ∈ IR⁺ ∪ {∞}} (with X_∞ = Y) is a martingale.
(iii) X_t converges to Y in L¹ when t goes to infinity.
Indication: (i) → (iii) → (ii) → (i).
4. Let (X_t)_{t≥0} be a positive right continuous super-martingale and T = inf{t > 0 : X_t = 0}.
(i) Prove that almost surely ∀t ≥ T, X_t = 0. (First prove E(X_t 1_{T≤t}) = 0.)
(ii) Prove that almost surely X_∞ = lim_{t→∞} X_t exists. Deduce:
{X_∞ > 0} ⊂ {∀t, X_t > 0} = {T = +∞}.
Give a counter-example where {X_∞ > 0} ≠ {T = +∞}.
5. If M ∈ M_loc is such that E[M_t*] < ∞ ∀t, then M is a 'true' martingale. Moreover suppose E[M*] < ∞; then M is uniformly integrable.
6. If X is a closed martingale with Z, meaning ∀t, X_t = E[Z/F_t], prove that it is also closed with lim_{t→∞} X_t, denoted as X_∞, which is integrable and equal to E[Z/∨_{t≥0} F_t].
3 Brownian motion
1. Prove that the real Brownian motion is a centered continuous Gaussian process with covariance function ρ(s,t) = s ∧ t. Conversely, a centered continuous Gaussian process with covariance function ρ(s,t) = s ∧ t is a real Brownian motion.
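The covariance property is easy to check by simulation. A Monte Carlo sketch (our own times s, t and path count; tolerance chosen well above the statistical error):

```python
# Empirical covariance of (B_s, B_t): both variables are centered, so the
# mean of B_s * B_t estimates Cov(B_s, B_t), which should be min(s, t).
import numpy as np

rng = np.random.default_rng(0)
n_paths, s, t = 200_000, 0.5, 1.0
Bs = rng.normal(0.0, np.sqrt(s), n_paths)           # B_s ~ N(0, s)
Bt = Bs + rng.normal(0.0, np.sqrt(t - s), n_paths)  # independent increment
cov = np.mean(Bs * Bt)
assert abs(cov - min(s, t)) < 0.02
```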
2. Prove that the Brownian motion is a martingale w.r.t. its proper filtration, i.e. F_t = σ(B_s, s ≤ t). Prove that it is also a Markov process.
3. Let G_t = σ(B_s, s ≤ t) ∨ N, t ≥ 0. Prove this filtration is càd, meaning G_{t+} = ∩_{s>t} G_s.
Indication: use
1. the G_{t+}-conditional characteristic function of the vector (B_u, B_z), z, u > t, is the limit of the G_w-conditional characteristic function of the vector (B_u, B_z) when w decreases to t,
2. this limit is equal to the G_t-conditional characteristic function of the vector (B_u, B_z), z, u > t,
3. thus for any integrable Y, E[Y/G_{t+}] = E[Y/G_t]. So any G_{t+}-measurable random variable is G_t-measurable; conclude.
4(*). Consider the set of zeros of the Brownian motion, X = {(t, ω) ∈ R⁺ × Ω : B_t(ω) = 0}, and its sections by trajectory, X_ω = {t ∈ R⁺ : B_t(ω) = 0}. Prove that P-almost surely in ω ∈ Ω:
(i) the Lebesgue measure of X_ω is null,
(ii) X_ω is closed and unbounded (proof a bit difficult...),
(iii) t = 0 is an accumulation point of X_ω,
(iv) X_ω has no isolated point, hence is dense in itself.
5(*). Paley-Wiener-Zygmund theorem, 1933 (proof pages 110-111 of Karatzas-Shreve). For almost every ω, the application t ↦ B_t(ω) is nowhere differentiable. More precisely:
IP{ω ∈ Ω : ∀t, limsup_{h→0+} (B_{t+h} − B_t)(ω)/h = +∞ and liminf_{h→0+} (B_{t+h} − B_t)(ω)/h = −∞} = 1.
6. Let (B_t) be a real Brownian motion.
a) Prove that the sequence B_n/n goes to 0 almost surely.
b) Use that B is a martingale and a Doob inequality to deduce the bound
E[sup_{σ≤t≤τ} (B_t/t)²] ≤ 4τ/σ².
c) Let τ = 2σ = 2^{n+1}; give a bound for IP{sup_{2^n≤t≤2^{n+1}} |B_t/t| > ε} that proves the convergence of this sequence, then apply the Borel-Cantelli lemma.
d) Deduce lim_{t→∞} B_t/t = 0 almost surely (meaning the law of large numbers, cf. problem 9.3, correction pages 124-125 in Karatzas-Shreve).
7. Let Y_t = t·B_{1/t}, Y_0 = 0, and F_t^Y the natural filtration associated to the process Y. Prove that (Y_t, F_t^Y) is a Brownian motion (use the criterion in 1 and exercise 6 above).
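The Gaussian-process criterion of exercise 1 reduces the time-inversion claim to a covariance computation: Cov(Y_s, Y_t) = s·t·Cov(B_{1/s}, B_{1/t}) = s·t·min(1/s, 1/t), which must equal min(s, t). A small numeric check of this identity (a sketch, with arbitrary time points of our choosing):

```python
# Check s*t*min(1/s, 1/t) == min(s, t) on a grid of (s, t) pairs: if s <= t
# then s*t*(1/t) = s, which is min(s, t), and symmetrically otherwise.
ok = True
for s in [0.1, 0.5, 1.0, 2.0, 7.3]:
    for t in [0.2, 1.0, 3.0, 10.0]:
        ok = ok and abs(s * t * min(1.0 / s, 1.0 / t) - min(s, t)) < 1e-12
assert ok
```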
4 Stochastic integral
In this section and the following, let M be a square integrable martingale on the filtered probability space (Ω, F_t, P) such that d⟨M⟩_t is absolutely continuous w.r.t. Lebesgue measure dt: ∃ f, an integrable measurable positive function on any [0,t], such that d⟨M⟩_t = f(t)dt.
1. Let L_T(M) be the set of adapted processes X on [0,T] such that:
[X]_T² = E[∫_0^T X_s² d⟨M⟩_s] < +∞.
Prove that L_T(M) is a metric space w.r.t. the distance d: d(X,Y) = ([X − Y]_T²)^{1/2}. Actually it is a semi-norm which defines an equivalence relation X ∼ Y if d(X,Y) = 0.
2. Prove the equivalence:
∑_{j≥1} 2^{−j} inf(1, [X − X_n]_j) → 0 ⇐⇒ ∀T, [X − X_n]_T → 0.
3. Let S be the set of simple processes, for which the stochastic integral w.r.t. M is defined:
I_t(X) = ∑_{j=0}^{J−1} X_j (M_{t_{j+1}} − M_{t_j}) + X_J (M_t − M_{t_J}) on the event {t_J ≤ t ≤ t_{J+1}}.
Prove that I_t satisfies the following:
(i) I_t is a linear application on S.
(ii) I_t(X) is F_t-measurable and square integrable.
(iii) E[I_t(X)] = 0.
(iv) I_t(X) is a continuous martingale.
(v) E[(I_t(X) − I_s(X))²/F_s] = E[I_t²(X) − I_s²(X)/F_s] = E[∫_s^t X_u² d⟨M⟩_u /F_s].
(vi) E[I_t²(X)] = E[∫_0^t X_s² d⟨M⟩_s] = [X]_t².
(vii) ⟨I.(X)⟩_t = ∫_0^t X_s² d⟨M⟩_s.
Indication: actually, (vi) and (vii) are consequences of (v).
4. Prove that the stochastic integral is associative, meaning: if H is stochastically integrable w.r.t. the martingale M, giving the integral H.M, and if G is stochastically integrable w.r.t. the martingale H.M, then G.H is stochastically integrable w.r.t. the martingale M and:
G.(H.M) = (G.H).M.
5. Let M be a continuous martingale and X ∈ L(M). Let s < t and Z an F_s-measurable bounded random variable. Compute E[∫_s^t Z X_u dM_u − Z ∫_s^t X_u dM_u]² and prove:
∫_s^t Z X_u dM_u = Z ∫_s^t X_u dM_u.
6. Let T be a stopping time, two processes X and Y such that X^T = Y^T, and two martingales M and N such that M^T = N^T. Suppose X ∈ L(M) and Y ∈ L(N). Prove that I_M(X)^T = I_N(Y)^T.
(Use that for any square integrable martingale: M_t = 0 a.s. ⇐⇒ ⟨M⟩_t = 0 a.s.)
7. Let M and N be square integrable continuous martingales, and processes X ∈ L^∞(M), Y ∈ L^∞(N). Prove that
(i) X.M and Y.N are uniformly integrable, with terminal values ∫_0^∞ X_s dM_s and ∫_0^∞ Y_s dN_s.
(ii) lim_{t→∞} ⟨X.M, Y.N⟩_t exists almost surely. This is a direct application of Kunita-Watanabe's inequality.
(iii) E[(X.M)_∞ (Y.N)_∞] = E[∫_0^∞ X_s Y_s d⟨M,N⟩_s].
Use the following theorem: if M is a continuous local martingale such that E[⟨M⟩_∞] < ∞, then it is uniformly integrable and converges almost surely when t → ∞. Moreover E[⟨M⟩_∞] = E[M_∞²].
8. Let M and N be two continuous local martingales, a and b real numbers, and X ∈ L^∞(M) ∩ L^∞(N). Prove that stochastic integration with respect to continuous local martingales is a linear application, meaning
X.(aM + bN) = aX.M + bX.N.
9. Let M be a continuous local martingale and X ∈ L^∞(M). Prove there exists a sequence of simple processes (X^n) such that ∀T > 0, IP-almost surely:
lim_{n→∞} ∫_0^T |X_s^n − X_s|² d⟨M⟩_s = 0,
and
lim_{n→∞} sup_{0≤t≤T} |I_t(X^n) − I_t(X)| = 0.
10. Let W be a standard Brownian motion, ε a number in [0,1], and Π = (t_0, ..., t_m) a partition of [0,t] with 0 = t_0 < ... < t_m = t. Consider the approximating sum:
S_ε(Π) = ∑_{i=0}^{m−1} [(1−ε)W_{t_i} + εW_{t_{i+1}}](W_{t_{i+1}} − W_{t_i})
for the stochastic integral ∫_0^t W_s dW_s. Show that:
lim_{|Π|→0} S_ε(Π) = ½W_t² + (ε − ½)t,
where the limit is in probability. The right-hand side of the last limit is a martingale if and only if ε = 0, so that W is evaluated at the left-hand endpoint of each interval [t_i, t_{i+1}] in the approximating sum; this corresponds to the Itô integral. With ε = ½ we obtain the Stratonovich integral, which obeys the usual rules of calculus, such as ∫_0^t W_s ∘ dW_s = ½W_t².
Indication: write explicitly an approximation of the Itô integral ∫_0^t W_s dW_s and of the quadratic variation of W; then apply the Itô formula to W_t². Or: write S_ε(Π) as a combination of W_{t_{i+1}}² − W_{t_i}² and (W_{t_{i+1}} − W_{t_i})².
5 Itô formula
1. The quadratic covariation of two continuous square integrable semi-martingales X and Y is the limit in probability, when sup_i |t_{i+1} − t_i| → 0, of:
⟨X,Y⟩_t = lim_proba ∑_{i=1}^n (X_{t_{i+1}} − X_{t_i})(Y_{t_{i+1}} − Y_{t_i}).
Prove this covariation is null when X is a continuous semi-martingale and Y a finite variation process.
2. Lévy Theorem: Let X be a continuous (semi-)martingale, X_0 = 0 almost surely. X is a real Brownian motion if and only if X is a continuous local martingale s.t. ⟨X⟩_t = t.
First step: compute the F_s-conditional characteristic function of X_t − X_s using the Itô formula, ∀s ≤ t.
3. Prove that the unique solution in C_b^{1,2}(R⁺, R^d) of the partial differential equation (heat equation)
∂_t f = ½Δf, f(0,x) = φ(x), ∀x ∈ R^d,
where φ ∈ C_b²(R^d), is f(t,x) = E[φ(x + B_t)], B a d-dimensional Brownian motion.
Can the hypothesis that the derivatives of f and φ are bounded be dropped?
4. Let M be a d-dimensional vector of continuous martingales, A an adapted continuous d-dimensional vector process with finite variation, and X_0 an F_0-measurable random variable; let f ∈ C^{1,2}(IR⁺, IR^d) and X_t := X_0 + M_t + A_t. Prove that IP almost surely:
f(t, X_t) = f(0, X_0) + ∫_0^t ∂_t f(s, X_s) ds + ∑_i ∫_0^t ∂_i f(s, X_s) dM_s^i + ∑_i ∫_0^t ∂_i f(s, X_s) dA_s^i + ½ ∑_{i,j} ∫_0^t ∂²_{ij} f(s, X_s) d⟨M^i, M^j⟩_s.
Long and tedious proof...
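The Itô formula can be checked pathwise on a simulated Brownian path (a sketch; the choice f(x) = x³ and the grid size are ours). For X = W it gives W_t³ = ∫_0^t 3W_s² dW_s + 3∫_0^t W_s ds, and both integrals can be approximated by left-point Riemann sums:

```python
# Pathwise check of Ito's formula for f(x) = x^3 and X = W.
import numpy as np

rng = np.random.default_rng(2)
m, t = 100_000, 1.0
dt = t / m
dW = rng.normal(0.0, np.sqrt(dt), m)
W = np.concatenate(([0.0], np.cumsum(dW)))

stoch_int = np.sum(3.0 * W[:-1] ** 2 * dW)  # int 3 W^2 dW, left-point sum
drift_int = np.sum(3.0 * W[:-1] * dt)       # (1/2) int f''(W) ds = 3 int W ds
assert abs(W[-1] ** 3 - (stoch_int + drift_int)) < 0.1
```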
5. a) Use exercise 4 with two semi-martingales X = X_0 + M + A and Y = Y_0 + N + C to prove:
∫_0^t X_s dY_s = X_t Y_t − X_0 Y_0 − ∫_0^t Y_s dX_s − ⟨X,Y⟩_t.
This is the integration by parts formula.
b) Use the Itô formula to get the stochastic differential of the processes
t ↦ X_t^{−1}; t ↦ exp(X_t); t ↦ X_t·Y_t^{−1}.
6. Prove that
(exp ∫_0^t a_s ds)(x + ∫_0^t b_s exp(−∫_0^s a_u du) dB_s)
is solution to the SDE
dX_t = a(t)X_t dt + b(t)dB_t, t ∈ [0,T], X_0 = x,
after justification of any integral in the formula.
7. The Stratonovich integral is defined as:
∫_0^t Y_s ∘ dX_s = ∫_0^t Y_s dX_s + ½⟨Y,X⟩_t.
Let ε = ½. Prove that:
lim_{‖π‖→0} S_ε(Π) = lim_{‖π‖→0} ∑_{i=0}^{m−1} [(1−ε)W_{t_i} + εW_{t_{i+1}}](W_{t_{i+1}} − W_{t_i}) = ∫_0^t W_s ∘ dW_s = ½W_t²,
where ‖π‖ = sup_i (t_{i+1} − t_i).
Let X and Y be two continuous semi-martingales, and π a partition of [0,t]. Prove that:
lim_{‖π‖→0} ∑_{i=0}^{m−1} ½(Y_{t_{i+1}} + Y_{t_i})(X_{t_{i+1}} − X_{t_i}) = ∫_0^t Y_s ∘ dX_s.
Let X be a d-dimensional vector of continuous semi-martingales, and f a C² function. Prove that:
f(X_t) − f(X_0) = ∑_i ∫_0^t ∂_i f(X_s) ∘ dX_s^i.
6 Stochastic dierential equations
1. Prove that the process t ↦ (exp ∫_0^t a_s ds)(x + ∫_0^t b_s exp(−∫_0^s a_u du) dB_s) is solution to the SDE
dX_t = a(t)X_t dt + b(t)dB_t, t ∈ [0,T], X_0 = x,
after justification of any integral in the formula (meaning specify useful hypotheses on the parameters a and b).
2. Let B be a real Brownian motion. Prove that B_t² = 2∫_0^t B_s dB_s + t.
If ∀t, X ∈ L_t(B), then:
(X.B)_t² = 2∫_0^t (X.B)_s X_s dB_s + ∫_0^t X_s² ds.
Let Z_t = exp((X.B)_t − ½∫_0^t X_s² ds). Prove that Z is solution to the SDE:
Z_t = 1 + ∫_0^t Z_s X_s dB_s.
Prove that Y = Z^{−1} is solution to the SDE:
dY_t = Y_t(X_t² dt − X_t dB_t).
Prove that there exists a unique solution to the SDE dX_t = X_t b_t dt + X_t σ_t dB_t, X_0 = x ∈ IR, when b, σ² ∈ L¹(IR⁺), computing the stochastic differential of the ratio of two solutions.
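The closed-form solution of exercise 1 can be compared with an Euler scheme driven by the same Brownian increments (a sketch; the constant coefficients a = 0.5, b = 0.3 and the grid are our own choices):

```python
# Closed form (exp int a)(x + int b exp(-int a) dB) vs. Euler scheme for
# dX = a X dt + b dB on the same simulated path.
import numpy as np

rng = np.random.default_rng(3)
a, b, x, t, m = 0.5, 0.3, 1.0, 1.0, 20_000
dt = t / m
dB = rng.normal(0.0, np.sqrt(dt), m)
grid = np.arange(m) * dt

# closed-form solution, stochastic integral approximated by a left-point sum
closed = np.exp(a * t) * (x + np.sum(b * np.exp(-a * grid) * dB))

# Euler scheme on the same Brownian path
X = x
for i in range(m):
    X = X + a * X * dt + b * dB[i]

assert abs(X - closed) < 0.02
```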
3. Let the Ornstein-Uhlenbeck stochastic differential equation:
dX_t = −αX_t dt + σdB_t, X_0 = x,
where x ∈ L¹(F_0).
(i) Prove that the following is the solution of this SDE:
X_t = e^{−αt}(x + ∫_0^t σe^{αs} dB_s).
(ii) Prove that the expectation m(t) = E[X_t] is solution of an ordinary differential equation which is obtained by integration of X_t = x − α∫_0^t X_s ds + σB_t. Deduce m(t) = m(0)e^{−αt}.
(iii) Prove the variance V(t) = Var[X_t] = σ²/(2α) + (V(0) − σ²/(2α))e^{−2αt}.
(iv) Let x be an F_0-measurable variable with law N(0, σ²/(2α)). Prove that X is a Gaussian process with covariance function ρ(s,t) = (σ²/(2α)) e^{−α|t−s|}.
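Points (ii) and (iii) can be checked by Monte Carlo for a deterministic start X_0 = x (so V(0) = 0): then E[X_t] = x e^{−αt} and Var[X_t] = σ²/(2α)(1 − e^{−2αt}). A sketch with our own parameters:

```python
# Simulate X_t = e^{-alpha t}(x + int_0^t sigma e^{alpha s} dB_s) by
# left-point sums over many paths and compare mean and variance.
import numpy as np

rng = np.random.default_rng(4)
alpha, sigma, x, t = 1.0, 0.5, 1.0, 1.0
n_paths, m = 100_000, 500
dt = t / m

integral = np.zeros(n_paths)
for i in range(m):
    integral += sigma * np.exp(alpha * i * dt) * rng.normal(0.0, np.sqrt(dt), n_paths)
Xt = np.exp(-alpha * t) * (x + integral)

mean_th = x * np.exp(-alpha * t)
var_th = sigma ** 2 / (2 * alpha) * (1 - np.exp(-2 * alpha * t))
assert abs(Xt.mean() - mean_th) < 0.01
assert abs(Xt.var() - var_th) < 0.01
```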
7 Black-Scholes Model
1. Assume that a risky asset price process is solution to the SDE
dS_t = S_t b dt + S_t σ dW_t, S_0 = s, (1)
where b is named 'trend' and σ 'volatility'. Prove that (1) admits a unique solution, using the Itô formula to compute the ratio S¹/S², with S^i, i = 1, 2, two solutions to the SDE.
2. Assume that the portfolio θ value V_t(θ) is such that there exists a C^{1,2} regular function C satisfying
V_t(θ) = C(t, S_t). (2)
Moreover, θ is the pair (a, d) and
V_t(θ) = a_t S_t^0 + d_t S_t = ⟨θ_0, p_0⟩ + ∫_0^t a_s dS_s^0 + ∫_0^t d_s dS_s. (3)
With this self-financing strategy θ, the option seller (for instance of the option (S_T − K)⁺) could hedge the option with the initial price q = V_0: V_T(θ) = C(T, S_T).
Use two different ways to compute the stochastic differential of V_t(θ) to get a PDE (partial differential equation) the solution of which will be the sought function C.
3. Actually this PDE is solved using the change of (variable, function):
x = e^y, y ∈ IR; D(t, y) = C(t, e^y).
Thus, prove that we turn to the Dirichlet problem
∂_t D(t,y) + (r − σ²/2) ∂_y D(t,y) + ½σ² ∂²_{yy} D(t,y) = rD(t,y), y ∈ IR,
D(T,y) = (e^y − K)⁺, y ∈ IR.
Now let the SDE:
dX_s = (r − σ²/2) ds + σ dW_s, s ∈ [t,T], X_t = y.
Deduce the solution
D(t,y) = E_y[e^{−r(T−t)}(e^{X_T} − K)⁺],
and the explicit formula, the Black-Scholes formula, which uses the fact that the law of X_T is a Gaussian law.
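The explicit formula can be cross-checked by Monte Carlo under the risk neutral dynamics (a sketch; the market parameters below are arbitrary choices of ours):

```python
# Black-Scholes call price vs. Monte Carlo estimate of E[e^{-rT}(S_T - K)^+]
# with S_T = s exp((r - sigma^2/2) T + sigma sqrt(T) G), G ~ N(0, 1).
import numpy as np
from math import log, sqrt, exp, erf

def norm_cdf(z):
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def bs_call(s, K, r, sigma, T):
    d1 = (log(s / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return s * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

s, K, r, sigma, T = 100.0, 105.0, 0.03, 0.2, 1.0
rng = np.random.default_rng(5)
G = rng.normal(size=1_000_000)
ST = s * np.exp((r - 0.5 * sigma ** 2) * T + sigma * sqrt(T) * G)
mc = exp(-r * T) * np.maximum(ST - K, 0.0).mean()
assert abs(mc - bs_call(s, K, r, sigma, T)) < 0.1
```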
8 Change of probability measures, Girsanov theorem
1. Let the probability measure Q equivalent to IP be defined as Q = Z·P, Z ∈ L¹(Ω, F_T, IP), meaning Q|F_t = Z_t·P, Z_t = E_P[Z/F_t].
Prove that ∀t and ∀Y ∈ L^∞(Ω, F_t, P), E_P[Y Z_t/F_s] = Z_s E_Q[Y/F_s].
Indication: compute, ∀A ∈ F_s, the expectations E_P[1_A Y Z_t] and E_P[1_A Z_s E_Q[Y/F_s]].
2. Let T ≥ 0, Z ∈ M(IP) and Q = Z_T·IP, 0 ≤ s ≤ t ≤ T, and an F_t-measurable random variable Y ∈ L¹(Q). Prove (Bayes formula)
E_Q(Y/F_s) = E_IP(Y Z_t/F_s)/Z_s.
3. Let M be a IP-martingale, X ∈ L(B) such that Z = E(X.B) is a IP-martingale (remember: dZ_t = Z_t X_t dB_t, Z_0 = 1). Let Q := Z_T·IP be an equivalent probability measure to IP on the σ-algebra F_T.
(i) Prove that d⟨M,Z⟩ = ZX d⟨M,B⟩.
(ii) Use the Itô formula to develop M_t Z_t − M_s Z_s, and compute E_IP[M_t Z_t/F_s].
(iii) Use the Itô formula between s and t for the process Z·∫_0^· X_u d⟨M,B⟩_u.
(iv) Deduce that M_· − ∫_0^· X_u d⟨M,B⟩_u is a Q-martingale.
4. The following is a counter-example when the Novikov condition is not satisfied: let the stopping time T = inf{1 ≥ t ≥ 0 : t + B_t² = 1} and
X_t = −(2B_t/(1−t)²) 1_{t≤T}, 0 ≤ t < 1, X_1 = 0.
(i) Prove that T < 1 almost surely, so ∫_0^1 X_t² dt < ∞ almost surely.
(ii) Apply the Itô formula to the process t ↦ B_t²/(1−t)², 0 ≤ t < 1, to prove:
∫_0^1 X_t dB_t − ½∫_0^1 X_t² dt = −1 − 2∫_0^T t B_t²/(1−t)⁴ dt < −1.
(iii) The local martingale E(X.B) is not a martingale: we deduce from (ii) that E[E_1(X.B)] ≤ exp(−1) < 1, and this fact contradicts that for any martingale E(M_t) = M_0, here it would be 1...
Anyway, prove that ∀n ≥ 1 and σ_n = 1 − (1/√n), the stopped process E(X.B)^{σ_n} is a martingale.
5(*). Let B be the standard Brownian motion on the filtered probability space (Ω, (F_t; t ∈ IR⁺), P) and H ∈ L²(Ω × [0,T]). ∀t, let W_t := B_t + ∫_0^t H_s ds. Prove that the law of W is equivalent to the Wiener measure according to the density on F_T
dµ_B/dµ_W = exp[−∫_0^T H_s dB_s − ½∫_0^T H_s² ds],
where (on the canonical space) µ_B(A) := IP{ω : t ↦ B_t(ω) ∈ A}.
9 Representation theorems, martingale problem
Recall:
H_0² = {M ∈ M^{2,c}, M_0 = 0, ⟨M⟩_∞ ∈ L¹};
M and N are said to be orthogonal if E[M_∞ N_∞] = 0, noted M⊥N, and strongly orthogonal if MN is a martingale, noted as M † N.
Let A ⊂ H_0²: denote S(A) the smallest stable closed vectorial subspace which contains A.
1. Let M ∈ H_0² and Y a centered Bernoulli random variable independent of M. Let N := Y·M. Prove M⊥N but not M † N.
2. Let M and N ∈ H_0². Prove the equivalencies:
(i) M † N, (ii) S(M) † N, (iii) S(M) † S(N), (iv) S(M)⊥N, (v) S(M)⊥S(N).
3. Let M(A) be the set of probability measures Q on F_∞, Q << IP, IP|F_0 = Q|F_0, and such that A ⊂ H_0²(Q). Prove that M(A) is convex. Study carefully the difference between M(A) and M (A) (cf. Def 6.1 and 6.17 in the Lecture Notes).
4. Let B be an n-dimensional Brownian motion on (Ω, F_t, IP). Prove that ∀M ∈ M_loc^{c,2}, ∃H^i ∈ P(B^i), i = 1, ..., n, such that:
M_t = M_0 + ∑_{i=1}^n (H^i.B^i)_t.
Indication: apply the extremal probability measure theorem (th 6.14) to the set M(B) (actually the singleton {IP}) when B is the Brownian motion, then localize.
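Exercise 1 has an exact discrete analogue (a sketch of ours): take M_n a ±1 random walk, Y an independent centered Bernoulli variable and N = Y·M; enumerating the equally likely atoms (y, e1, e2) shows E[M_2 N_2] = 0 (orthogonality) while MN = Y·M² fails the martingale property (no strong orthogonality).

```python
# Enumerate the 8 atoms (y, e1, e2) in {-1, 1}^3, each of probability 1/8,
# with M_1 = e1, M_2 = e1 + e2 and N_n = y * M_n.
from itertools import product

atoms = list(product([-1, 1], repeat=3))
# orthogonality: E[M_2 N_2] = E[Y M_2^2] = 0 since Y is centered, independent
assert sum(y * (e1 + e2) ** 2 for y, e1, e2 in atoms) == 0

# martingale property fails: E[Y M_2^2 | y, e1] = 2y differs from Y M_1^2 = y
for y, e1 in product([-1, 1], repeat=2):
    cond = sum(y * (e1 + e2) ** 2 for e2 in [-1, 1]) / 2.0
    assert cond != y * e1 ** 2
```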
5. Prove that the above vector process H is unique, meaning that ∀H′ satisfying M_t = M_0 + ∑_{i=1}^n (H′^i.B^i)_t, almost surely:
∫_0^t ∑_{i=1}^n |H′^i_s − H^i_s|² ds = 0.
6. Let M be a vector martingale, the components of which are not strongly orthogonal two by two. Prove the inclusion
{H, ∀i H^i ∈ L(M^i)} ⊂ L(M),
but the equality is false.
10 Example: optimal strategy for a small investor
Let a set of price processes be given:
S_t^n = E_t(X^n), t ∈ [0,T], with:
dX_t^n = ∑_{j=1}^d σ_j^n(t) dW_t^j + b^n(t) dt, n = 1, ..., N; dX_t^0 = r_t dt.
Suppose the matrix σ satisfies, dt ⊗ dIP almost surely, σσ* ≥ αI with α > 0, where σ* is the transpose matrix of σ. The coefficients b, σ, r are F-adapted bounded processes on [0,T] × Ω.
1. Look for a condition so that the market is viable, meaning a condition such that there is no arbitrage opportunity.
(i) Prove that a market is viable as soon as there exists a risk neutral probability measure Q.
(ii) Propose some hypotheses on the above model, sufficient for the existence of Q.
2. Propose some hypotheses on the above model, sufficient for the market to be complete, meaning any contingent claim is attainable (hedgeable). Start with the case N = d = 1, then N = d > 1.
Remark: If d < N and σ is surjective, there is no uniqueness of the vector u so that σdW + (b−r)dt = σdW̃. In this case, the market is not complete and the set Q_S is in bijection with σ^{−1}(r − b).
Recall: given a set of price processes S, a risk neutral probability measure on (Ω, (F_t)) is a probability measure Q equivalent to IP such that the discounted prices e^{−rt}S^n, denoted S̃^n, are uniformly integrable Q-martingales; their set is denoted Q_p.
3. Let θ be an admissible strategy. Prove it is self-financing if and only if the discounted portfolio value Ṽ_t(p) = e^{−rt} V_t(p) satisfies:
Ṽ_t(p) = V_0(p) + ∫_0^t ⟨θ_s, dp̃_s⟩.
(use the Itô formula)
4. Let the relation c₁ ≺ c₂ be defined as ψ(c₁) ≤ ψ(c₂), where the application ψ is defined on the consumption set X by:
ψ(a, Y) = a + E_Q[Y].
Prove that it is a convex increasing continuous complete preference relation.
5. A sufficient and necessary condition for a strategy (π, c) to be admissible: let the discounted objective consumption ∫_0^T e^{−rs} c_s ds be fixed. Prove that
(∗) E_Q[∫_0^T e^{−rs} c_s ds] ≤ x
is equivalent to the existence of an admissible strategy π such that X_T = x + ∫_0^T π_s·dS̃_s.
6. Optimal strategies. Prove that actually the problem is as follows: the small investor evaluates the quality of his investment with a utility function (U_i is positive, concave, strictly increasing, of class C¹); he looks for the maximisation of:
(c, X_T) → E_IP[∫_0^T U₁(c_s) ds + U₂(X_T)]
under the above constraint 5(∗). Solve this constrained optimisation problem using the Lagrange method and the Kuhn and Tucker theorem.